That's why it's important for Intel to have a very deep bench of extremely qualified internal candidates.
BK has been doing his very best to clear that bench of all good
candidates to replace him.
Apparently doing the job well is too hard a way of ensuring job
security; driving away potential successors is easier. Doing it
to white male potential successors even brings brownie points
for his diversity public-relations initiative, so win-win!
Like I have said before, I am looking for a good INTC exit at
this point.
one more expensive acquisition screwed up ...
In my experience acquisitions only work if the business and
technology of the acquirer and the acquired company are tightly
aligned in a complementary fashion. Divergent acquisitions
(i.e. "hey, that seems like a cool business, let's buy someone
and get into it") are a recipe for disaster and a write-off.
Acquisitions that work on the technical/business level as
described above are still only worthwhile if the price paid
is realistic (i.e. no bidding war for hot, hyped company).
Cost of acquisition + cost of integration < cost of doing the
tech in house? If not, take a pass.
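That rule of thumb reduces to a one-line check. The figures below are hypothetical placeholders, not real deal numbers:

```python
# The rule of thumb above as a trivial check. All figures are
# hypothetical placeholders, not real deal numbers.
def worth_acquiring(price, integration_cost, in_house_cost):
    """True only if buying + integrating beats building in house."""
    return price + integration_cost < in_house_cost

print(worth_acquiring(price=500, integration_cost=200, in_house_cost=900))  # True
print(worth_acquiring(price=800, integration_cost=300, in_house_cost=900))  # False
```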
Intel, like Microsoft and many others, seems to have trouble
with both aspects.
So seven dwarves have banded together to try to steal more
crumbs from Snow White.
This changes nothing for me. It is a big noise transient on the supply rails
of the financial system. The only long-term losers are gamblers who bet on
the wrong squares on the derivatives and currency speculation roulette wheels.
Too bad, so sad, f***-em for playing the game.
My medium term plans are to reduce my INTC holding at favorable timing,
the same as yesterday.
Would you consider Cray's long string of big box dev contracts national
security? These generally led to commercially available products so I
count it as corporate welfare/industrial policy. I think the Fujitsu gravy
train is the Japanese equivalent.
Who is Fujitsu's customer for these supercomputers?
Probably mostly large Japanese institutions and companies.
Also, if OS/Linux support is a concern, why not just use x86/Xeon and/or Xeon-Phi?
Fujitsu receives government research support to keep rolling out indigenous
processors for HPC. Corporate welfare or national security concerns? That
is always a fuzzy line.
Once you are dealing with clusters and switches the overhead of
signal propagation becomes minor. Bits can move roughly 900 feet
in a microsecond in coax and nearly as fast in fiber.
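As a rough sanity check on those numbers: light travels about 983.6 feet per microsecond in vacuum, and the velocity factors of 0.8 for coax and 0.67 for fiber used below are assumptions, since actual cables vary.

```python
# Back-of-envelope signal propagation delay. The velocity factors
# (0.8 coax, 0.67 fiber) are assumptions; real cables vary.
C_FT_PER_US = 983.6  # speed of light in vacuum, feet per microsecond

def propagation_delay_us(distance_ft, velocity_factor):
    """Time in microseconds for a signal to travel distance_ft."""
    return distance_ft / (C_FT_PER_US * velocity_factor)

# A 100-foot cluster interconnect run:
coax_us = propagation_delay_us(100, 0.8)    # roughly 0.13 us
fiber_us = propagation_delay_us(100, 0.67)  # roughly 0.15 us
print(f"coax: {coax_us:.3f} us, fiber: {fiber_us:.3f} us")
```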
Fujitsu is going to ARMv8 for their supercomputer:
http://www.theregister.co.uk/2016/06/20/fujitsu_arm_supercomputer/
It seems like governments want to control the entire compute stack for their applications.
Since when is Fujitsu a government?
It is simple:
1) ARMv8 is cleaner than SPARC.
2) Going with ARM also means not dealing with Oracle any more.
3) HPC customers don't want Solaris, and Linux support for ARM is
likely better than for SPARC and will only pull further ahead.
PC shipments to rise 20% in 3Q16, say analog IC firms
Last shipments of PCs with OEM Win 7 is Oct 31st.
I am going to buy a new desktop before the deadline.
Xeon already has a huge chunk of the HPC market. These new monster
Intel parts are a means to fight fire with fire and leverage x86 software
compatibility to keep GPGPU from taking a lot of $ share away from x86.
Chuckles is a funny guy. Intel can do no wrong when it competes with
Nvidia but can do no right when it competes against ARM or AMD.
Depends on the price, doesn't it?
Price bombing BD against Xeon didn't work too well for AMD even
without the headache of an incompatible, niche ISA.
I'd rather Intel have at least some additional playing cards in hand, rather than pass up opportunities because Intel decided they weren't worth pursuing.
In my experience there are more worthwhile opportunities than an
established and profitable chip company has design teams to assign
to them. It is a question of weighing the alternatives and choosing
the projects with the best probable return on investment.
Personally I wouldn't look a gift horse in the mouth. This is positive news for Intel.
I hope your optimism is right.
If Intel thought the console business was a gift horse, why did it
let IBM Micro take the Xbox away? Why did IBM then let AMD take game
consoles from them? My point is there is a lot of chip business out
there that is marginal at best that good companies with alternatives
should avoid. I hope multi-sourced smart phone modem isn't another
console socket win for Intel.
I hope your optimistic read on this proves out.
I am kind of tired of Intel chasing money-losing sockets with
too-late-to-market products. Sockets that are low margin under
the best of circumstances.
The question is how badly Intel wanted to get a foot in the mobile door
at Apple.
It makes no sense to lose money with a customer that will drop you in
an instant to save a penny.
Windows 10. The "gift" that keeps on giving.
Climate models show
I design chips by doing circuit simulations using extremely complex
models of transistor operation. These transistor models have close
to 100 parameters most of which are curve fit from the extensive
measurement of countless test chips and process monitor inserts.
Circuit design itself must account for both systematic processing
variation and also random device to device variation using corner
simulations across hundreds of parameter combinations as well as
Monte Carlo simulations across hundreds of trials.
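The corner-versus-Monte-Carlo idea can be sketched in a few lines. This is a toy illustration only: the "gain" expression, parameter names, and spreads below are made up for the example, not any real transistor model (real models have close to 100 fitted parameters and a SPICE engine underneath).

```python
import random

# Toy illustration of corner vs. Monte Carlo analysis of a circuit
# metric. The "model" here is a made-up linear gain expression with
# hypothetical parameters, not a real transistor model.
random.seed(42)

NOMINAL = {"vth": 0.45, "beta": 1.2}   # hypothetical device parameters
SIGMA   = {"vth": 0.02, "beta": 0.05}  # assumed process spread

def gain(vth, beta, vdd=1.0):
    """Stand-in circuit metric; a real flow would run SPICE here."""
    return beta * (vdd - vth)

# Corner simulation: evaluate every +/-3-sigma parameter combination.
corners = []
for s_vth in (-3, 3):
    for s_beta in (-3, 3):
        corners.append(gain(NOMINAL["vth"] + s_vth * SIGMA["vth"],
                            NOMINAL["beta"] + s_beta * SIGMA["beta"]))

# Monte Carlo: hundreds of random trials drawn from the same spread.
trials = [gain(random.gauss(NOMINAL["vth"], SIGMA["vth"]),
               random.gauss(NOMINAL["beta"], SIGMA["beta"]))
          for _ in range(500)]

print(f"corner range: {min(corners):.3f} .. {max(corners):.3f}")
print(f"MC range:     {min(trials):.3f} .. {max(trials):.3f}")
```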
For all this thorough and detailed computer simulation, every new
chip looked at in the lab has some aspects of operation that don't
closely match computer simulation. Sometimes this requires a circuit
change, sometimes it can be accommodated by built-in trim.
This shows the practical limitations of computer modelling even in
an extremely limited and well understood regime of physical laws.
In contrast climate modelling is an extremely crude approximation of
many poorly understood physical, chemical, and biological processes
whose interactions are largely guesswork and hypothesis.
The potential effects of climate change are a serious business but
expecting reasonable predictive answers from computer models is a
tiny step removed from inspecting the entrails of a sacrificial goat.
Intel's HPC director and evangelist James Reinders is leaving Intel after 27 years
http://www.theregister.co.uk/2016/06/07/from_iwarp_to_knights_landing_james_reinders_leaves_intel/
Intel's HPC director and evangelist James Reinders is leaving Intel after 27 years - or as he puts it, 10,001 days - accepting the firm's offer of early retirement for long-standing employees.
Reinders describes how he joined Intel in 1989 to work on a VLIW (Very Long Instruction Word) processor called iWarp, designed to be connected into a cluster. It was the early days of a search for higher computing performance via parallelism rather than faster clock rates.
According to Reinders, Intel's work on parallelism eased back when clock rates surged again with the 486 and Pentium processors, but that was only temporary. Reinders became a tireless champion for concurrency as well as for Intel's compilers, libraries and other software development tools.
The outflow of talent from Intel under BK's watch is worrying to say the least.
I knew Lisa Su 20 years ago and knew then she would be a high tech CEO one day.......and SUCCEED.
Succeed? LOL, what would AMD look like if she had failed?
Servers sales slightly down in Q1
http://www.idc.com/getdoc.jsp?containerId=prUS41424716
According to the International Data Corporation (IDC) Worldwide Quarterly Server Tracker, vendor revenue in the worldwide server market decreased 3.6% year over year to $12.4 billion in the first quarter of 2016 (1Q16). This ended a seven quarter streak of year-over-year revenue growth as server market demand slowed due to a pause in hyperscale server deployments as well as a clear end to the enterprise refresh cycle. Worldwide server shipments decreased 3.0% to 2.2 million units in 1Q16 when compared with the same year-ago period.
Demand for x86 servers improved in 1Q16 with revenues increasing 2.6% year over year in the quarter to $10.6 billion worldwide while unit shipments declined 2.9% to 2.2 million servers. HPE led the market with 29.7% revenue share based on 5.5% revenue growth over 1Q15. Dell retained second place, securing 21.5% revenue share following a 1.8% year-over-year revenue decline.
Non-x86 servers experienced a revenue decline of 28.7% year over year to $1.8 billion, representing 14.7% of quarterly server revenue. IBM leads the segment with 62.7% revenue share despite a 32.9% revenue decline when compared with the first quarter of 2015. IDC also continued to track falling revenue from ARM server sales in 1Q16, with the HP Moonshot system deployments representing the largest single component.
ARM server sales also dropped in 4Q15
http://www.idc.com/getdoc.jsp?containerId=prUS41076116
ARM server sales fell in 4Q15 compared to the same time in 2014
So ARM server sales fell the last two quarters in a row, LOL. :-P
Major WTF here..
No kidding.
Associating with the Donald seems far off script from the Alan Alda/
feminist/diversity-is-wonderful/politically-correct bat signal BK has
been beaming since taking over.
ARM in servers - poor performance, high power, poor availability,
immature software support, lack of applications. Obviously ready
to conquer the market.
I'll wait for the savings from mobile to lead to a higher share price and see what happens in the memory business. From the statements above, the modem deal with Apple seems to be established (my impression). That may lead to a higher share price as well. Once that's the case, I'll at least trim my position.
That sounds like a reasonable strategy. I am also looking for a decent
exit point to significantly reduce my INTC holdings. My misgivings about
BK's poor leadership seem to grow with every passing month.
Skylake is a good product. IIRC you're coming off of a SNB, so it'll be a solid improvement for you.
I have a Haswell 4790 (not the k version).
The 6700(k) would be a marginal upgrade but I would like to have two recent
Win7 boxes in hand because any new laptop would be either Mac or Linux.
Thanks for the suggestions. I have seen reports that Win7 will never be
updated to properly support Kaby Lake so there may not be any incentive
to delay hardware purchase waiting for it anyway.
Kaby Lake will be in systems by the end of the year, and leaks show the desktop "K" part will arrive in Q4 2016
The last date for OEM sales of Win7 is the end of October.
I guess I'll have to go with Skylake for my "just under the wire"
desktop upgrade.
I am also kind of surprised they haven't leaked cherry picked Zen
benchmark results yet to whip up the hype and hold off remaining
faithful from buying Intel. It's not like they don't have their entire
future riding on this core or anything.
According to IDC Intel ships 99.2% of all server processors.
http://www.theregister.co.uk/2016/06/01/server_cpu_shipments_revenue_growth/
The numbers are from the tech market research firm IDC and confirm the dominance of Intel in server-class microprocessors, with AMD nowhere to be seen. Intel accounted for 93 per cent unit share in 2010 and 99.2 per cent in the last quarter of 2015.
On the bright side for AMD and ARM, they have no place to go but up in
servers.
I never suggested putting any poster on ignore.
Simply just don't respond to obvious trolling even if addressed to you.
Even blind psychotic obsessive-compulsive squirrels stumble over
an interesting acorn once in a while.
Server customers buy this "outdated architecture," no problem :-)
Server parts take much longer to qualify internally and at OEMs so there
is a big lag in uarch take-up between client and server silicon, especially
for the 4+ socket devices.
Skylake based client silicon has been shipping since last summer. Why
step backwards with a 140W part? I look forward to seeing what tweaks
are in the Kaby Lake uarch.
The $1723 part is the triple cheeseburger to allow more folks to
justify to themselves buying the $1089 SKU as being reasonable.
As an Intel shareholder I hope they sell a lot of them, even the
$1723 part. As a desktop PC enthusiast I wonder who in their right
mind would pay so damn much for an outdated microarchitecture in
a 140W device no matter how many cores it has. In fact I wouldn't
buy a 140W desktop MPU for any price no matter how many cores.
If I wanted more than a quad core mainstream then I would look
at a Xeon based workstation. At least then you get ECC memory.
He's just one of the forum perma-trolls. Ignore or smack down, the
choice is yours but the latter is generally not worth the bother.
Yeah but something changed.
What if Charlie turns out to be wrong, can you get a refund?
Charlie is NEVER wrong (just ask him).
Of course sometimes things change...
Why didn't Intel develop something like this years ago -
just big, fat, and dumb?
What's the market for this? What is the return on investment?
Back in the day TI invested millions in a LISP chip but Intel didn't.
Back in the day several Japanese companies invested millions in
making Prolog chips but Intel didn't.
Was Intel being "big, fat, and dumb"? Hindsight shows Intel was 100%
correct in not wasting money on those passing fads.
So is Intel being dumb here? I'll let you know definitively in ten years
but if I had to bet I would strongly say no. Google has a lot of money
and their tech guys have a big sandbox to play in. End of story.
BTW, how much money has IBM made moving individual xenon atoms
around to spell corporate logos? Should Intel have gotten in on that too?
Didn't JSIII first start yapping about focus many, many years ago?
Silicon talks, BS walks.
The GTX 970 in my i7 4790 box drives my 2560x1440 display just fine.
Who cares? Games are developed with an eye to general commonality
across consoles and PCs.
A PC with a decent 2 year old graphics card is far better than a console.
Getting one GTX 1080 is overkill today and future proof for many, many
years. Having 2 let alone 3 or 4 of these in a PC is just plain nuts unless
you want to heat your house this way.
Sorry skippy but the Arab spring and resulting messes in Libya, Egypt,
and Syria all happened under the mismanagement of Barry and Hillary.