Why Intel Better Not Be An Economic Bellwether This Time Around

Tyler Durden's picture

For those who missed it earlier, Intel reported results that were just slightly better than expected, and yet the stock tumbled over 3% after hours. The reason: despite a weak quarter, which had been pre-guided down by the sellside community ever so effectively, the semiconductor manufacturer saw even more weakness ahead in Q4. Those who wish to read the details can do so here. For everyone else of a more visual bent, we present the following chart of the year-over-year change in Intel revenue, which shows that for the first time in 12 quarters, INTC reported a decline in annual revenue. Furthermore, there is virtually no question that Q4 will also see a revenue decline: the only question is whether it will be greater than Q3's 5.5% Y/Y drop.
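The chart's metric is simple enough to reproduce. Below is a minimal sketch of the year-over-year calculation, using made-up quarterly revenue figures (in $bn) rather than Intel's actual numbers:

```python
# Y/Y revenue change: each quarter compared with the same quarter a year
# (four quarters) earlier. Figures below are illustrative, not Intel's.
def yoy_change(revenues, q):
    """Percent change of quarter q versus quarter q - 4."""
    prior = revenues[q - 4]
    return (revenues[q] - prior) / prior * 100.0

quarterly_revenue = [11.1, 13.0, 12.9, 13.0, 13.5, 12.8, 13.1, 12.3]
for q in range(4, len(quarterly_revenue)):
    print(f"quarter {q}: {yoy_change(quarterly_revenue, q):+.1f}% Y/Y")
```

A 5.5% Y/Y drop, as in Q3, simply means this quarter's revenue was 94.5 cents on last year's dollar.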

Why is this notable? Because the last time Intel posted even one year-over-year revenue decline, let alone two consecutive ones, a recession turned out to have already been underway for nearly a year, having begun in December 2007.

Intel is regarded by many as the canary in the coal mine of the ever so critical tech sector, which is itself a proxy for high-margin electronics products, and from there for marginal demand in the economy as a whole. One can hope that the central planners have this latest recessionary confirmation under control (and with China sternly refusing to join the easing party, the conclusion so far is that they absolutely do not). Otherwise the market will find itself at such a disconnect between the synthetic, correlation-driven value implied by the ES and SPY and cash-flow intrinsics that not even QE?+1 will be able to save the policy vehicle that is the market, let alone the economy.

Dr. Engali's picture

Bullish.......Just buy the fuckin chip.

Publicus's picture

Startup sales appear to be slowing down now. Carnage is near.

Ahmeexnal's picture

Strike One: cartel bullion banks were $10 away from commercial failure.


The battle rages on.

Jack Sheet's picture

The "London Trader" would appear to be North American judging by the style of writing. ("We had already gotten to the point..."). Unless it is Eric King's transcription.

sotto's picture

Oh yeah, and QE3 will only make matters worse. Perhaps they will stall the collapse for a while with it, but the pain will only be that much greater later on when consumers are hit with reality and can't borrow and spend any more.


Gringo Viejo's picture

OFF TOPIC QUESTION TO THE BOARD: Is DRUDGE down as of 9:45 pm? Can't connect................

Dr. Sandi's picture

Drudge is down 3% based on decreased Q3 earnings.

GeezerGeek's picture

Intel needs an immediate release of a CPU cheaper and better than their latest Ivy Bridge processors. Something to challenge the Raspberry Pi pricewise.

Meanwhile, the major political parties fight over which will get to sell the American sheeple the latest economic process, code-named Brooklyn Bridge.

chump666's picture

Nasdaq sinking, down 2.42 on the month. Asia sinks first, should take out US indexes. A 'banking' rally a la Obama's spend-up...is frightening.


fonzannoon's picture

Chump I hope that you and yen and slaughterer and the rest of the good guys on here that trade this mess have your backup plans made. When the house is finally forced to pay up big they are just going to fold up shop.

chump666's picture

Yeah, I think so. Don't worry, I am trading this to the end. My fear is a derivatives meltdown where ETFs/futures all become illiquid very quickly. Accounts shutting down, trades not paying out, etc. I still think the TVIX debacle early this year was a lovely warning sign of what happens when hedged/leveraged/derivative players (connected to funds) suddenly become illiquid.

I also think FX is a worry with volatility/volumes compressed. All in all we should enter a very volatile period where I, and I suspect the others, thrive, as you trade off bigger players shifting positions quickly. But after that...who knows? In a year, or more, this current market will cease to exist.

chump666's picture

Meantime China is hedging against rampant inflation, like Brazil did in the early 90's, i.e. leaking cash and buying USDs and offshore investments to store value.


fonzannoon's picture

they should just float the renminbi. their citizens will love the deflation that ensues.

Awakened Sheeple's picture

Fortunes are made during times like these. It's too good of an opportunity to sit on PMs and wait it out with ALL of your savings. Though I admit, my trading funds consist of my "play money"

flyonmywall's picture

Hungry people do not buy computers. People looking for a job do not buy new computers.

Around where I live, there is a place that sells used computers for about 100 bucks, and another $30 for a monitor. Used computers do not generate sales for Intel.

This really should not be all that hard to figure out for an analyst with half a brain. Note that this excludes most analysts.

fnord88's picture

Exactly. It used to be that you had to buy a new computer every 12-24 months because software outpaced hardware so quickly. I used to be one of those who went to the shop, bought $1500 in parts every 18 months and built myself a new fast PC. These days, if you don't play games or do 3D design work, a 4-year-old PC is still fine. Bought an SSD for less than $1 a gig last month, and a computer with a 3-year-old chip is now super fast again. I don't need a new chip, new motherboard, new graphics card, new case, new RAM etc. My mum got my old Core 2 Duo 3 years ago, and it is still more than enough for her basic web browsing. My Sandy Bridge laptop is fine 12 months in; I will not even contemplate replacing it for another few years. Suppose I could join the queue and get an iPad, but I have yet to work out what the fuck I need one for.

Having to spend less should be good for the economy, as it frees up cash for me to spend on other, more important things. But in this bizarro world, saving money is bad for the economy.

pavman's picture

You obviously aren't getting in on the new currency, bitcoin.  Come on man, everyone can be their own central bank if they go out and buy 4 screaming ATI cards, a cheap cpu and start mining their own hashes.  You really want to be left behind when digital currency comes of age and you don't have two bitstrings to rub together?

In fact, the biggest worry with bitcoin is that it could end up suffering a deflationary spiral. But by design, that can't happen. Yep, you read that right. Pretty awesome, IMHO.

A distributed, decentralized, anonymous internet currency.  Awesomeness in action.
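For what it's worth, the "by design" claim above rests on Bitcoin's hard-coded issuance schedule: a 50 BTC block subsidy that halves every 210,000 blocks, so total issuance converges geometrically to just under 21 million coins. A quick sketch of that sum:

```python
# Bitcoin's issuance schedule: 50 BTC per block, with the subsidy halving
# every 210,000 blocks. Summing the series gives the ~21M coin supply cap.
def total_supply():
    subsidy_satoshi = 50 * 100_000_000  # initial block reward in satoshis
    total = 0
    while subsidy_satoshi > 0:
        total += 210_000 * subsidy_satoshi
        subsidy_satoshi //= 2  # integer halving, as in the reference client
    return total / 100_000_000  # convert back to BTC

print(total_supply())  # just under 21 million
```

The supply cap is fixed, which is precisely why critics argue deflation is built in rather than designed out.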

fnord88's picture

Except it's backed by nothing. Nothing at all. No government. No military. No tradition. Plus the security sucks; people have their bitcoins stolen all the time. Like the idea of it, but the fact that it enables people to avoid tax means the powers that be will crush it.

NidStyles's picture

Decentralized it is not.

El Hosel's picture

...  Currency markets are gapping around; the dollar looks like the loser for now. Anybody have a clue what "the market" is seeing? LOL - WTF

seek's picture

Both Intel and AMD pre-announced revenues back in early September, each showing roughly a 10% y-o-y decline. The mature markets are already dead, and the emerging markets tanked in Q3, China leading the way.

Some analysts have been trying to attribute it to tablets (iPad etc.), which would be fine, except there's no way Apple is posting gains at a level high enough to account for this (and indeed, given the Apple sell-off this week, one has to wonder whether the bad news has leaked and has something to do with it).

It's pretty obvious based on all but a handful of indicators we're looking at a Q4 2008 level recession that's being papered over pre-election. A lot of companies are juicing things by stuffing or playing games with product mix, all of which will come home to roost Q4, if not in this quarter's results.

2013 ain't going to be a happy year.

GeezerGeek's picture

2013 ain't going to be a happy year. If Romney wins in November the numbers will stay good through January. Then the February numbers will be revised downward to match reality. Dems will scream about what a piss-poor job Romney is doing. If Obama wins, the numbers will stay good until the rioting begins, at which time Obama will claim to have been too distracted by the election to fix things, and besides it was all Bush's fault anyway.

Neither has a plan that can handle reality. The American people will pay the price.

nmewn's picture

Whaaahhh...where's my debt binky?

Come on Benny...I'm stackin, er, countin on you ;-)

monopoly's picture

And as expected, futures are leaning to the bullish side. I am glad I stopped shorting a long, long time ago. The most traction truth ever gets is 4 or 5 days, and then it's right back up. On the bright side, this is my first "Currency War". What a way to invest. I forgot what a "free market" is.

Well, I will watch tonight's carnival, ehhh, debate. Got the popcorn, Diet Coke and the volume on. Never do that during market hours. Just more drivel from both of them. Oh well.

Quinvarius's picture

QE3 + Win8 upgrade cycle + 4% dividend + down from 30 on the warnings makes this a hard thing to short.  I am sure the usual players will have a go at it. 

Tijuana Donkey Show's picture

Windows 8 = floating turd. Why do people need to upgrade? For the Surface? It's the Zune redux: the price of an iPad, but not as good, and now your tablet can catch a virus or spyware drive-by on the go! I think Intel is slowly leaking share to ARM and other processors for mobile. Web-based apps have broken the PC cycle; all you need is a Linux distro on an older PC, and you can rock out on the web.

Quinvarius's picture

Are you going to buy old stuff now, or wait a few months for something new? Upgrade cycle. That is just the way people think.

jmc8888's picture

There is no Win8 upgrade cycle. We've all seen it before: Win 95 was good, Win 98 sucked, Win 98SE was better by default, Windows ME sucked, Win XP rocked, Vista sucked, Win 7 rocks. The thing is, many are still using Win XP. It's a good OS. Those who wanted to upgrade went to Win 7, which for Windows OSes is by far the best. Many people hate MS Windows of any variety, which is very understandable. But those who can tolerate or like Windows are still on the XP/7 bandwagon. Those who upgrade from XP will still choose 7. The reason? All indications are that Win 8 blows hard.

One of the aspects of Windows that many people liked was that it was an open system for software devs. Win 8 isn't, nearly as much. Quite a few major devs have already said they hate Win 8 for this reason, and did so weeks before MS even set a launch date. This is pretty shocking, as usually when an MS OS turns out to be a bum one, influential people in the sector aren't on record until AFTER it comes out.

Others note it is basically a smartphone interface, but on your computer. It was made to be a combined mobile/desktop OS. There's synergy for you. The problem is, people don't want to use their desktop like a smartphone. Win 8 is probably going to bomb as badly as Windows ME, maybe worse. It simply doesn't make sense to take an interface designed for touch screens and limited space and put it on much bigger displays (in some cases HDTVs) driven by a mouse, while limiting functionality at the fingertip. Plus there is no reason to upgrade from Win 7. Absolutely none.

There is still little reason to upgrade from Win XP, especially for those on the 64-bit version. Of course gamers want to play DX11 games, but Win 7 covers that. Overall there is no reason to upgrade because of Win 8. If anything it's a much tougher sell than every other MS OS release thus far.

Not only is Win 7 already out competing with it, a good MS OS that is fully functional, but with XP still in the mix there are TWO OSes that more or less are. That's unprecedented. Win 8 won't even get people to leave Vista if they still use it.

Worse yet, it's looking like many people are going to hate Win 8, at least on their desktops. For mobile devices it might be pretty good, but for desktops and laptops, which is where most Intel chips go, it will not drive Intel sales. There's a lot of talk that people who buy a Win 8 computer (just because they are upgrading) will downgrade the OS to Win 7, and vendors might even let them choose so they don't have to do it manually. It's been done before, with Vista to XP. So this looks set to ramp up early and often, with an even bigger reason to downgrade than the last time it was allowed at the retail level.

Then you look at Intel itself. There's no reason to buy a new chip if you have bought one of the iX series. As one person stated earlier in the thread, even a Core 2 Duo is holding up for the lay user. He mentioned having an Ivy Bridge, but even the first iX series (Bloomfield) is holding up well.

If you bought an i7 920 for under 300 bucks in Nov 2008, or the D0 stepping in 2009 (same chip, just a re-spin), you have a chip that can still pass every test and will be able to for years. The i7 was quite a leap in performance from the Core 2 Duo, and even now, almost four years later, an iX series chip coupled with a recent video card can still easily achieve top DX11 graphics fidelity at 60 fps in any game. If someone wants a better gaming rig and has an i-series Intel chip, all they need to do is upgrade their video card. Intel did such a good job with the iX series that the upgrade cycle has been significantly lengthened. Four years after introduction, anyone using the lowest i7 chip, which cost 75 percent less than the top-end i7, can still upgrade a non-Intel part and skip the rest of the upgrade cycle. Remember, you could overclock that 250-300 dollar chip and get 10-15 percent better performance than the 1000 dollar chip at stock speeds. In other words, short and sweet: for most people, instead of a full upgrade cycle, all that is needed now is a new video card, and maybe a new hard drive, which of course can happen at any time irrespective of anything else.
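The price/performance point can be made concrete using the comment's own figures (a roughly $280 i7-920 overclocked to about 10 percent above a $1,000 flagship's stock performance; these are the commenter's claims, not benchmark data):

```python
# Perf-per-dollar using the comment's illustrative figures, not measured data.
budget_price, budget_perf = 280.0, 1.10      # overclocked budget chip, ~10% above flagship
flagship_price, flagship_perf = 1000.0, 1.00  # flagship chip at stock speed

budget_value = budget_perf / budget_price     # performance per dollar
flagship_value = flagship_perf / flagship_price

print(round(budget_value / flagship_value, 1))  # ~3.9x better perf per dollar
```

On those numbers the cheap chip delivers nearly four times the performance per dollar, which is the commenter's whole argument in one line.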

But it gets worse for Intel. We all know that multicore chips allowed for multitasking so the browser and OS don't get in each other's way, and certain thresholds have been reached in the resources a lay internet user needs to surf. The need to upgrade just to keep web browsing or email pleasurable (or simply not annoying) isn't there. It will be a while before they feel a need to do so, and that is a long-term proposition. Once they upgrade again a few years from now, who knows how long it will be before they need another one; probably until breakdown, or until they are too lazy to do a fresh install of Windows or another OS. So that sector is slowing down. Same thing for general office computing: Microsoft Office and the like are not the resource hogs they used to be; they barely register on today's computers. So the upgrade cycle for businesses isn't there either. Plus, one of the dirty secrets has always been that a fresh install of Windows will speed up many computers bogged down from years of installing and uninstalling software. So whether it's the lay person or the usual office, there is no reason to upgrade if it's happened in the last 4-6 years, and many of them won't need to for at least another 4-6 years.

Then you have a lot of the power users. Whether for gaming or graphic design, a lot of this rides on the graphics card. As time has gone on, the CPU has become less and less important to gaming. So again, if one needs better gaming or better graphic design performance, a graphics card upgrade will solve the problem. PCI Express 3.0 has been out for a while now and isn't even needed yet; PCI Express 2.0 is still enough.

It seems the only ones that need to constantly upgrade both computers and pipeline access are the HFTs. Somehow I think that has already reached saturation, so it probably isn't going to help Intel much there either.

Of course Intel is trying to eventually meld the two, and as you can see, it is important for them to do so. 

So even besides the economic situation, Intel is facing some terrible headwinds.  While AMD is having troubles, at least they have ATI.

Let's recap why Intel will most likely struggle going forward.

No Win 8 upgrade cycle

If you're a gamer, a casual lay internet/email user, or an average office user, there is absolutely no reason to buy a new computer with an Intel chip if you have already bought one in the last four years (up to six years if you're a casual/office user). Worse yet, nothing will change that for probably another 4-6 years. GPUs, not CPUs, are the only upgrade some people need. Plus the Great Depression 2.0.

If anything, people are still taking up the iX series, but we're on the back end of that. Once people have migrated to it or a similar AMD CPU, the overall need to upgrade will drop even further. There's a pretty good chance that if you bought an i7 in 2008, then with a GPU upgrade in 2015 you'll still be able to game at ultra settings/1080p/60 fps, and that is supposed to represent the enthusiast, the easiest customer sector to sell to.

Finally, to end: when it comes to gaming, there are generally certain ramp-ups. As people have noticed, the video game console cycle comprises the 360/PS3, which are from 2005/2006, and the upcoming Wii U. Even 'next gen' consoles won't be 'next gen', because there really isn't a next gen coming for some time. Even a top-end PC graphics card 3 years from now will be hard pressed to be called 'next gen'. The Wii U and the PS4/Xbox 720 won't be 'next gen', nor have as much raw power as those PC GPUs. Basically, if you want to know what will be called 'next gen' and be the standard for the next few years, it's going to be the Frostbite 2 engine, CryEngine, and Unreal Engine 4. These will be the main engines of what is termed 'next gen', even though true next gen will be either beyond this or heavily upgraded later iterations of these engines. Why do I end my post with this info? Because the 2008 Intel i7 CPU with an Nvidia GTX 670 from a few months ago can tear through these engines. Whatever Nvidia comes up with in its 7XX or 8XX series will easily be paired with an Intel i7 chip bought possibly as early as 2008 and be able to competently serve the enthusiast sector until late in this decade. Food for thought when you realise the 2008 CPU can already run what will be the gaming standard of the 2013-2018 timeframe, and perhaps even to 2020. I'm sure many will upgrade before 2020, especially enthusiasts, but as you can see, the 'need' to do so simply isn't there. Not with such great CPUs out there, not with lowered and lengthened gaming cycles, coupled with the ability to upgrade the GPU, which provides most of the gaming benefit. Of course those who upgraded after the original i7 have even more under the hood.

Quinvarius's picture

LOL. The issue is you wait until the product is released before making a purchase. It is not that people always run out and upgrade. Even I am waiting for an MSFT Win 8 tablet. It has more to do with human nature than technology.

Yes.  Win 8 upgrade cycle will happen.

NidStyles's picture

Win 98 sucked now? WTF is with all of you Win 95 turd suckers. Did any of you even use Win 95? It was a steaming pile of turds.

Matt's picture

Intel is going to be pushing hard into the mobile market, but with over 6 billion people already owning cellphones, it is a pretty saturated market. Lots of them only talk and text, so there may be room for smartphones to grow out a fair bit.

As far as desktop/laptop processors go, the Steam Hardware & Software Survey shows 48.8% of users still running dual cores; there is a fair chunk of people who may upgrade to newest-generation hardware.

The video game world has become very console-centric; depending on what the next generation of console hardware offers, who can say what new innovations requiring more processing power may develop?

For casual end-users, etc., mobile chips are likely going to be the big market, with much lower power consumption being one of the big drivers. Intel will be producing 14nm chips while ARM, AMD, etc. will still be on 28nm. The new foundry will have much lower production costs, so they can conceivably improve margins while lowering prices.

Oliver Jones's picture

Intel has also pissed on their own market by making sure that only the mainstream gets to see new technologies first, while the enthusiast sector has to wait another 18 months for the same architecture to make it to market! To be honest, it is difficult to be enthusiastic about buying a Xeon E5 Sandy Bridge machine today, when Xeon E3 Haswells will be hitting the market in the first half of next year. By doing this, they have practically killed any kind of enthusiasm for their products, because the early adopters - who usually tend to buy high-end stuff - aren't early adopters anymore: They've been forced to wait until the technology is mainstream.

Intel also screwed up the Sandy Bridge chipset, and as they only update the platform for each new microarchitecture (leaving it identical for the die shrink/process change), Ivy Bridge-based Xeon E5s will be a write-off before they have even launched: no USB 3, only partial SATA 3 support, and still stuck on PCI Express v2. Anyone seriously looking at a dual-processor platform is going to be waiting until Q4 2014 at the very earliest, and they will be keeping their money in their pockets until then.

buzzsaw99's picture

bennie can fix this

q99x2's picture

Where's Arthur Anderson when you need them?

Obama drove us into recession.

Gary Johnson for President.

Feinstein for Zimbabwe senate.

Schumer for incarceration.

pragmatic hobo's picture

uh, ... in all fairness obama did not drive us into recession. as I recall, the second coming of the great depression was well underway when obama took office. He did make things worse by appointing geithner as treasury sec and reappointing bernanke as head honcho of the banking mob, then proceeding to destroy the middle class by hocking the country to bail out the banking cartel, but obama did not drive us into recession.

Dr. Sandi's picture

Obama is merely the relief driver on the bus to financial Hell.

GeezerGeek's picture

Obama was one of those Democratic senators who voted for the spending and regulatory excesses that got us into the current mess. Remember, Dems had complete control of Congress starting in January 2007. Not to ignore the negative contributions of the not-at-all-conservative Bush, but Obama made his bed as a senator, got elected president, and has to sleep in it. Serves him right.

orangegeek's picture

Another exercise in accounting - earnings up, revenue down - the Q3 storyline. The Q2 storyline repackaged for Q3.

And what doesn't get a pounding in Q3 will get a pounding by Q4.

bugs_'s picture

divvy cut

possible acquisition of a larger "arm" chip manufacturer which will be a public concession of the end of x86 dominance

a return of architecture wars?

pragmatic hobo's picture

in the coming era of virtual machines, chip architecture will become a moot point. That, in fact, is the greatest threat to Intel's viability.

Dr. Sandi's picture

Intel is desperately trying to get into mobile computing. They just haven't figured out how to actually make a product the rest of the industry wants to call their own.

TruthInSunshine's picture

Intel's failure to get an oligopolistic foot in the door of mobile computing and build a moat to suppress, squash & otherwise drown potential competitors (which are now well established and building highly advanced, cost-effective, and even more efficient processors than Intel's) will turn out to be the biggest strategic blunder in Intel's history.

The mobile processor market is the Wild West, with a few notable cowboys shining prominently early, and Intel ain't one of them.

GeezerGeek's picture

Concurring, I think everyone should consider how Intel tried to stamp out AMD back in the pre-Pentium days. AMD would have died had they not won a court case that allowed them to build processors based on the x86 instruction set indefinitely, and Intel would have been left with essentially a monopoly. None of the mobile chip makers want to be at Intel's mercy going forward.

Conman's picture

Honey badger market don't care.

ZFiNX's picture

Diminishing marginal demand for faster processing chips, that's all it is. Eventually computers were going to get fast enough that we didn't care whether it was 10 GHz or 20 GHz; it plays our YouTube all the same. So now they've probably sunk too much capital into researching things we don't really need. I don't expect their bottom line to diminish, but they need to spend a lot less money on R&D at this point.

EDIT: With the exception of mobile.

GeezerGeek's picture

It's not so much the speed of a chip as it is the throughput, which is why AMD clobbered Intel when AMD led the way into multi-core chips. CPUs are still capped at around 4 GHz for the general user, but multi-core and multi-threading architectures are delivering ever greater processing power at diminishing prices. Software is, unfortunately, still behind hardware in this area. Applications, and even operating systems, have been slow to incorporate all the architectural advances. Mobile chips seem to have led the way into the system-on-a-chip area, which AMD and Intel have adapted to desktop and notebook processors.
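The software-lagging-hardware point has a classic quantification in Amdahl's law: if only a fraction p of a program's work can run in parallel, n cores can speed it up by at most 1/((1-p) + p/n). A quick sketch, with a hypothetical 50%-parallel workload:

```python
# Amdahl's law: upper bound on speedup from n cores when only a fraction
# of the work parallelizes. The 0.5 workload split here is hypothetical.
def amdahl_speedup(parallel_fraction, cores):
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

for n in (2, 4, 8, 64):
    print(f"{n:>2} cores -> {amdahl_speedup(0.5, n):.2f}x")
```

Even with 64 cores, a half-serial program tops out below 2x, which is why poorly threaded software blunts the value of extra cores.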


ZFiNX's picture

You said a lot of interesting things in your post. My question, given the lagging exploitation of the technology (i.e. the lack of good software), is at what point does it become unnecessary to produce "faster" or higher-"throughput" chips? Surely it cannot go on ad infinitum.

Matt's picture

Die shrinks will end soon, since you cannot get smaller than 1 atom across. 3D transistors will be the next big jump.

As for whether there is a limit to how much speed can be used productively: we are nowhere near a machine that can pass the Turing test yet. Just wait till you have a Siri-esque assistant responding in real time, able to handle your accounting, scheduling, etc. like a real live secretary, and cars that can drive themselves, much more safely than the average human.

On the massive end of things, an end to improving performance would certainly hurt big innovations. Sequencing DNA, processing the results from CERN, etc. all get exponentially faster with performance leaps. HD MRI and a complete map of the human brain will allow big improvements in treating neurological disorders.

I guess the answer to your question, "at what point does it become unnecessary to improve performance", is when there is nothing new to innovate on, and all requested tasks are completed as fast as the human operator can make the requests.

Jack Sheet's picture

So you believe the reported revenue figures? Greetings from Enron.