Guest Post: Apple Succumbs To Battery Chemistry?

Tyler Durden

Submitted by Sabregold1999

Apple Succumbs To Battery Chemistry

In light of the news that Apple is issuing a dividend with the stock flirting with all-time highs, it might be a good time to assess where Apple stands with its two flagship products, the iPhone and the iPad. There is no arguing with the success of these products, but that is not the real story that needs addressing. The real story for Apple is battery chemistry, and, much like the automakers, it is losing that battle.

Apple, like a lot of gadget makers, needs new iterations to generate buzz. If the newest product lacks significant improvements, the growth model suffers. For Apple, any and all great innovations on the hardware side will be limited, simply because battery chemistry, unlike Moore's law, moves at a snail's pace. Here is the evidence.

Let’s examine the latest offering, the iPad 3. Apple was able to increase the size of the battery in the device by about 70%. It did this by engineering a more efficient internal layout. It did NOT increase the energy density of the individual lithium-ion cells. Why was all of this done? Two reasons.

4G eats batteries, and that new screen hogs power. So, without the re-engineered layout, the iPad 3 would have been much larger, instead of only slightly larger.
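As a sanity check on that 70% figure, here is the arithmetic using the battery capacities widely reported in teardowns (the watt-hour numbers are assumptions on my part, not from the article):

```python
# Sanity-checking the ~70% battery increase. Capacities are widely reported
# teardown figures, assumed here for illustration (not from the article).
ipad2_wh = 25.0    # iPad 2 battery, watt-hours (assumed)
ipad3_wh = 42.5    # iPad 3 battery, watt-hours (assumed)

increase = (ipad3_wh - ipad2_wh) / ipad2_wh
print(f"capacity increase: {increase:.0%}")  # → capacity increase: 70%
```

Note the increase came from fitting a physically larger battery, not from denser cells, which is exactly the article's point.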

Why is this important? If you examine a laptop, there is a reason the battery is as large as it is relative to the device. A laptop draws significant power, and lithium-ion batteries can pack only so much energy into a defined space. Design engineers at Apple and Samsung know perfectly well that asking your phone or tablet to replace your laptop comes with one big problem no one has solved: battery chemistry.

A company like Apple is truly constrained in what it can do going forward with the iPad and iPhone franchises. Computing power comes at a price in small packages, and with the latest revelation that energy densities did not improve, an investor might conclude that the next iterations will have to contain extraordinary software rather than hardware developments.

Oh, and it is for this reason that this author believes Apple is tackling the TV market next. Stay tuned.

Comments
Hard1

Buy this HOT stock

hedgeless_horseman

Post moved to iFry at the request of Apple Marketing Dept.

Triggernometry

The iPad 3 has the same CPU as the iPad 2; the difference in power consumption comes from a display which has a million more pixels than a 1080p television, and a quad-core GPU to drive juiciness into said display.  Therefore I disagree that the only improvements will come from software, as the processor can be improved in the next iteration; not to say that software won't need revising.  Apple can squeeze a bit more, but is ultimately constrained by power density and intensity.

As a disclaimer- I do not own any Apple products or stock.

UnderDeGun

For some reason I can't see your point as it relates to the focus of the article. You can tweak the engine of a car, but you ain't gonna get many more miles without a bigger gas tank - especially as you are still towing the three-ring circus behind you.

Pool Shark



Battery solution:


slaughterer

Bet AAPL turns over at $666 pps almost exactly.      

Carl Spackler

How soon before the liberal Trial Lawyers Association coordinates a consumer product litigation strategy against the deep pockets, on behalf of the poor people who got burned (the Apple Crisps), and the stock turns the other direction following a massive class action suit?

SheepDog-One

I don't know what's up with the 5-minute WTI chart, but you're not going to be paying any less for gas, that's for sure.

Some story I caught earlier about ICE stepping in with limits on oil and gas, I don't know, it's all a total clusterfuck.

SilverTree

......aaaaand it's alive again.



icanhasbailout

Maybe efficient programming will come back in fashion? The "just throw more hardware at it" philosophy of computing has run its course, I hope.

Joe Davola

Nah, just delve deeper into the compiler optimization flags.

metastar

My PC has gigabytes of RAM (way more than 640K) and inconceivable CPU speed, yet my old fat DOS applications ran faster than many of today's best web apps. I can't believe how long it takes some of these things to load.

Programmers today have no idea how their bloated frameworks operate internally and would be completely lost if they had to write efficient code. So instead we crank up the capabilities of the hardware to compensate.

Id fight Gandhi

Hate Windows and its bloated, shitty software. Vista was a prime example of this.

12ToothAssassin

There is more money to be made with the hardware industrial complex.

Totentänzerlied

That is inherent in the design and nature of higher level languages, especially scripting languages and cross-platform runtimes/VMs, and with fancy GUIs, animations, graphics, and features in general. Higher level languages are essential to what computing is today. You know that.

Lynx is 4.3 MB with 2 dependencies, Links is 3.6 MB with 8 deps, and Firefox is 26.8 MB with 15 deps (i686 Arch Linux). It is not hard to see the progression of features and graphics.

It's Wirth's law: "Software is getting slower more rapidly than hardware becomes faster."
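Wirth's law is easy to demonstrate in miniature: the same computation, buried under layers of indirection, runs measurably slower even though it produces an identical result. A contrived Python sketch (the `Box` wrapper is a stand-in for framework bloat, not any real library):

```python
import timeit

def direct_sum(n):
    # The "1990s" way: one tight loop, no indirection.
    total = 0
    for i in range(n):
        total += i
    return total

def layered_sum(n):
    # The "framework" way: the same arithmetic buried under
    # per-element object wrappers and method calls.
    class Box:
        def __init__(self, v): self.value = v
        def add(self, other): return Box(self.value + other.value)
    total = Box(0)
    for i in range(n):
        total = total.add(Box(i))
    return total.value

n = 100_000
assert direct_sum(n) == layered_sum(n)  # identical result...
t_direct = timeit.timeit(lambda: direct_sum(n), number=10)
t_layered = timeit.timeit(lambda: layered_sum(n), number=10)
print(f"layered/direct slowdown: {t_layered / t_direct:.1f}x")  # ...very different cost
```

The hardware never changed between the two runs; only the amount of machinery between the programmer's intent and the arithmetic did.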

Rynak

I don't know: high-level VMs do exact a high price, yes. But you see, the concept of a "VM"..... the conceptual model.... isn't really something that bloats up forever. HL languages by themselves (excluding the bloat of ever-larger standard libraries) impose a high but STATIC cost.

So, even considering the lower efficiency of HL languages, I do not think that in the long term they will be the main issue.

What will be the issue is plain bloat of application features, programmer laziness, libraries, and giant frameworks.

Bottom line: while HL VMs do have a high cost, the cost of the VM itself is mostly constant and "one-time".... while the cost of what those HL languages are used for (including the stuff bundled with them) ever increases.

Basically, the long-term enemy really is just applications and middleware... not the machines that run them.
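Rynak's argument can be put into a toy model: treat the VM as a constant slowdown factor and application bloat as a compounding growth rate; the growth term dominates very quickly. The factor and rate below are made-up illustrative values, not measurements:

```python
# Toy model: the VM is a constant factor; application bloat compounds.
# All numbers are made up for illustration.
VM_OVERHEAD = 10.0   # assumed constant slowdown from the HL language/VM

def total_cost(year, base_work=1.0, bloat_rate=1.3):
    """Relative cost of running the app in a given year."""
    app_work = base_work * bloat_rate ** year   # features/frameworks pile up
    return VM_OVERHEAD * app_work

for year in (0, 5, 10, 20):
    print(f"year {year:2d}: relative cost {total_cost(year):8.1f}")
# The VM's factor never changes; all of the growth comes from the application side.
```

However large the constant factor is, it is paid once; the compounding term is what eventually swamps hardware gains.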

Papasmurf

Moore's law says that while transistor count can double every two years, software inefficiency will deteriorate at the same rate, yielding constant performance.
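Papasmurf's tongue-in-cheek corollary, worked as arithmetic: if hardware doubles each generation while software efficiency halves, perceived performance stays exactly flat:

```python
# The joke as arithmetic: hardware doubles, software efficiency halves,
# perceived performance is flat, generation after generation.
perceived = 1.0
for generation in range(5):
    hardware_gain = 2.0    # transistor count doubles
    software_loss = 0.5    # code needs twice the work for the same result
    perceived *= hardware_gain * software_loss

print(perceived)  # → 1.0
```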

Rynak

Moore's law is based on miniaturization, which worked for more than 50 years... so it was taken as "this will work forever".

But if you look at processors in the last 5 years... and compare cycles per second... it doesn't look like 2x every 2 years anymore, does it?

And yet real performance (ignoring software bloat) has managed to stay close to doubling.

How is this possible? Answer: recent improvements in speed are no longer primarily driven by miniaturization.... but increasingly by better design.

So it is hardware design, rather than miniaturization, that is currently keeping Moore's law alive - if it were just about miniaturization, Moore would have been dead years ago.

But the problem is: even though rethinking chip design is long overdue... there are diminishing returns to this as well. So, if in ten years or so we reach near-optimal circuit-board based chip design.... what will rescue Moore then?
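To put rough numbers on the plateau Rynak describes: extrapolating clock speed from the early 2000s at a Moore's-law pace wildly overshoots what actually shipped. The GHz figures below are ballpark assumptions for illustration, not exact data:

```python
# Ballpark illustration (all figures assumed): what clock speeds would look
# like if they had kept doubling every two years after 2002.
ghz_2002 = 3.0            # high-end desktop clock circa 2002 (assumed)
years = 10                # 2002 -> 2012
naive_2012 = ghz_2002 * 2 ** (years / 2)   # "doubles every two years"
actual_2012 = 3.5         # typical high-end clock circa 2012 (assumed)

print(f"Moore-style extrapolation: {naive_2012:.0f} GHz")   # → 96 GHz
print(f"What actually shipped:     {actual_2012:.1f} GHz")
# The gap was papered over by design: more cores, bigger caches, wider pipelines.
```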

Uncle Remus

If the demise of craftsmanship is any indicator, no.

Rynak

While inefficient programming certainly is the major reason for current computing inefficiency (mainly because the supposed "programmers" department of corps isn't really staffed by programmers... but rather by people capable of adding glue between "software modules" bought from elsewhere.... which is also one reason why current software is so bloated: they buy a bunch of "feature collections", and then use SOME of the features... thus bundling an entire toolset, of which they use maybe 10% of the tools)...

....another reason, especially on x86, is an inefficient hardware architecture. See, back in 1990, computers contained a bunch of highly specialized chips... there usually was a dedicated chip for graphics, one for sound, one for managing the entire flow of the chipset... and then a general-purpose processor. Heck, even external hardware like floppy drives came with its own processor and RAM. Basically, it was more a web of autonomous, interacting actors... instead of a central "god" managing and executing everything.

With the rise of the PC, most of this was thrown overboard: a central general-purpose processor, one that can do everything but nothing efficiently, was used to manage and do basically everything. It would control disk data transfer, HDD data transfer, nowadays USB data transfer, graphics, number crunching, memory management, sound, everything... a single general-purpose chip, doing it all.

That surely made stuff cheaper. Now you only needed one central processor to control the entire computer and all connected gadgets. Thus, all the other components of the computer no longer needed to be "autonomous", as in capable of managing themselves.

It also made programming somewhat easier, because of the lack of concurrency: if there are no other "actors", your program code only needs to consider its own actions (single-threading, basically).

However, as this architecture gained popularity, and speeds and features multiplied, it became slowly but continuously apparent that this "one to rule them all" model doesn't really scale well.

Why? Because it totally eliminates specialization and maximizes data-transfer latencies.

It eliminates specialization because, if a single chip has to do everything, it obviously has to be a jack of all trades: it will do EVERYTHING less efficiently than, say, a chip dedicated to a specific task. Initially, this was compensated for via inflation: just throw ever-faster CPUs at the problem. Quantitative Easing. The reason this at first seemed to work was Moore's law and miniaturization: by building the CPU with ever-better miniaturization, the amount of computing power basically doubled every two years. Thus, the lack of scalability and the inefficiency of the CPU got compensated by getting double the resources out of nowhere every two years.

However, beginning around 2006, this scheme - and with it "Moore's law" - began to fall apart, for two reasons. One: miniaturization quickly approached physical limits. Now, in 2012, circuit-board based CPUs are close to the highest miniaturization that is physically possible. (This is NOT merely a matter of going into the "nanoscale"... reality versus propaganda: CPUs ARE already nanoscale right now.... the brick wall you're trying to break is that of individual ATOMS. And this is impossible to do with circuit-board style chips... to break it, you'd have to abandon the circuit-board design of chips - which is not merely a matter of some incremental technological progress - it would be a bottom-up revolution in how to design chips, with all past chipmaking research becoming irrelevant. So, it is not something that merely happens overnight. In case anyone is interested in why the atomic level is a physical brick wall, I'll provide the explanation.)

The second problem is transfer latency... and discrepancy. CPUs get data as input, and then do processing on this data. However, memory - the thing that transfers data to a CPU - is by now dozens of times slower than the CPU. Basically, while CPUs became ever faster at processing the data inside them, communicating with other chips (e.g. RAM) did not speed up at the same rate. This led to a weird situation where CPUs in theory can process data incredibly fast, but unfortunately no one can transfer data to them that fast.

That the CPU in the current architecture has to control everything connected to a computer (instead of the parts being able to manage themselves) escalates this problem further: at this point, the speed at which the CPU can manage other components of the computer no longer increases significantly.... because when a CPU wants to manage, say, disk access, it has to communicate with an HDD controller at the other end of the circuit board.

In numbers, it's like "I can do 2 billion things per second, but communicating with anyone other than myself happens at 0.1 billion things per second".

How was this addressed? Well, by AGAIN making components increasingly self-autonomous.... we got GPUs, sound chips that can manage themselves, increasingly beefed-up network chips that manage the network themselves, and so on. Oh, and CPUs increasingly get more physical registers and cache, so that they can actually do their work without depending on other components (i.e. memory).

Basically, in terms of the overall architectural model, we're in the process of going back to 1990. The "Central Processing Unit"-based design is slowly being obsoleted.

Oh regional Indian

Excellent overview, Rynak. Thanks.


undercover brother

I don't know how many of you ever bought the first-generation Apple TV, but that thing ran so hot it would damage furniture, overheat nearby components, and possibly scorch your hands. How about their laptops? Users of certain AAPL laptops who placed them on their laps during use suffered from "toasted skin syndrome". I have to hand it to AAPL's PR department for being able to turn a real public-safety negative into a non-issue.

i_fly_me

Just include a small gasoline engine with future models to keep you from being stranded with a "brick" far from home.  There, I fixed it.

WeekendAtBernankes

Naw, the real answer is you have to buy two, and use one to boost power for the other.  The iPad^2

UnderDeGun

Fly, u da man... err - fly. Keep buzz'n.

TWSceptic

It would be neat if the next gen ipad could partly get energy from new developments in solar cell technology.

tmosley

You would need a fold-out solar panel.

PV isn't meant to be portable.  No point in shoe-horning it into consumer products like that.  It might work for an e-ink reader that draws very little power, just not for relatively heavy applications like this.

If you really want to extend the battery life, you need to do some research into making metal-air batteries such that they are rechargeable for more than ten cycles.  Make that happen, and all our fuel problems go away basically forever.  Then we just have to focus on energy production.

Mercury

I'd like to see them make a low-tech iPad, with decent computing power maybe, but a black-and-white, lower-res screen that could run for days or weeks without a charge.

Start with one of those $10 calculators with the 1-inch solar strip that runs forever... and see how much better you can make it.

RafterManFMJ



Go to bed, old man!

GoinFawr

awesome idea. I'll hook it up to the ol' transceiver and the only thing that could take us offline is an electric hairdryer

(only half kidding here)

aerojet

Yeah, good luck.  Might as well have it come with its own compact, pedal-powered generator.

Rynak

I own a low-tech e-ink mobile, and the only thing I miss, besides the awesome visibility and unreal battery life, is more pixels (mine basically uses the style of old-school LED displays, which is a bit limiting; it can't even properly show uppercase and lowercase letters).

Bottom line: HELL YES! And some battery, please, that doesn't self-destruct in a matter of 2 years and doesn't spontaneously explode. I don't need more power.... I need more quality and endurance out of the power I already have in my gadgets!

BreadnH2O

Shoe-horning at this point is correct, but... I think you're on the right track about energy production. You'll also note, again with these carbon nanotubes: delta heat!

exi1ed0ne

30 seconds of Googling. OK, 10 seconds of Googling for this and the rest blowing digital raspberries at the NSA.

francis_sawyer

The NEWTON comes with a full solar panel array...

UnderDeGun

:whistle blows: :red flag flies into the air: The referee switches on his microphone and says, "Unnecessary use of bullshit. TW. 15-yard penalty and loss of down."

johnQpublic

molten salt thorium ipad up next

Sudden Debt

Use more silver! use more silver!!!

Cdad

Just imagine how much higher the stock would be today if the new iPad exploded when turned on.

SheepDog-One

It would be freakin AWESOME! NSA would step right in and make it required battlefield loadout gear for every soldier.

Dermasolarapaterraphatrima

Better buy at the all-time high before it drops, right?