Apple Says All iPhones, Macs Exposed To "Meltdown", "Spectre" Flaws

All Mac systems and iOS devices are exposed to the recently disclosed chip flaws known as Spectre and Meltdown, Apple confirmed on Thursday. The flaws, which, as we discussed earlier this week, give attackers unauthorized access to a computer’s memory and sensitive data, were disclosed by security researchers at Google Project Zero on Wednesday. The vulnerabilities affect almost all modern CPUs, including those produced by Intel, AMD and ARM Holdings.

“All Mac systems and iOS devices are affected,” Apple acknowledged in a statement on Thursday, adding that no cases had yet been reported of customers being affected by the security flaws.

To address these vulnerabilities, Apple users may have noticed a suspiciously timed software update released earlier this week for their iPads, MacBooks and iPhones – an update that appeared to precede news of the latest controversy involving microprocessor makers. Intel, one of the world’s largest chipmakers, admitted that its chips contain a flaw that makes it easier for hackers to hoover up sensitive information like the owner’s passwords. It was later revealed that the flaw wasn’t exclusive to Intel’s chips: it reportedly affects nearly all microprocessors in circulation, according to the New York Times.

Here’s a succinct explanation of the problems that we published earlier this week:

4. We're dealing with two serious threats. The first, dubbed Meltdown, is isolated to #IntelChips and affects virtually all Intel microprocessors. The patch, called KAISER, can slow processor performance by as much as 30 percent.

5. The second issue is a fundamental flaw in processor design, dubbed Spectre, which is more difficult to exploit but affects virtually ALL PROCESSORS ON THE MARKET (note: Intel stock went down today, but Spectre affects AMD and ARM too), and has NO FIX.
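To see how a Spectre-style "bounds check bypass" actually leaks data, here is a toy Python simulation of the mechanism. It is not a real exploit (real attacks run native code and time real CPU cache lines); the victim/attacker functions and the set-based cache model here are invented purely for illustration:

```python
# Toy model of Spectre variant 1 (bounds check bypass).
# A real attack mistrains the CPU's branch predictor so that an
# out-of-bounds read is executed speculatively; the read's result is
# rolled back, but the cache line it touched stays "hot", and the
# attacker recovers the secret by timing probe-array accesses.

SECRET = b"K"                     # byte the attacker should never see
public_data = b"ABCD"             # in-bounds data the victim may serve
memory = public_data + SECRET     # the secret sits just past the array

cache = set()                     # which probe "lines" are currently hot

def victim(index):
    """Bounds-checked read, but we model the CPU speculating past
    the check and touching a probe line that depends on the value."""
    if index < len(public_data):
        value = memory[index]
        cache.add(value)          # architecturally visible path
        return value
    else:
        # Speculative window: the predictor guessed "in bounds", so
        # the load and the dependent probe access still happen...
        value = memory[index]
        cache.add(value)          # ...and leave a cache footprint
        return None               # ...before the result is squashed

def attacker_recover(oob_index):
    cache.clear()
    victim(oob_index)             # returns None, yet the cache is hot
    # Flush+Reload phase: probe which "line" became hot.
    for byte in range(256):
        if byte in cache:
            return bytes([byte])
    return None

leaked = attacker_recover(len(public_data))  # one past the end
print(leaked)  # recovers SECRET, which victim() never returned
```

The point of the sketch: the secret never travels through an architectural return value; it escapes only through the side effect left in the cache.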

Users may have been wary after reading last month about Apple admitting what many loyal customers had long suspected: that the company intentionally engineers software updates to slow down older products, hastening the cycle of planned obsolescence that has helped establish Apple as the world’s most valuable company.

But as it turns out, the software update was designed to plug some of the security holes opened by the Meltdown flaw.

Specifically, Apple issued updates in iOS 11.2, macOS 10.13.2 and tvOS 11.2 to protect against Meltdown, which the company believes “has the most potential to be exploited.”

According to Bloomberg, despite concern that fixes may slow down devices, Apple said its updates to address the Meltdown issue haven’t dented performance. The company will release an update to its Safari web browser in the coming days to defend against the Spectre flaw described above.


As noted, while Macs and iOS devices are vulnerable to Spectre attacks through code running in web browsers, Apple said it would issue a patch to its Safari web browser for those devices "in the coming days." Apple said these steps could slow the browser, but by less than 2.5%.
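A note on how browser makers blunted Spectre ahead of full fixes: attacks of this class need a high-resolution clock to tell a fast (cached) memory access from a slow (uncached) one, so vendors coarsened JavaScript timers and disabled features like SharedArrayBuffer. A toy Python sketch of why that helps; all the timing numbers below are invented for illustration:

```python
# Toy illustration of why browsers reduced timer precision after
# Spectre. Cache side channels need a fine-grained clock to separate
# a cached access from an uncached one; coarsening the clock erases
# the measurable gap between the two.

CACHED_NS = 40        # pretend a cached read takes ~40 ns
UNCACHED_NS = 300     # and an uncached read ~300 ns

def measure(duration_ns, resolution_ns):
    """Round a duration down to the timer's resolution, as a
    browser's coarsened high-resolution timer effectively does."""
    return (duration_ns // resolution_ns) * resolution_ns

# With a 1 ns timer the attacker easily separates the two cases:
fine_cached = measure(CACHED_NS, 1)
fine_uncached = measure(UNCACHED_NS, 1)
print(fine_cached, fine_uncached)        # 40 vs 300: distinguishable

# With a 1000 ns (1 microsecond) timer both reads look identical:
coarse_cached = measure(CACHED_NS, 1000)
coarse_uncached = measure(UNCACHED_NS, 1000)
print(coarse_cached, coarse_uncached)    # 0 vs 0: the signal is gone
```

This doesn't remove the underlying flaw; it just starves the attacker of the measurement precision the exploit depends on.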

The updates apply to all iPads, iPhones, iPod touches, Mac desktops and laptops, and the Apple TV set-top box. The Apple Watch, which runs a derivative of the iPhone’s operating system, is not affected, according to the company.

Browser makers Google, Microsoft and Mozilla all told Reuters that the patches they currently have in place do not protect iOS users. With Safari and virtually all other popular browsers not yet patched, hundreds of millions of iPhone and iPad users may have no secure means of browsing the web until Apple issues its patch.

Still, some customers were angry at Apple PR’s reticence on the issue following the revelations about the chip flaws earlier this week.

Ben Johnson, co-founder and chief strategist for cyber security firm Carbon Black, said the delay in updating customers about whether Apple’s devices are at risk could affect Apple’s drive to get more business customers to adopt its hardware.

“Something this severe gets the attention of all the employees and executives at a company, and when they go asking the IT and security people about it and security doesn’t have an answer for iPhones and iPads, it just doesn’t give a whole lot of confidence,” Johnson said.

Finally, Apple stressed that there were no known instances of hackers taking advantage of the flaws to date. For Apple's sake, this had better remain the case, or else sellside analysts may just have to lower their iPhone sales forecasts for the foreseeable future.

Comments

Joe Davola BlindMonkey Fri, 01/05/2018 - 09:18

Depends on who designed and built the ASIC and what IP they licensed/lifted to power the calculations.

Part of me wonders if they are using truly purpose-built ASICs, or a repurposed ASIC with some of the other functionality turned off. Kind of like buying a TV now that doesn't support 3D, even though the chips still have the 3D circuitry.

In reply to by BlindMonkey

Tarkus Joe Davola Fri, 01/05/2018 - 09:42

Worth mentioning there are coins that are ASIC resistant:

* Ethereum - The idea is that ETH is a decentralized app platform, and because each app can vary wildly in its computations, one cannot purposefully design and build a chip that would be more efficient for all the possible calculations (that would simply be a better processor overall). The implication is that all coins that act as a DApp platform have this resistance.

* Monero - The algorithm would make it very expensive at the current technological level to design and build an ASIC. Apparently they also want to change the algorithm to make it even more resistant. (AFAIK the change has not yet been implemented)

Ethereum's approach seems more organic.

Also, I would argue that by their very definition ASICs (Application-Specific Integrated Circuits) should be custom purpose-built chips, and that if a current commercial CPU were the base for an ASIC, then the commercial chip itself should have results comparable to its ASIC 'spawn'.

In reply to by Joe Davola

J S Bach Tarkus Fri, 01/05/2018 - 09:47

Darn.  Now I guess we'll all have to buy yet another new computer WITHOUT the Intel chips this time.  Those poor computer companies... I hope they can handle the demand.

It's kinda like the way the music industry has gone from LPs, to 8-tracks, to cassettes, to CDs, to mp3s... etc, etc...

With video -- videotapes, DVDs, Blu-ray, Netflix, etc, etc... making us REBUY all of our media four times over.

Suckers all are we.

In reply to by Tarkus

macholatte J S Bach Fri, 01/05/2018 - 10:24

Perhaps the takeaway from your discussion of ASIC vs Intel is that paranoia (conspiracy theory) has turned into smart thinking (conspiracy fact) to the point where it’s impossible to believe anything we are told, and we MUST, as a precaution, consider that these kinds of discovered "flaws" or accidents just might be purposeful, to enhance .gov spying and mind control of the Sheeple.

The private sector (Google, Apple, MSFT) has already demonstrated its extreme desire to spy on everyone to “enhance your experience” so it can sell advertising (the excuse). Logically, not all of these “discovered flaws” are motivated or created by .gov. But .gov is certainly not stopping the private sector from creating this shit.

And this is what “they” say:
We didn’t want to but we had to do it.  It’s for your own good.

In reply to by J S Bach

Tarkus macholatte Fri, 01/05/2018 - 11:30

I view these bugs and the constant erosion of privacy separately.

But first: a bug, by its very definition, is not intentional.

One has to remember that software development is done under real-world constraints. There's a pithy saying in software development: fast, good, or cheap; pick two. I would wager that in the real world quality ("good") is usually the first to be sacrificed, mostly because it is very hard to quantify and then chart the quality of a piece of code in a way that's easily grasped by a non-techie (for many aspects there isn't even a consensus on what constitutes quality, as if programming were more a form of art). By contrast, time and money are easily tracked and easy to understand.

Taking that into consideration, along with the nature of the creators of software (we're flawed, my friend; we are chaos compared to the machine's perfect order), it's not hard to see why bugs exist. It's extremely improbable to produce code without any bugs at all levels of abstraction. If you plotted it, the curve would be asymptotic: the amount of time needed to produce code with zero bugs tends to infinity.

There are also many flaws left in code intentionally. For example: incomplete validation of inputs, because the cost of implementing all the validations would exceed the time allocated for the feature, and because if the input is bad the code will error out at some point anyway, and there are other safeguards in place to make sure the saved data cannot be corrupted. Heh... if only one could say for certain that there are no other exploitable actions in between the validation and the place where the error would be thrown.

I would say the great majority of flaws are because of incompetence (Hanlon's razor).

If you study Vault 7 you'd notice that all (AFAIK) of the tools presented are exploits of bugs. It is very hard to prove that a backdoor has been left intentionally to be used in extremely convoluted, tedious and ass-backwards ways. If you look at some of the interactions between the coders, you'd notice that they are actually researching these vulnerabilities: probing, prodding, testing, interpreting, reverse engineering, etc. I would presume (call me foolish if you will) that if these exploits were done in collusion with the NSA, there would be no need for engineers to actually research them; they would simply implement and test according to some design document. Perhaps the explanation seems contrived or naive, but at the very least they would be acting on an outside piece of information (i.e. something not discovered by them in-house, presuming the exploit is not already in the public domain). There is also the argument that the state can coerce anyone to give it what it wants without having to spend millions on research, but that's another can of worms.

Erosion of privacy is a mechanism in which all sides are complicit (the user agrees to the EULA, privacy policy, rules, etc.; and it's presented as a choice, perversely). That is something that worries me more. Our slaughter is done with our consent.

As Voltaire put it: "Those who can make you believe absurdities can make you commit atrocities."

It is absurd to believe that a company would not hand over 'personal data' to the state, that it would not use that personal data for profit-seeking, that it would not attempt to obfuscate its usage of personal data, that it would not attempt to 'misuse' (whatever the fuck that even means these days) personal data, that current 'privacy oriented mechanics' (i.e. encryption) are unbreakable etc.

In short:

The state is more likely to prey on greed and hubris than to force the implementation of back-doors. (Improbable, not impossible.)

Yes, to presume everything is compromised would give you a 95% chance of being right.

Yes, to presume your personal data, once entered on the internet, is public would give you a 95% chance of being right. (If not now, then in the future.)

No, to live in total paranoia is not healthy, good or justifiable.

The logical conclusion is to leave on the table what you are willing to lose. (how much privacy are you willing to sacrifice for the sake of comfort)

In reply to by macholatte

Winston Churchill Joe Davola Fri, 01/05/2018 - 11:16

Limited hangout lies to cover the truth.

Every processor since the early 90's has a second core, the "Clipper" chip designed by the NSA, who paid the manufacturers to integrate it into the design. It spies on everything you do, no matter what OS you use. How safe are your crypto keys? Not at all, not at all. You may as well post them on here. As an NSA front company also writes all the BIOSes, the NSA has everything when you connect to the web.

TOTAL INFORMATION AWARENESS, they meant it.

In reply to by Joe Davola

Joe Davola BennyBoy Fri, 01/05/2018 - 09:44

This is a big issue - chips are so complex that no one knows the functionality of every part of them anymore. Buying/licensing IP from one firm, farming out the manufacture to another firm - there is no guarantee there isn't vulnerability by design in the various functional blocks cobbled together. Designers have recognized this and attempt to mitigate it at the higher levels (the kernel memory isolation and address randomization employed to work around Meltdown), but peruse some of the articles on Google's Project Zero blog to realize how vulnerable all hardware/software is.
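The Meltdown workaround mentioned above is usually described as KAISER/KPTI: unmapping kernel memory from user-space page tables so a speculative read has nothing to translate. Here is a toy Python sketch of that idea only; the addresses and function names are invented, and this models the concept rather than any real page-table code:

```python
# Toy model of the KAISER/KPTI idea. Meltdown leaks kernel memory
# that is *mapped* (though privileged) in a user process's page
# tables. The mitigation removes the mapping entirely, so even a
# speculative access faults before leaving a usable cache trace.

KERNEL_SECRET_ADDR = 0xFFFF0000   # invented kernel address

def build_page_tables(kpti_enabled):
    tables = {0x1000: "user data"}          # user pages always mapped
    if not kpti_enabled:
        # Pre-KPTI: kernel pages stayed mapped (for syscall speed)
        tables[KERNEL_SECRET_ADDR] = "kernel secret"
    return tables

def speculative_read(tables, addr):
    """Meltdown races the privilege check: if a translation exists
    at all, the speculative load can leak the value via the cache."""
    if addr in tables:
        return tables[addr]   # leaked through the side channel
    return None               # no mapping: hard fault, nothing leaks

before = speculative_read(build_page_tables(kpti_enabled=False),
                          KERNEL_SECRET_ADDR)
after = speculative_read(build_page_tables(kpti_enabled=True),
                         KERNEL_SECRET_ADDR)
print(before, after)  # the secret leaks pre-KPTI, not post-KPTI
```

The cost of this design choice is the syscall overhead the article mentions: with the kernel unmapped, every user/kernel transition has to switch page tables.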

In reply to by BennyBoy

Tarkus Joe Davola Fri, 01/05/2018 - 10:09

Add to this a large number of different programming languages, at varying (and also numerous) levels of abstraction, plus code reuse, to get an idea of the complexity.

Add the large number of people simultaneously 'coding (imperfections)' at these different levels of abstraction, for different platforms and in different languages, to get an idea of the amount of effort.

And also remember that encryption algorithms themselves are not invincible.

This whole thing reminds me of Sisyphus.

In reply to by Joe Davola

shizzledizzle shitshitshit Fri, 01/05/2018 - 07:53

I'd wager that the Stuxnet worm utilized vulnerabilities like these. I remember when it first showed up in the wild, wondering how in the hell the thing could compromise so many different types of hardware so readily. The next question would be who knew about it, and how. Based on the fact that it was targeting Iran, you can bet the US and Israel had their hands in it.

In reply to by shitshitshit

Stan522 Fri, 01/05/2018 - 07:33

Face it.... every phone can and will be hacked, by both the private and public sectors..... They all want access to your money and information....

Brazen Heist Fri, 01/05/2018 - 07:45

I never store anything on my phone that I would mind losing, or would be embarrassed about. It's a good idea to compartmentalize technology: use some gadgets only for certain activities.