"This Is About Control, Not Children": Eric Weinstein Calls Out Apple's Virtuous Pedo-Hunter Act

by Tyler Durden
Sunday, Aug 08, 2021 - 02:15 PM

Last week, Apple announced that it would begin analyzing images on its devices before they're uploaded to the cloud in order to identify child pornography and report it to the authorities, sending privacy advocates through the roof.

Apple defended the decision - claiming there's a '1 in 1 trillion chance of false positives.'

"Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC (National Center for Missing and Exploited Children) and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users' devices," the company said in an announcement.

Privacy advocates have pointed out the obvious slippery slope of allowing big tech to infiltrate our personal lives under the guise of fighting [evil thing] - and the next thing you know, Apple is hunting dissidents for human rights abusers, reporting who owns what guns, or flagging people whose 'suspicious' routes deviate from their normal pattern. As the Electronic Frontier Foundation notes:

We’ve said it before, and we’ll say it again now: it’s impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children. As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger’s encryption itself and open the door to broader abuses.

All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children’s, but anyone’s accounts. That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change. -EFF


They're hypocrites anyway

Hitting it on the nose, as usual, is Eric Weinstein - who calls out Apple's Tim Cook for scanning our *private* photos for child porn, while utterly ignoring Jeffrey Epstein's high-level child-sex trafficking and ties to intelligence.

Also critical of Apple is WhatsApp head Will Cathcart - who says he's "concerned" about Apple's announcement, calling it "a setback for people's privacy all over the world."

More via Threadreader App

Child sexual abuse material and the abusers who traffic in it are repugnant, and everyone wants to see those abusers caught.
We've worked hard to ban and report people who traffic in it based on appropriate measures, like making it easy for people to report when it's shared. We reported more than 400,000 cases to NCMEC last year from @WhatsApp, all without breaking encryption.
Apple has long needed to do more to fight CSAM, but the approach they are taking introduces something very concerning into the world.
Instead of focusing on making it easy for people to report content that's shared with them, Apple has built software that can scan all the private photos on your phone -- even photos you haven't shared with anyone. That's not privacy.
We’ve had personal computers for decades and there has never been a mandate to scan the private content of all desktops, laptops or phones globally for unlawful content. It’s not how technology built in free countries works.
This is an Apple built and operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control. Countries where iPhones are sold will have different definitions on what is acceptable.
Will this system be used in China? What content will they consider illegal there and how will we ever know? How will they manage requests from governments all around the world to add other types of content to the list for scanning?
Can this scanning software running on your phone be error proof? Researchers have not been allowed to find out. Why not? How will we know how often mistakes are violating people’s privacy?
What will happen when spyware companies find a way to exploit this software? Recent reporting showed the cost of vulnerabilities in iOS software as is. What happens if someone figures out how to exploit this new system?
There are so many problems with this approach, and it’s troubling to see them act without engaging experts that have long documented their technical and broader concerns with this.
Apple once said “We believe it would be in the best interest of everyone to step back and consider the implications …”
…”it would be wrong for the government to force us to build a backdoor into our products. And ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect.” Those words were wise then, and worth heeding here now.

More notable hot takes: