Last week, we reported a shocking revelation about Apple's friendly personal-assistant-turned-mass-surveillance tool, Siri: The app "regularly" records people having sex, discussing private medical information, participating in drug deals and "countless" other invasive moments, which it promptly sends to Apple contractors for their listening pleasure - all for the sake of "quality control".
Now, in the face of the latest privacy scandal to rock one of the major American tech giants, Apple has reportedly decided to suspend its global internal program for “grading” users' Siri commands following the public backlash over the Guardian's revelations, which came courtesy of a whistleblower.
Previously, Apple contractors listened to less than 1% of Siri commands as part of the program, which was intended to improve the quality and accuracy of the voice-based digital assistant. Yet, with hundreds of millions of users worldwide, the Cupertino-based consumer tech giant was effectively monitoring a huge swath of its user base without their explicit knowledge.
This isn't the first time suspicions have surfaced about personal assistants inadvertently surveilling their users. Last spring, Amazon acknowledged that hackers could turn its Echo speakers into wiretaps. And who could forget that creepy laughter?
Now, Apple says a forthcoming software update will allow users to voluntarily opt in to the Siri grading program that had exposed their private recordings to teams of anonymous contractors, Bloomberg reports.
"We are committed to delivering a great Siri experience while protecting user privacy," Apple said in a statement. "While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading."
Apple's decision closely follows a German regulator's move to temporarily stop Google employees and contractors from transcribing home-assistant voice recordings in the EU after whistleblowers revealed that some of those recordings included sensitive information. A Hamburg agency said Aug. 1 that Google had agreed to halt the program for three months while the agency investigates whether the practice complies with the EU's new GDPR data-privacy rules.
Like Google and Amazon, Apple employs contractors who gauge whether the digital assistant's interpretation of users' requests makes sense based on what was said. The recordings under review are purportedly stripped of personally identifiable information and stored for six months as part of the "grading" program. While Apple described the program in a security white paper, it never disclosed it within iOS itself.
Of course, even when presented with the option to opt in, we doubt most users will realize exactly what they're agreeing to, and will instead mindlessly grant Apple access to their most intimate moments, to the delight of the contractors who spend all day auditing their recordings. How long until one of them breaks Apple's "strict" confidentiality requirements and decides that the potential profits from blackmailing users outweigh the lousy pay that comes with being a non-salaried contractor?