Image by Apple
The Apple world was stunned last month when a whistleblower revealed that Apple contractors were listening to recorded Siri interactions, a process Apple calls “grading” and that is intended to improve Siri interactions (see “Apple Workers May Be Listening to Your Siri Conversations,” 29 July 2019). It’s a common industry practice—Amazon, Google, and Microsoft all do it to improve their respective voice assistants—but Apple users had assumed that Apple’s stance that privacy is a fundamental human right would preclude such clearly creepy behavior.
Apple quickly suspended the program and told TechCrunch that it would be making changes (see “Apple Suspends Siri’s ‘Response Grading’ Eavesdropping,” 2 August 2019). Now Apple has apologized formally, saying:
As a result of our review, we realize we haven’t been fully living up to our high ideals, and for that we apologize.
The company also said that it plans to resume the grading program once updated versions of its operating systems become available within the next few months, with the following changes:
- The resumed program will be opt-in only.
- Apple will no longer retain audio recordings from Siri but will continue to rely on computer-generated transcripts to improve Siri.
- Only Apple employees will be allowed to listen to Siri audio samples.
- Apple will “work to delete” any recordings of inadvertently triggered Siri interactions.
In a support note titled “Siri Privacy and Grading,” Apple also pointed out that less than 0.2% of Siri requests were reviewed under the grading program and that Siri uses a random identifier while processing your data, one that is never tied to an Apple ID. Apple said that Siri uses as little data as possible to answer queries. For instance, when you ask Siri to read your messages, the text of those messages is never passed to a server.
Apple is doing the right thing here, but it’s regrettable that it took a whistleblower to prompt this change. That fact alone damages Apple’s privacy-focused image. It’s also unfortunate that Apple isn’t taking Adam Engst’s suggestion to empower Siri users by letting them make their own corrections—see “Why Can’t Users Teach Siri about Its Mistakes?” (14 August 2019). And there’s a human cost to Apple’s privacy reforms: at least 300 contractors have lost their jobs as a result.