Last night, Apple joined Google in suspending the program that had human graders listen to users' voice commands recorded by its voice assistant. Apple did not say whether the recordings themselves have actually stopped, or whether they are still happening at all. I asked and did not get a clear answer.
But this problem of Siri recordings being stored on Apple's servers points to a larger issue: Apple gives you remarkably little control over that data.
Amazon, Google, and even Facebook each have a dedicated website where you can go through the privacy settings for their assistants, delete data, and generally see what each company knows about you. Here they are, with the full URLs printed out (you should avoid blindly clicking on any link that claims to take you directly to your account settings):
We have written guides with more detailed instructions for deleting your data from both Google Assistant and Amazon Alexa.
Apple does not offer a privacy portal for your Siri data, nor any dedicated settings screen for it in an app. Its general privacy page is a broad set of very clear explanations of Apple's policies, but it contains no specific information about your data and no checkboxes to delete any of it. The only thing you can do from Apple's website is download or delete all of your data.
In part, this is a result of Apple's relatively unique, device-focused infrastructure. It is harder for Apple to build an online privacy portal when it focuses so heavily on storing data on the devices themselves.
Still, Amazon and Google make it relatively easy to delete your voice data from their servers. Google also lets you turn off voice logging for its Assistant via the links above, although doing so breaks some features.
The day after this story was originally published, Amazon decided to let users opt out of human review of their voice logs, but it did not (and still does not) let you turn off the saving of your recordings by default. In short: you can delete them as often as you like, but there is no setting that prevents them from being uploaded in the first place.
Apple also does not offer a way to use Siri without your voice being stored on its servers. Apple emphasizes that your recorded utterances are not associated with your Apple account, but that is cold comfort if what really worries you is a human contractor potentially hearing private things your HomePod accidentally picked up in your house.
It gets worse: while you can delete these utterances from Apple's servers, the process for doing so is so unintuitive that the only way you could possibly learn how is to Google it and find an article like this one.
It is possible that the future update promised yesterday will let you use Siri without having your voice stored on Apple's servers. But read Apple's statement carefully and you will see that the opt-out is for "grading," not necessarily recording: "In addition, as part of a future software update, users will have the option to opt in to grading."
Apple's latest iOS security guide is a master class in explaining, to both consumers and security experts, how its operating system keeps data private. But check out page 69, where Apple spells out the data retention policy for your voice recordings:
User voice recordings are saved for a six-month period so that the recognition system can use them to better understand the user's voice. After six months, another copy is saved, without its identifier, for use by Apple in improving and developing Siri for up to two years. A small subset of recordings, transcripts, and associated data without identifiers may continue to be used by Apple for the ongoing improvement and quality assurance of Siri beyond two years. Additionally, some recordings that reference music, sports teams and players, and businesses or points of interest are similarly saved for purposes of improving Siri.
"What happens on your iPhone stays on your iPhone" has apparently become "What happens on Siri stays on Apple's servers, potentially forever."
How to delete your voice recordings from Apple's servers
Here is how to delete the recorded voice data on an iPhone – but you will have to repeat a similar process on every Apple device you own. What you are doing is deleting all the information Apple gets from Siri, including the recordings of your voice. But the way to do it is not in the Privacy section of your settings. Do this instead:
- Go to "Settings" > "Siri & Search"
- Turn off all the ways to activate Siri. There are two: "Listen for 'Hey Siri'" and "Press the side button for Siri."
- When you turn off the last way to activate Siri, Siri is effectively disabled. You will be warned, and there is still another step you need to take to delete your data from Apple's servers.
- Go to "Settings" > "General" > "Keyboard." Scroll down to "Enable Dictation" and turn it off. You will get a warning that if you ever want to use dictation again, some of your data will have to be re-uploaded.
You might be wondering why I'm making such a big deal out of Apple having a problem here, but think about how weird the steps above really are. They are dark patterns on several levels.
And even after you go through all of this, here's the kicker: you have to repeat every step on every Apple device you own to delete Siri's data, and turning Siri or dictation back on means the logging starts all over again.
Recently, 9to5Mac pointed to a downloadable iOS profile, created by a security researcher, that will stop the server-side logging. It looks relatively innocuous to my untrained eye, but it is never a good idea to just install profiles from the internet, so I recommend against it.
Enterprise users and schools do have the ability to build and install such a profile themselves, using Apple's configuration utility, which apparently disables Siri's server-side logging. That tool is designed to help administrators manage fleets of Apple devices, and it is technically against Apple's terms of service for consumers to use it on their own phones. It is also easy to click the wrong box in that tool and mess up your phone, so again, I recommend against it.
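For context, an iOS configuration profile is just an XML property list (a `.mobileconfig` file). Below is a minimal, hypothetical sketch of what one looks like; it uses Apple's documented `allowAssistant` restriction key, which simply disables Siri outright. This is not the researcher's profile (whose server-logging key is not publicly documented), and the names and identifiers here are placeholders.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>PayloadContent</key>
    <array>
        <dict>
            <!-- A restrictions payload; allowAssistant = false turns Siri off entirely -->
            <key>PayloadType</key>
            <string>com.apple.applicationaccess</string>
            <key>PayloadIdentifier</key>
            <string>example.restrictions.disable-siri</string>
            <key>PayloadUUID</key>
            <string>00000000-0000-0000-0000-000000000001</string>
            <key>PayloadVersion</key>
            <integer>1</integer>
            <key>allowAssistant</key>
            <false/>
        </dict>
    </array>
    <!-- Top-level wrapper describing the profile as a whole -->
    <key>PayloadDisplayName</key>
    <string>Disable Siri (example)</string>
    <key>PayloadIdentifier</key>
    <string>example.profile.disable-siri</string>
    <key>PayloadType</key>
    <string>Configuration</string>
    <key>PayloadUUID</key>
    <string>00000000-0000-0000-0000-000000000002</string>
    <key>PayloadVersion</key>
    <integer>1</integer>
</dict>
</plist>
```

Installing a file like this prompts the user with a consent screen, but a single wrong restriction key can render features of a phone unusable, which is exactly why I don't recommend consumers experiment with profiles.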
Unfortunately, the only way for ordinary users to prevent Apple from keeping their voice recordings is to turn off Siri and dictation and never use either again. Apple takes a strong stance on privacy, but this is definitely an area where its users deserve better.
This is absolutely awful. But here's the good news: Apple's data practices are much, much more private than Google's or Amazon's. It does not track you across the web for advertising purposes. It does not know where you have been or what you have purchased.
That's great! But it does not replace clear and obvious privacy settings for the data Apple does have about you. Because Apple certainly knows some things! It has an advertising business inside the App Store. It knows which Apple products you own. There is the iCloud loophole, where it can and does turn over your synced data to the authorities when required by law, just like everyone else.
Apple collects far less data about you than Facebook, Google, or Amazon does – but it's not nothing. And the surprising-but-not-really-surprising revelation that it stores recordings of your voice, just like Google and Amazon, is proof of that.
Apple has had fewer privacy scandals than everyone else in big tech (though there have still been some big ones). Apple is also trying to build technology that is private by design. But Apple has a blind spot when it comes to giving users control over the data it does collect, and it has built some poor user interfaces as a result. So the genuinely good news in all of this is that the complaint here is essentially about product design – something Apple apparently knows a thing or two about.
Privacy is not an all-or-nothing thing when it comes to technology. All of the other big tech giants have learned that the hard way and have had to radically improve the tools they offer users to manage their data.
Now it's Apple's turn.
Correction, 12:15 pm ET, August 2: The original version of this article stated that it was possible to set a default where Alexa will not store your voice recordings on Amazon's servers. That was wrong – you can only opt out of some human review of those recordings. Google is the only company of the three that lets you set Assistant, by default, to never store voice recordings on its servers. That section has been updated, and I apologize for the error.