Hugo Gutiérrez (Google translation, via Adrian Tineo):
The review of private recordings is conducted through a company that is a subcontractor of Apple, just as Google does, as EL PAÍS previously reported. These reviewers are responsible for analyzing private recordings and requests made to Apple's virtual assistant. […]
In the case of Apple's transcribers, working conditions were much better than for the employees who did this work for Google, even though the work is almost identical. The reviewers who were contacted confirm that they were not paid per audio reviewed but received a monthly salary. "You could choose the number of hours worked. In my case, I worked part-time, 30 hours a week, earning $1,100 gross per month." They did, however, have a quota of audio to review: around 150 files per hour. That works out to about 4,500 recordings a week. […]
Where there was strict control was over the number of recordings reviewed; failing to meet the quota was grounds for termination. "They modified it several times in the months I worked for this company. In fact, in my last few weeks there, the target was virtually impossible to meet, and they knew it," says a former employee.
It was previously reported that Apple had people reviewing Siri audio data, but it was not known that they were contractors.
Alex Hern (MacRumors):
Apple told the Guardian: "A small portion of Siri requests are analyzed to improve Siri and dictation. User requests are not associated with the user's Apple ID. Siri responses are analyzed in secure facilities, and all reviewers are required to comply with Apple's strict confidentiality requirements." […]
The whistleblower said: "There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters, and so on. These recordings are accompanied by user data showing location, contact details, and app data." […]
"There is not much vetting of who works there, and the amount of data we are free to look through seems quite broad. It would not be difficult to identify the person you are listening to, especially with accidental triggers – addresses, names, and so on.
"Apple subcontracts this out, and there's high turnover. It's not as though people are being encouraged to have consideration for privacy, or even consider it. If there were someone with malicious intentions, it wouldn't be hard to identify [people on the recordings]."
"search query" includes a recording of the search … and it says it shares some data with third parties, but nowhere does it plainly state that questions you ask your phone can be recorded and shared with a stranger. There is also no way for users to opt out of this practice.
Jason Snell (tweet):
It doesn't matter to me whether this is Amazon or Apple. I don't want people listening to the audio these devices record. In fact, I don't want recordings made of my audio, period – I want the audio to be processed and discarded immediately.
Apple constantly boasts about taking user privacy seriously. There is one correct response to this report, and it is to change the policies and communicate them clearly. Saying that the eavesdropping happens in a secure facility without an attached Apple ID is, to put it mildly, not good enough.
Steve Jobs: "Privacy means people know what they're signing up for, in plain English, and repeatedly … Let them know exactly what you're going to do with their data."
How many Siri users know that contractors listen in when they trigger it, intentionally or not?
Still, there should surely be a way to opt out entirely and not have any of your Siri requests selected for review. It is absurd that there seems to be no way to do this – turning off Siri completely is not a solution – though I have reached out to confirm whether disabling the analytics-sharing options in Settings would opt users out. As with Google, I have to ask why users are not even asked whether a human may review their audio recordings.