
Apple contractors often hear sensitive recordings accidentally captured by Siri: Report



Apple is in hot water after a new report described how Apple contractors often hear sensitive recordings accidentally captured by Siri.

According to a new report from The Guardian, the main issue is that Siri can be inadvertently triggered by audio cues, including everything from the sound of a zip to the word "Syria," as well as by gestures such as raising an Apple Watch at a certain angle. These unintentional activations are not only frequent; on the Apple Watch, they also capture roughly 30 seconds of audio, a fraction of which is shared with Apple's contractors for analysis, where they "can gather a good idea of what's happening."

"There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters, and so on," said the Guardian's source. "These recordings are accompanied by user data showing location, contact details, and app data."

While contractors are not specifically listening for private activities in the recordings, the whistleblower claims that Apple does not "do much vetting of who works there, and the amount of data we're free to look through seems quite broad … If there were anyone with unpleasant intentions, it wouldn't be difficult to identify [people on the recordings]."

The source speaking to The Guardian says they were motivated to come forward out of fear that such information was being misused.

"There's not much vetting of who works there, and the amount of data we're free to look through seems quite broad," the source said. "It wouldn't be difficult to identify the person you're listening to, especially with accidental triggers – addresses, names, and so on."

In response to the report, Apple says that the recordings captured by Siri and submitted for grading are "pseudonymized," "not associated with the user's Apple ID," and "analyzed in secure facilities" by reviewers working under "strict confidentiality requirements." Even so, it is a far cry from "what happens on your iPhone stays on your iPhone."

The latest report follows others describing how Amazon and Google follow similar procedures. Recently, Google was forced to apologize after more than 1,000 Google Assistant audio recordings were leaked by a contractor, many of which were clearly recorded in error. Amazon has also come under fire for failing to anonymize voice data captured by Alexa.