
Apple responds to outcry over Siri grading



When you make privacy a selling point, you had better make sure your own house is in order. Apple still takes user data privacy more seriously than its competitors, but that doesn't mean the company is beyond reproach. The world saw a prime example of this a few days ago when we learned that third-party contractors were listening to recorded Siri interactions for grading purposes.

It's all about timing and presentation, I guess. News that outside contractors were grading Siri had actually come out before the more recent Guardian report. That article, however, featured a whistleblower who claimed the recordings being reviewed could reveal users' identities, and that some of them captured very private moments and even illegal activity.

The responses were pretty predictable all around. The tech press jumped on it hard. Some enthusiasts enjoyed a little schadenfreude at Apple's expense, while others just shrugged it off as no big deal. Plenty of regular people who heard about it reacted, too. Apple's first response was a generic statement, essentially claiming that what it was doing was nothing out of the ordinary.

A small portion of the Siri requests are analyzed to improve Siri and dictation. User requests are not associated with the user's Apple ID. The Siri responses are analyzed in secure facilities, and all reviewers are required to comply with Apple's strict confidentiality requirements.

That was clearly not enough. After the story didn't go away, Apple wisely pulled the plug on Siri grading, at least for now. In the near future, the company will offer either an opt-in or an opt-out for human grading of Siri. Here is Apple's exact statement to TechCrunch:

"We are committed to delivering a great Siri experience while protecting users' privacy," Apple said in a statement to TechCrunch. "While doing a thorough review, we are suspending Siri rating globally. In addition, as part of a future software update, users will have the opportunity to choose to participate in rating."

Honestly, little of this surprises me. I've always assumed that Apple used human analysis as part of Siri's development and improvement, and every player in the voice assistant space uses some form of human analysis and grading because it is still necessary. Other than in how they handle it, why should Apple be any different? The only part I had wrong was thinking that Siri grading was covered by Apple's broader Analytics opt-in that comes up during device setup. It turns out there was no way to opt out of this other than turning Siri off entirely.

This was Apple's real misstep. Given Apple's privacy stance, this should have been spelled out during device setup with an opt-in, just like the other choices offered there. The grading itself, however, is a necessity. It can certainly be argued that Apple should handle this grading internally, because relying on third parties does not inspire user confidence. The NDAs Apple pointed to in its first statement didn't stop the Guardian's whistleblower. Who else plays fast and loose with what they hear? How secure are these "secure facilities" they were talking about?

Despite what anyone may tell you, all voice assistants still need a certain amount of human review to keep improving at any real pace. Siri needs it even more, as Apple does not collect the same open streams of user data that Google and Amazon rely on, nor does it have the same amount of data available. And let's be honest here: Siri is already far behind the competition thanks to years of mismanagement. Apple seems to have the right team in place to lead Siri forward now, but it still needs human grading to take full advantage of the data it does have.

Apple already anonymizes the Siri recordings being analyzed. Now it will take the extra step of letting users either opt in to or out of Siri grading, which is certainly the right move. Based on how the company has handled similar situations, opt-in is probably the way it will go. This will certainly set Apple back somewhat, as it will have less data to work with for analysis and improvement. Such is the price when you make privacy a core feature.

Hopefully, Apple won't stop here. As I said, I definitely think it should take this operation in-house so it can monitor the grading process closely. I also think it would be wise for Apple to prioritize detecting false "Hey Siri" triggers through software improvements. According to the Guardian report, most of the private moments and information the contractors heard were recorded after false triggers. If Apple can limit the grading process to legitimate Siri interactions, users will feel better about letting the company use their data to improve the service.

Dieter Bohn of The Verge also made some really interesting points about this Siri situation today. He praised Apple for its stance and focus on privacy, but argued that it has also left the company with some blind spots. The current situation seems to bear that out. His biggest complaint was that Apple, even with a Siri opt-in or opt-out, still does not offer and has not committed to adding a way to wipe your data from its servers. Right now, once your Siri recordings are there, they are there for the long haul. He made a very good point that Google, Amazon, and Facebook have had to deal with this and come up with policies and procedures because they store so much user data. Apple's lack of experience in dealing with such situations shows here, and the company needs to address it by giving users more control over the data it holds on them.

As for me, when Apple offers an opt-in or opt-out for this later this year, I will continue to allow it to use my data to improve Siri. I know Apple isn't perfect. For all its talk about privacy, it still makes mistakes, and it certainly did in this case. However, Apple still does the best job in the industry of prioritizing user privacy. Any of my data that may have been graded by someone was already anonymized. With a little extra control over the process, I still have no problem letting Apple use my data to help Siri. I still use it enough that I want to see it improve and expand. Now that the service has better management, I actually have some confidence that it can get there.

What about you? What do you plan to do when this new Siri opt-in or opt-out setting is available?



