
Apple apologizes after Siri showed directions to police stations when asked about terrorists



Siri is a great way to get things done without having to touch your phone. Recently, however, Apple’s digital assistant drew serious criticism for the way it answered questions about terrorists.

As demonstrated in a number of Facebook posts, YouTube videos, and Reddit threads, when asked “Siri, where are the terrorists?” Siri offered a list of directions to nearby police stations.

The videos went viral, as videos do, and sparked a debate about whether the response was an example of anti-police sentiment or whether Siri was simply triggered by the keyword “terrorists”.

Apple has apologized for the incident and said in its defense: “Siri refers users to the police when they make requests that indicate emergencies.” Apparently, Siri interpreted the question as an emergency in which users wanted to “report terrorist activity to the police.” Apple says it has since resolved the error.

After George Floyd’s death sparked global protests over police brutality earlier this year, companies such as Apple and Google rushed to adjust how their voice assistants would respond to questions about Black Lives Matter. And in the wake of the #MeToo movement, some users of Amazon’s Alexa noticed that it had started calling itself a feminist.

Our take

This is not the first time voice assistants have upset people with their answers. Earlier, Siri was accused of racism when it retrieved a response from a Wikipedia entry edited by a racist. For all its upside, speech recognition technology does not always handle sensitive context well. As Fast Company points out, companies are rapidly adjusting how their voice assistants respond to charged topics such as Black Lives Matter.

Do you think it is fair to hold companies accountable for the answers their voice assistants give?

[via Fast Company]