
UN study finds female voice assistants reinforce harmful stereotypes



The report, which focuses on closing gender divides in digital skills education, encourages companies like Google, Amazon, Apple and Microsoft to stop making digital assistants female by default. Better still, the report says, would be to make them genderless – something Google has attempted by labeling its voices with colors like red and orange rather than male or female. The UN also urges technology companies to address the gender gap, noting that women are 25 percent less likely than men to have basic digital skills.

This is not the first time the gendering of AI assistants has been questioned. Siri changed her original answer to being called a b-word. But the report says, "The assistant's submissiveness in the face of gender abuse remains unchanged." The document includes a chart showing how the four leading AI voice assistants respond to being called "hot," "pretty," "slutty" or "a bad girl." More often than you would hope, they express gratitude. In other cases, they respond with a joke or claim not to understand.

The problem will only become more urgent as AI assumes a greater role in our lives. According to the report, voice-based web searches now make up almost a fifth of mobile searches, a figure expected to reach 50 percent by 2020. As the UN puts it, people will soon have more conversations with digital assistants than with their spouses – hopefully they will treat both with respect.

Engadget has reached out to Apple, Amazon, Google and Microsoft for comment.
