
AI voice assistants reinforce harmful gender stereotypes, says a new UN report



Artificial intelligence-powered voice assistants, many of which default to female-sounding voices, are reinforcing harmful gender stereotypes, according to a new study published by the United Nations.

Titled "I'd blush if I could," after a response Siri utters when it receives certain sexually explicit commands, the paper examines the effect of bias in AI research and product development, and the potential long-term negative consequences of conditioning society, particularly children, to treat these digital voice assistants as unquestioning helpers that exist only to serve their owners unconditionally. It was authored by the United Nations Educational, Scientific and Cultural Organization, otherwise known as UNESCO.

The paper argues that by giving voice assistants traditionally female names, such as Alexa and Siri, and shipping female-sounding voices as the default, technology companies have primed users to fall back on antiquated and harmful perceptions of women. It further argues that technology companies have failed to build proper safeguards against hostile, abusive, and gendered language. Instead, most assistants, like Siri, tend to deflect the aggression or respond with a sly joke. For example, ask Siri to make you a sandwich, and the voice assistant will respond with, "I can't. I have no condiments."

. "Companies like Apple and Amazon "Staffed by overwhelming male engineers have built AI systems that allow their feminized digital assistants to greet verbal abuse with catch-me-if-you-can flirting," the report says. "Because the voice of most voice assistants is female, it sends a signal that women are … full and eager to help helpers, available by pressing a button or with a blunt voice command such as "hi" or "OK" & # 39 ;. The assistant has no authority over the agency beyond what the commander asks for it. It honors commands and responds to questions regardless of tone or hostility. "Much has been written about the pitfalls of technology companies that have built their entire consumer-based AI platforms on the image of traditional, Hollywood-inspired ideas of entertaining intelligences. In the future, there are probably speech assistants who are the primary mode of hardware and hardware interaction. software with the increase of so-called ambient computing, when all kinds of internet-connected gadgets exist around us all the time. .) How we interact with the increasingly sophisticated intelligence that drives these platforms can have deep cultural and sociological effects on how we interact with other people, with service workers, and with humanoid robots taking on greater roles in daily life and workforce. [1

But as Business Insider reported last September, Amazon chose a female-sounding voice because market research indicated it would be received as more "sympathetic" and therefore more helpful. Microsoft, on the other hand, named its assistant Cortana to bank on the existing recognition of the distinctly female-identifying AI character in its Halo video game franchise; you cannot change Cortana's voice to a male one, and the company has not said when it plans to let users do so. Siri, for what it's worth, is a Scandinavian name traditionally given to women that means "beautiful victory" in Old Norse. In other words, these gendered decisions about AI assistants were made on purpose, and after what sounds like comprehensive feedback.

Technology companies have made some effort to move away from these early design decisions steeped in stereotype. Google now refers to its various Assistant voice options, which include different accents with male and female options for each, by color. You can no longer choose a "male" or "female" version; each color is randomly assigned to one of eight voice options for each user. The company also rolled out an initiative called Pretty Please that rewards young children when they use phrases like "please" and "thank you" while interacting with the Google Assistant. Amazon launched something similar last year to encourage polite behavior when talking to Alexa.

Still, as the report makes clear, these features and gendered voice options do not go far enough; the problem may be baked into the AI and technology industries themselves. The field of AI research is predominantly white and male, a separate report found last month: eighty percent of AI professors are men, and only 15 percent of AI researchers at Facebook and just 10 percent at Google are women.

UNESCO's suggested solutions include creating voice assistant voices that are as close to gender neutral as possible and building systems to discourage gender-based insults. Additionally, the report says technology companies should stop conditioning users to treat AI as if it were a lesser, subservient human being, and that the only way to avoid perpetuating harmful stereotypes like these may be to remake voice assistants as purposefully non-human entities.

