Why do most virtual assistants powered by artificial intelligence, such as Apple's Siri and Amazon's Alexa, have female names, female voices and often a submissive or even flirtatious style?
The problem, according to a new report released this week by UNESCO, stems from a lack of diversity within the industry that is reinforcing problematic gender stereotypes.
"Obedient and obliging machines that pretend to be women are entering our homes, cars and offices," Saniye Gulser Corat, UNESCO's director for gender equality, said in a statement. "The world needs to pay much closer attention to how, when and whether A.I. technologies are gendered and, crucially, who genders them."
The report borrows its title, "I'd Blush if I Could," from a standard response given by Siri, Apple's voice assistant, when a user hurled a gendered insult at it. And when a user tells Alexa, "You're hot," her typical response has been a cheery "That's nice of you to say!"
Siri's response was recently altered to a flatter "I don't know how to respond to that," but the report suggests that the technology remains gender biased, arguing that the problem starts with engineering teams that are staffed overwhelmingly by men.
"Siri's 'female' obsequiousness, and the servility expressed by so many other digital assistants projected as young women, provides a powerful illustration of gender biases coded into technology products," the report says.
Amazon's Alexa, named for the ancient library of Alexandria, is unmistakably female. Microsoft's Cortana was named after an A.I. character in the Halo video game franchise that projects itself as a sensuous, unclothed woman. Apple's Siri is a Norse name meaning "beautiful woman who leads you to victory." The Google Assistant system, also known as Google Home, has a gender-neutral name, but the default voice is female.
Baked into their humanized personalities, though, are generations of problematic perceptions of women. These assistants are putting a stamp on society as they become commonplace in homes across the world, and they can influence how people interact with real women, the report warns. As it puts it: "The more that culture teaches people to equate women with assistants, the more real women will be seen as assistants — and penalized for not being assistant-like."
Apple and Google declined to comment on the report. Amazon did not immediately respond to requests for comment.
The publication, the first to offer United Nations recommendations regarding the gendering of A.I. technologies, urged tech companies and governments to stop making digital assistants female by default and to explore developing a gender-neutral voice assistant, among other guidance.
The systems are a reflection of broader gender disparities within the technology and A.I. sectors, UNESCO noted in the report, which was released in conjunction with the German government and a coalition that promotes gender balance in the technology sector.
Women are grossly underrepresented in artificial intelligence, making up 12 percent of A.I. researchers and 6 percent of software developers in the field.
The report noted that technology companies justify their use of female voices by pointing to studies showing that consumers preferred female voices to male ones. But lost in that conversation is research showing that people like the sound of a male voice when it is making authoritative statements, but a female voice when it is being "helpful," further perpetuating stereotypes.
Experts say the bias baked into A.I., and the broader inequalities within the programming field, are not new, pointing as examples to an inadvertently sexist hiring tool developed by Amazon and to facial recognition technology that misidentified black faces.
"It's not always malicious bias, it's unconscious bias, and lack of awareness that this unconscious bias exists, so it's perpetuated," said Allison Gardner, a co-founder of Women Leading in A.I. "But these mistakes happen because you do not have the diverse teams and the diversity of thought and innovation to spot the obvious problems in place."
But the report offers guidance on education and steps to address the issues, which equality advocates have long pushed for.
Dr. Gardner's organization works to bring women working in A.I. together with business leaders and politicians to discuss the ethics, the bias and the potential for regulatory frameworks to develop the industry in a way that is more representative.
The group has published its own list of recommendations for building inclusive artificial intelligence, among them establishing a regulatory body to audit algorithms, investigate complaints and ensure that bias is taken into account in the development of new technology.
"We need to change things now, because these things are being implemented now," Dr. Gardner said, pointing to the rapid spread of A.I.-powered virtual assistants. "We are writing the standards that will be used in the future."
Dr. Gardner said changes are also needed in education, because the bias is a symptom of systemic underrepresentation within a male-dominated field.
"The whole structure of the field of computer science is designed to be male, right down to the very semantics we use," she said.
Although women now have more opportunities in computer science, more and more are disappearing from the field as they advance in their careers, a trend known as the "leaky pipeline" phenomenon.

"I would probably say they are actually being forced out by a rather female-unfriendly environment and culture," Dr. Gardner said. "It's the culture that needs to change."