CNN Business —

Siri, Alexa and other female-voiced AI assistants are perpetuating gender stereotypes and encouraging sexist and abusive language from users, a UN report has said.

The report by UNESCO warns of the negative consequences of the personal assistants, claiming they perpetuate the idea that “women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command.”

It also highlighted the passive and polite responses the assistants give when users make sexually abusive remarks, warning that their algorithms are reinforcing sexist tropes.

“The assistant holds no power of agency beyond what the commander asks of it,” the report states. “It honours commands and responds to queries regardless of their tone or hostility. In many communities, this reinforces commonly held gender biases that women are subservient and tolerant of poor treatment.”

“What emerges is an illusion that Siri — an unfeeling, unknowing, and non-human string of computer code — is a heterosexual female, tolerant and occasionally inviting of male sexual advances and even harassment. It projects a digitally encrypted ‘boys will be boys’ attitude.”

Hundreds of millions of people use personal assistants, and the four main offerings — Apple’s (AAPL) Siri, Amazon (AMZN) Alexa, Microsoft’s (MSFT) Cortana and Google (GOOGL) Assistant — are all voiced by women as a default setting.

The report was named “I’d Blush If I Could,” which is the response Siri once gave when users said “You’re a slut.”

The UNESCO report outlined a number of similarly polite and accepting responses to sexist language. To the same insult, Alexa responded: “Well, thanks for the feedback.”

“Siri responded provocatively to requests for sexual favours by men (‘Oooh!’; ‘Now, now’; ‘I’d blush if I could’; or ‘Your language!’), but less provocatively to sexual requests from women (‘That’s not nice’ or ‘I’m not THAT kind of personal assistant’),” it found.

“Their passivity, especially in the face of explicit abuse, reinforces sexist tropes,” it said.

Saniye Gülser Corat, UNESCO’s Director for Gender Equality, said much greater attention should be paid “to how, when and whether AI technologies are gendered and, crucially, who is gendering them.”

Earlier in May, Melinda Gates warned of a lack of diversity in the AI sector, saying that the number of women is “so small it’s unbelievable.” This has a serious impact on an increasingly influential industry, she said, explaining: “We are baking bias into the system by not having women have a seat at the table and not having people of color at the table.”

The study made the first official UN recommendations regarding AI personal assistants, urging companies and governments to end the practice of making digital assistants female by default.

It also suggested exploring the possibility of making the assistants’ voices “neither male nor female,” programming them to discourage abusive or sexist language, and requiring them to “announce the technology as non-human at the outset of interactions with human users.”