Amazon’s Alexa and Apple’s Siri Promote Sexism, UN Report Finds


Have you ever wondered why most virtual assistants powered by artificial intelligence, such as Apple’s Siri and Amazon’s Alexa, use a female voice by default? The United Nations has given it some thought and concluded that the practice reinforces harmful gender stereotypes and that these virtual assistants promote sexism.

Do Amazon’s Alexa and Apple’s Siri promote sexism?

In a report titled “I’d blush if I could,” the UN said the use of a female voice supports the gender stereotype that women are subservient. The report studies the effect of bias in AI research and product development and the potential long-term negative implications of such trends.

The report, authored by UNESCO (the United Nations Educational, Scientific and Cultural Organization), says that because most voice assistants are female by default, they suggest that women are “docile helpers.” Another concern raised in the report is that when insulted, these assistants give “deflecting, lackluster or apologetic responses,” which the agency believes reinforces the bias that women are submissive and tolerate abuse.

“Companies like Apple and Amazon, staffed by overwhelmingly male engineering teams, have built AI systems that cause their feminized digital assistants to greet verbal abuse with catch-me-if-you-can flirtation,” the report notes.

Further, the report says voice assistants hold no power beyond what they are commanded to do; they simply honor commands and reply to queries regardless of the user’s tone. The UN also suggests that such gender bias could grow in the future, given that voice assistants are expected to become the primary mode of communication between hardware and software and that we will be surrounded by ever more Internet-connected gadgets.

What’s going wrong?

According to the report, tech companies have not installed proper safeguards against abusive, hostile and gender-biased language in their digital assistants. For instance, most voice assistants try to deflect aggression with a sly joke.

The UN report calls on tech companies to program their digital assistants in a way that discourages gender bias. Specifically, it asks companies to stop making female voices the default and to encourage greater participation by women in the field of artificial intelligence.

The UN argues that its findings reflect broader gender disparities in tech and the AI field. According to the report, women represent just 12% of the AI workforce and 6% of developers in the field. Although women now have more opportunities in computer science, many reportedly leave the field as their careers progress, a trend referred to as the “leaky pipeline.”

“I would say they are actually being forced out by a rather female-unfriendly environment and culture,” Women Leading in A.I. co-founder Allison Gardner said, according to The New York Times. “It’s the culture that needs to change.”

Are tech companies to blame?

Tech companies have their own reasons for using a female voice as the default. Business Insider reported last year that Amazon opted for a female voice because market research suggested it would sound more “sympathetic” and thus more helpful.

Microsoft says it named its assistant Cortana to benefit from the popularity of the AI character of the same name in its Halo video game franchise. For now, the company does not allow users to change Cortana’s voice to a male one, and there is no word on if or when it plans to add such an option.

As for Siri, it is a popular Scandinavian female name meaning “beautiful victory.”

All this suggests that tech companies’ decisions about the names and default voices of their digital assistants are based on extensive research and feedback. And it isn’t as if they are making no effort to break away from such gender stereotypes.

For instance, Google now presents its Assistant voice options, which include both male and female voices with various accents, as colors. The search giant no longer asks users to pick a male or female voice; instead, users select the color they want, and each color is randomly assigned to one of the voice options.
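To illustrate the idea, here is a minimal sketch of how such a scheme might work; the voice and color names are hypothetical, and this is not Google’s actual implementation:

```python
import random

# Hypothetical voice identifiers; a real catalog would differ.
VOICES = ["voice_a", "voice_b", "voice_c", "voice_d"]
COLORS = ["red", "orange", "green", "blue"]

def assign_voices_to_colors():
    """Randomly map each color label to a voice, so no color is
    permanently tied to a male or female option."""
    shuffled = random.sample(VOICES, k=len(VOICES))
    return dict(zip(COLORS, shuffled))

if __name__ == "__main__":
    mapping = assign_voices_to_colors()
    print(mapping)  # e.g. {'red': 'voice_c', 'orange': 'voice_a', ...}
```

Because the mapping is shuffled rather than fixed, users choose a color instead of explicitly choosing a gendered voice.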

Google also came up with an initiative called Pretty Please, under which young children are rewarded for using phrases like “please” and “thank you” when talking to Google Assistant. Amazon introduced a similar initiative to encourage polite behavior when talking to Alexa.

As of now, neither Apple, Amazon nor any other tech company has commented on the UN report.
