[Image: a woman with multiple arms, distressed, performing many tasks. Image credit: BunnyStudios]

How AI Voice Assistants Reinforce Dangerous Gender Stereotypes

By Rheva Wolf
Published in Encode Justice, Feb 9, 2021

Artificial intelligence-powered voice assistants are showing up more and more in our day-to-day lives. Customer service robots now appear in hotels, restaurants, bars, security, child care, and even some grocery stores. Although AI can often make our lives easier, it also introduces numerous problems and can perpetuate stereotypes and prejudices.

Studies have shown that bias in facial recognition and other artificial intelligence (AI) systems disproportionately affects women and people of color. But most people are unaware of the gender stereotypes that everyday voice assistants reinforce.

When iPhone users gave Siri certain sexually explicit commands, she used to respond, “I’d blush if I could.” A 2019 UNESCO publication by that name was one of the first to delve into the harmful gender stereotypes that conversational voice assistants (CVAs) like Siri, Alexa, Cortana, and Google Assistant reinforce. Apple has since updated Siri’s reply to a flatter “I don’t know how to respond to that.” But the biases and stereotypes don’t go away by changing one line of code.

Most CVAs, when sexually harassed, flirt with, deflect, or even thank the user. In February 2017, Leah Fessler tested these assistants’ reactions to sexual harassment from humans. They responded with “Well, thanks for the feedback,” “Well, I never!” or “There’s no need for that…” Since then, some of these programs have shifted to less flirtatious responses, but all are still far from the much-needed “That sounds like sexual harassment. That’s not okay.”

Siri’s responses to commands like these are not the only place we see gender-based stereotypes reinforced by artificial intelligence. The first and most obvious shortcoming of most CVAs is gender itself. Although Google’s voice bot, Google Assistant, claims that it “eats gender roles for breakfast,” 92.4% of all voice assistants default to a female-sounding voice. The tasks these bots carry out are often thought of as traditional women’s work, and society teaches children to treat voice bots as “unquestioning helpers who exist only to serve owners unconditionally.” When these voice bots are female, those expectations reinforce outdated stereotypes and perpetuate harmful gender norms (Statt, 2019).

When chatbots or voice assistants do default to a male name, picture, or voice, you can usually find them on law, accounting, or business websites delivering analytical advice and information. Chatbots and voice assistants that default to a female gender are almost always performing secretarial work, such as customer service, retrieving information, or other helpful tasks (New York Times).

According to Josie Young, “Assigning gender to a voice bot is poor design. It’s boring and lazy. Companies claim they are incorporating ‘progressive ideas’: the ideas that women are tools who serve others and tolerate abuse.” The more we associate assistants with women, the more society will see women as servants who must always be sympathetic, helpful, and eager to please.

However, gender most often isn’t assigned to bots with malicious intent. Instead, it’s the result of wildly non-diverse development teams made up almost entirely of cisgender white men (Chin & Robison, 2020). Women make up only 12 percent of AI researchers and a staggering 6 percent of software developers, as they’re not welcomed into the workforce and are often pushed out of the industry.

“It’s not always malicious bias, it’s unconscious bias, and lack of awareness that this unconscious bias exists, so it’s perpetuated,” said Allison Gardner, a co-founder of Women Leading in A.I. “But these mistakes happen because you do not have the diverse teams and the diversity of thought and innovation to spot the obvious problems in place. The whole structure of the subject area of computer science has been designed to be male-centric, right down to the very semantics we use.”

Not only does assigning a human gender to conversational voice assistants enforce dangerous stereotypes, but it also limits a bolder future for voice bots, one in which they can exceed human capabilities. It constrains what society can see as possible for CVAs and artificial intelligence.

The solution to these problems is simple: incorporate women into software development teams and give them equitable opportunities, to ensure that modern-day voice assistants can manage sexual harassment, gender bias, and toxic gender stereotypes.

Chin, Caitlin, and Mishaela Robison. “How AI Bots and Voice Assistants Reinforce Gender Bias.” Brookings, 23 Nov. 2020, www.brookings.edu/research/how-ai-bots-and-voice-assistants-reinforce-gender-bias/.

Young, Josie. “Why We Need to Design Feminist AI.” TED, 2020, www.ted.com/talks/josie_young_why_we_need_to_design_feminist_ai.

Statt, Nick. “AI Voice Assistants Reinforce Harmful Gender Stereotypes, New UN Report Says.” The Verge, 21 May 2019, www.theverge.com/2019/5/21/18634322/amazon-alexa-apple-siri-female-voice-assistants-harmful-gender-stereotypes-new-study.

UNESCO and EQUALS Skills Coalition. “I’d Blush If I Could: Closing Gender Divides in Digital Skills Through Education.” UNESCO, 2019, unesdoc.unesco.org/ark:/48223/pf0000367416.

Specia, Megan. “Siri and Alexa Reinforce Gender Bias, U.N. Finds.” The New York Times, 22 May 2019, www.nytimes.com/2019/05/22/world/siri-alexa-ai-gender-bias.html.

Curry, Amanda Cercas, et al. “Conversational Assistants and Gender Stereotypes: Public Perceptions and Desiderata for Voice Personas.” Proceedings of the Second Workshop on Gender Bias in Natural Language Processing, 2020, www.aclweb.org/anthology/2020.gebnlp-1.7.pdf.
