Hey Siri, Why Are Most Digital Assistants Female?

How gender bias in AI perpetuates harmful stereotypes about women

Ally Bush
Fourth Wave

--

Photo by Omid Armin on Unsplash

When my mom got her first iPhone, my brother and I were ecstatic. Suddenly, we didn’t need to go to the family computer and type our questions into Google. We could just ask Siri, Apple’s AI assistant with a distinctly female voice.

We spent hours asking her everything under the sun: How many miles is it to the moon? What’s 5,678 divided by 1,234? Is it going to be sunny tomorrow?

Our mom was probably relieved; we weren’t asking her those questions anymore.

Siri was perfect: she was, and still is, the ever-subservient, helpfully docile in-house assistant. As is Alexa. And Cortana. And the on-hold voice you hear when you call your bank or doctor’s office. And most chatbots. And most of the other digital assistants we interact with every day.

It’s no coincidence that AI assistants skew predominantly female. Gender bias is rampant in AI technology, and as ChatGPT, DALL-E, and other AI software become more and more accessible, so too do opportunities to reinforce gender stereotypes and discrimination.

Female voices and centuries of data
