How to solve the gender bias problem in machine learning

AI cannot be fair or gender-neutral by default. Here’s how to fix this.

Slava Polonski, PhD
10 min readFeb 22, 2022

Have you ever thought about why so many AI assistants have feminine names? Alexa, Siri and Cortana are all names that many people tend to associate with commands and requests, or simply the word “hey”. Even the world’s first chatbot from the 1960s had a female name: Eliza.

Apparently, the human brain is conditioned to like female voices. And companies routinely rationalize their gendered decisions by saying that “it’s much easier to find a female voice that everyone likes than a male voice that everyone likes”. But something feels awfully wrong here, right?

Gender bias in the tech industry runs deep. Pointing at the naming conventions of AI assistants, or even at the skewed gender ratios in STEM jobs, especially in AI research, is not enough. There are systemic problems with access to leadership positions, discriminatory workplace cultures and simply too many instances of harassment across the industry.

But perhaps the most pernicious gender bias is also the least visible.

We need to talk about the gender bias that is systematically embedded in the foundations of the technological applications we use on a daily…
