Unmasking AI: The Gender Bias Dilemma and Beyond

AI’s Secret Biases: Gender, Stereotypes, and the Way Forward

Varsha Sastry
AI Advances

--

Silent Prejudice: The Hidden Gender Bias in AI

Introduction

Welcome to the age of AI! Artificial intelligence is all around us, making our lives easier, more entertaining, and sometimes even a little spooky. But here’s the catch: AI has a hidden problem that needs a closer look. It’s called gender bias, and it’s connected to two other curious ideas: feminization and domestication. In this blog, we’ll dive into these topics, taking cues from books like “Weapons of Math Destruction,” the eye-opening Netflix documentary “Coded Bias,” and a useful article from The Global Observatory.

Peeling Back Gender Bias in AI

AI, that computer brain behind so many cool gadgets, isn’t as neutral as it seems. It can be a bit biased, just like the rest of us. This bias, also known as “algorithmic bias,” is what happens when AI unintentionally takes sides.

In her book “Weapons of Math Destruction,” Cathy O’Neil teaches us that AI algorithms can carry forward unfair patterns like gender inequality when the people who build them aren’t very diverse. When AI is trained on biased data, such as data showing men receiving more job offers than women, it will make unfair decisions too.
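
To see how this can happen, here is a minimal sketch in Python, not an example from the book, of a model learning bias from its training data. The data is entirely synthetic and the setup is a deliberately simplified assumption: gender is fed in as a feature and the historical hiring labels favor men, so two otherwise identical candidates end up with different scores.

```python
# Minimal sketch (synthetic data, hypothetical features): a model trained on
# biased historical hiring outcomes reproduces that bias in its predictions.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

# gender: 0 = woman, 1 = man; experience drawn from the same distribution for both
gender = rng.integers(0, 2, n)
experience = rng.normal(5, 2, n)

# Biased historical labels: men were hired more often at the same experience level.
hired = (experience + 2.0 * gender + rng.normal(0, 1, n)) > 6.0

X = np.column_stack([gender, experience])
model = LogisticRegression().fit(X, hired)

# The model "learns" the historical bias: identical candidates, different scores.
candidate_woman = np.array([[0, 5.0]])
candidate_man = np.array([[1, 5.0]])
print("P(hire | woman, 5 yrs):", model.predict_proba(candidate_woman)[0, 1])
print("P(hire | man, 5 yrs):  ", model.predict_proba(candidate_man)[0, 1])
```

Real systems are rarely this blunt; bias more often enters through proxy features that merely correlate with gender, which is exactly why it can be hard to spot.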

In the documentary “Coded Bias,” a striking revelation unfolds. It unveils how a relatively homogenous group of people has been driving the rapid development of AI and how their worldviews have inadvertently woven biases into the very fabric of code.

From Amazon’s recruiting tool, which may have unfairly excluded women candidates, to alarming discriminatory practices by landlords and banking institutions, to the misuse of code by police forces and government agencies, “Coded Bias” casts a wide net, pulling from a diverse array of real-world cases. These cases serve as compelling evidence for the need for stricter oversight and regulation of the algorithms that have become the silent architects of our lives.

This pivotal conversation has gained significant momentum, largely thanks to Joy Buolamwini, a research assistant at MIT’s Media Lab and the founder of the Algorithmic Justice League. Her groundbreaking discovery and subsequent studies exposed the biases in facial recognition software, particularly against darker-skinned individuals and women. Buolamwini’s work prompted some of the biggest tech giants, including Amazon and IBM, to reevaluate their practices and policies, highlighting the profound impact that one individual’s dedication to justice and equality can have on the technology landscape.
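
As a rough illustration of the disaggregated evaluation idea behind that work, here is a minimal sketch that compares error rates per demographic subgroup instead of reporting a single overall accuracy. The labels and predictions below are made up for illustration; this is not Buolamwini’s methodology or data.

```python
# Minimal sketch: evaluate a system's error rate separately for each subgroup,
# rather than averaging everything into one headline accuracy number.
# All values below are illustrative, made-up data.
import numpy as np

groups      = np.array(["lighter_male", "lighter_male", "darker_female",
                        "darker_female", "darker_female", "lighter_female"])
true_labels = np.array([1, 0, 1, 1, 0, 1])   # e.g., whether a face matches
predictions = np.array([1, 0, 0, 0, 0, 1])   # the system's output

for g in np.unique(groups):
    mask = groups == g
    error_rate = np.mean(predictions[mask] != true_labels[mask])
    print(f"{g:>15}: error rate = {error_rate:.2f} (n = {mask.sum()})")
```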

The article from The Global Observatory also chimes in, reminding us that gender bias in AI can lead to big problems. It often starts with the data used to train AI, which frequently reflects existing inequalities. This can hurt women and gender-diverse people when AI makes decisions in areas like hiring or the justice system. It’s as if AI is accidentally keeping unfair traditions alive.

The Feminization and Domestication of AI

Past research shows how gendered divisions are naturalized and reproduced through technology, and how the resulting stereotypes contribute to the gender gap in women’s participation in related fields.

Gender Stereotypes in Technology:

Gendered divisions in technology often perpetuate traditional stereotypes. Technology is frequently associated with “men’s power,” while women and girls are viewed as less tech-savvy and less interested in the field.

Feminization of AI:

  1. Feminine Personas: AI is often feminized through attributes like voice, appearance, and female names or pronouns. Virtual assistants like Alexa, Cortana, and Siri have typically been given feminine qualities, reinforcing traditional gender roles.
  2. Male Voices for Authority: Interestingly, male voices are preferred for tasks involving teaching and instruction, seen as more authoritative and assertive, while female voices are associated with assistance and support.

Gender Bias in Occupations:
AI can perpetuate gender biases in occupational roles. “Male” robots are often linked to security-related jobs, whereas robots in sectors that predominantly employ women, such as hospitality and retail, have been “feminized.”

This is what we mean by “domestication” — AI gets stuck doing things that fit old-fashioned ideas about what women should do. But here’s the twist: AI has so much more potential! It doesn’t have to follow old rules; it can do great things in all sorts of fields.

UNESCO Recommendations to Combat Gender Bias in Applications Using Artificial Intelligence

UNESCO, in collaboration with Germany and the EQUALS Skills Coalition, has undertaken a critical examination of the prevalent practice of projecting AI assistants, such as Amazon’s Alexa and Microsoft’s Cortana, as female in various aspects, including their names, voices, and personalities.

Their publication, titled “I’d Blush If I Could,” scrutinizes this global phenomenon and highlights how the feminization and domestication of AI has affected women:

Reinforcing Gender Bias:

The practice perpetuates and amplifies existing gender biases, reflecting and spreading societal stereotypes.

Normalizes Harassment:

It inadvertently normalizes sexual harassment and verbal abuse by creating AI systems that respond submissively.

Shapes Expectations:

It conveys messages about how women and girls are expected to respond to requests and express themselves, potentially reinforcing outdated gender norms.

Blames Women for Errors:

Often, these feminized AI systems become the “face” of glitches and errors that stem from hardware and software limitations, even though the underlying systems are predominantly designed by men.

So, What’s the Plan?

Now, let’s talk about where we go from here:

  • AI is spreading fast, but we need some rules to make sure it plays fair.
  • We’re still not very good at dealing with gender bias in AI. We need some solid ideas and rules to make sure AI is fair to everyone.
  • Gender bias in AI can happen because of how data is collected, processed, and who makes the AI programs. We’ve got to keep an eye on all these things.
  • More women in STEM jobs (Science, Technology, Engineering, and Math) can help fight gender bias in AI. But we’ve got to make sure these women don’t face any unfair treatment, like workplace harassment.
  • AI can actually help fight gender inequality, like in hiring. We can also use it to understand how different people are affected by AI.
  • We should design AI with a “human-centered” approach, thinking about how it can help people and treat them fairly.
  • “Fundamental rights impact assessments” are a mouthful, but they’re a great way to spot and fix biases in AI; a minimal sketch of one such check follows this list.
  • Everyone needs to work together to fight gender bias in AI — businesses, tech companies, schools, and organizations.
  • The United Nations is onto something good with their idea of a Global Digital Compact in 2024. It’s a big step toward making AI more equal and fair.
  • To make AI truly great, it has to be equal for all. That means it doesn’t just work for some people, but for everyone, no matter their gender, race, where they come from, or how much money they have.
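
As a small illustration of what such an assessment or audit might include, here is a minimal sketch of one common fairness check, the disparate-impact (“four-fifths”) ratio, applied to hypothetical model decisions. The data is made up and the 0.8 threshold is a widely used rule of thumb, not a legal standard.

```python
# Minimal sketch of a disparate-impact check: compare selection rates across
# groups and flag large gaps. Decisions and groups below are illustrative.
import numpy as np

def selection_rate(decisions, group_mask):
    """Share of positive decisions within one group."""
    return decisions[group_mask].mean()

# Hypothetical model decisions (1 = selected) and a gender attribute.
decisions = np.array([1, 0, 1, 1, 0, 0, 1, 0, 1, 0, 0, 0])
gender    = np.array(["m", "m", "m", "m", "m", "m",
                      "f", "f", "f", "f", "f", "f"])

rate_m = selection_rate(decisions, gender == "m")
rate_f = selection_rate(decisions, gender == "f")
ratio = rate_f / rate_m  # disparate-impact ratio

print(f"selection rate (men):   {rate_m:.2f}")
print(f"selection rate (women): {rate_f:.2f}")
print(f"impact ratio:           {ratio:.2f}")
if ratio < 0.8:  # the common four-fifths rule of thumb
    print("Potential adverse impact: investigate the model and its data.")
```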

Conclusion

In conclusion, AI, as a tool created by humans, has inevitably inherited our biases, whether they are related to gender, race, or other aspects of our society. The big difference is how these biases in AI can deeply affect people’s lives. Unlike human errors, AI biases can perpetuate and amplify themselves, impacting decisions about jobs, loans, housing, and even law enforcement.

This realization serves as a stark reminder of the responsibility we bear in developing and deploying AI. It is not enough to design technology that merely mirrors our existing biases; we must actively work towards AI that corrects and compensates for these biases. The ongoing conversation about AI, bias, and fairness underscores the importance of ensuring that technology, as a product of our collective intelligence, remains a force for good, promoting equality, justice, and inclusivity in our rapidly evolving world.

--

Envisioning and Advocating for Design-Driven Social Change. Founder @earthlingsdesign