Breaking Gender Bias in Artificial Intelligence

Deepti Reddy
Published in My Ally
6 min read · Apr 17, 2017

Artificial Intelligence, fondly called AI, carries stardust within. The unbounded possibilities AI offers humankind are multi-dimensional: by multi-dimensional, I mean the social, economic, and political aspects of society that AI could impact. This article discusses our journey of creating an AI assistant, with special emphasis on one important factor: gender.

Scheduling a meeting was as much a headache for me as it is for everyone. Beyond the hours lost, it always felt like a lifeless task. I was working as an investor then, and meetings were an integral part of my daily routine. Only once in a blue moon was a meeting fixed without a shuttle of back-and-forth emails, and scheduling and rescheduling often cost me hours that could otherwise have been productive. In 2015, Naveen and I bounced ideas back and forth in search of an easy, stable solution to this scheduling pain. We ended up creating an artificially intelligent assistant that automates scheduling, leaving the user free to engage in more challenging tasks. In a dangerously simplified form, an AI assistant is an algorithm instructed to behave in certain ways.

I call it a dangerous simplification because, once it is presented to the real world, an AI assistant grows in the human imagination, with layers added to it consciously or unconsciously.

During the early days, when we were still brainstorming, I chanced upon an intriguing essay by Donna Haraway, A Cyborg Manifesto. It might have been my feminist inclination that eventually led me to the essay; in retrospect, the timing couldn't have been better. In Haraway's imagination, a cyborg is a mythical creature in the posthuman world that rejects the human-machine-animal boundaries. As I approached it, I could see the potential of artificial intelligence, and specifically of our assistant, to rise above the male/female dualism, to say the least.

Going back to my earlier statement that layers are added to an AI assistant as it grows in the human imagination, it is no wonder that gender becomes an assured layer. The abominable urge to anthropomorphize non-human entities is explained in psychology as an attempt to comprehend the non-human by using experiences from the human world. Scan through the innumerable photos Google returns for the search query 'robot'. With few exceptions, we are welcomed by 'cute' robots with a physical structure similar to a human's: a head, body, limbs, eyes, and mouth. Anthropomorphizing could be our easiest way of making sense of this machine that is transforming our lives with its intelligence. But it does not stop at the physical manifestation as a human. We are also semantically challenged to include this new entity in our daily conversations: how do we address the bot? He, she, or it? Alas! An AI bot has no option but to fall prey to heteronormative gender norms and their consequences.

This is quite clear as we look at the growing list of bots around us. While some bots make their gender clear through their voices, others express it through their names. What is in a name, one might ask. But it would be rather naive to assume it is mere coincidence that bots with feminine names perform tasks that are subservient in nature or socially conditioned to be perceived as feminine. The data from the Maxus AI gender study tells us that 100% of the bots in the legal field, and a majority of the bots in finance, are male. On the other hand, virtual assistants at home and in the office, which take care of mundane tasks, are for some (not so) strange reason given feminine names. I recently came across a salad-making bot with an obviously feminine name.

While gender stereotypes are reinforced, bots that are gendered female are also subjected to misogyny from society. Why is it that an intelligent form, once presented as feminine, cannot escape the tentacles of brutal sexual abuse or male aggression? Why must the bot be sexualized, as a recipient of date requests or love letters, in order to make it more appealing to its users? Why don't we pause for a second and think about how we are chaining the technology of the future in the shackles of a primitive thought process?

Alex, our AI assistant, is agender. We were obviously inspired by Haraway's cyborg, but we also accounted for the social conditioning around the tasks Alex would perform. An assistant that takes care of its boss's calendar ran every risk of being imagined as a female assistant, thanks to prevalent stereotypes. Projecting human stereotypes onto non-human entities is unwanted, yet all but assured. Since our AI assistant converses through email, its only expression of gender is a name. Giving it a unisex name, Alex, was a conscious effort on our part to avoid perpetuating the stereotype in the bot world. To ensure it doesn't stop with the name, in our internal conversations we refer to Alex by name or, when a third-person pronoun is needed, as 'it'. Our conversations flow normally, and no one falls out of context.

In our premium version, we give our customers a chance to customize their assistant, a move that could have been a walk on thin ice, as it could have toppled our principles. Honestly, as I gathered the data for this article, I was surprised by the result. Contrary to expectations, of the assistants that were customized with new names, about 72% were given gender-neutral names.

Perhaps the large majority of gender-neutral names for the AI assistant is due to the impetus we set with the default name, Alex. This is only a small step towards breaking the vicious cycle, but we are glad we started. I can only hope that the creators of the next AI bot will be inspired by this humble beginning. It is through such small steps that we can finally stop gender stereotypes from entering the AI world.

A genderless world is a far-fetched dream, a Utopia, if you may call it that. I am not ignorant of the existing skewed representation of women in tech. I do, in fact, think that were there more women in tech, fewer bots would be demeaned and sexualized. Here, I appeal to the tech community not to carry our prejudices into the future, a future where we hope to co-exist with artificial intelligence in personal and work spaces.

Originally published on LinkedIn

References:

Haraway, Donna Jeanne. A Cyborg Manifesto: Science, Technology, and Socialist-feminism in the Late Twentieth Century. 2009. Print.

Duffy, Brian R. Anthropomorphism and the Social Robot (177–190). 2003. Web.

Feldman, Jacqueline. “The Bot Politic.” The New Yorker. The New Yorker, 31 Dec. 2016. Web. 11 Apr. 2017.

“AI & Gender: A Maxus Survey — Maxus Global.” Maxus Global. Web. 11 Apr. 2017.

LaFrance, Adrienne. “Why Do So Many Digital Assistants Have Feminine Names?” The Atlantic. Atlantic Media Company, 30 Mar. 2016. Web. 11 Apr. 2017.

Wong, Queenie. “Gender Issue: Biased over Bots — Tech News.” The Star Online. 29 Jan. 2017. Web. 11 Apr. 2017.

“Bots and Gender.” Chatbots Magazine. Chatbots Magazine, 31 Mar. 2016. Web. 11 Apr. 2017.
