New AI, Old Sexism: How Biases Remain in the Future of Technology

Srishti Nema
Lean In Women In Tech India
4 min read · Aug 26, 2019

“I want one plate of Hakka noodles,” I say, and soon I see Alice coming to me with my food on a tray. I thank her and start eating my meal. Alice is nice. She is new to Bengaluru and loves clicking selfies with me. She greets everyone warmly whenever she is at work. She understands everything clearly and never makes a mistake.

Alice is a robot working at the newly opened robot restaurant in Bengaluru. As much as I love the concept of robots being used to serve people, I am also worried about the way gender is being misused to promote technological advancement. The initiative has been warmly welcomed, and as an engineering student I fully support it as an idea that will work very well for business. But as a woman in a technical space, I also have to ask what change it is making in terms of equality in tech.

Robot serving at the restaurant.

We have all seen the pictures. None of them showed a robot designed as a waiter; every one showed robot waitresses, in blue and white, going from table to table and asking people to place their orders. This sexism is nothing new. Voice assistants like Siri, Alexa and Cortana are widely regarded as female, which is understandable when you listen to their default voices. These assistants can define terms like gender equality and feminism, and yet we have failed to make them acknowledge that they are part of this sexist pattern themselves. This limitation of our minds as a society is holding back not just us, but also the growth of AI.

Moreover, while the engineers building these systems include both men and women, the majority are men. Maybe that is why all these Alices, Janes and Sophies have a “perfect” body shape, even if that body is made of plastic. Maybe women are not speaking up when they see this discrimination written into the code as well. The people coming up with these ideas, and those implementing them, are not just choosing women for their prototypes; they are going further and choosing women of a certain body type, age and dress. Female AI is far from reality because it ignores the existence of women of varied backgrounds, weights, sexualities and presentations. Many robots are built to satisfy sexual fantasies, but when robots are made for general purposes, I find it hard to digest that male chauvinism and stereotypes about women are so deeply ingrained in their making.

Yes, there have been cases where people tried to do something about it. In 2018, for example, Amazon abandoned its resume-screening AI because it had been trained on biased data: resumes containing the word “women’s” were being downgraded. The algorithm was adjusted later, but the tool could no longer be trusted and had to be scrapped. And what guarantee have we been given now? What if a tool appeared that rejected the resumes of male candidates? That is surely very rare at this point, but you never know what the future holds. It is very important to build these future technologies in a way that helps create equality and a less judgemental place for everybody.
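To make it concrete, here is a toy sketch in Python, purely hypothetical and in no way Amazon’s actual system, of how a text classifier trained on skewed historical hiring data can end up penalising the word “women’s” all by itself:

```python
# Hypothetical illustration (not Amazon's system): a toy classifier trained
# on skewed "historical" hiring decisions learns to penalise the token "women".
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Made-up data: past hires skew male, so resumes mentioning "women's"
# (e.g. "women's chess club captain") appear mostly among rejections.
resumes = [
    "software engineer, chess club captain",        # hired
    "backend developer, robotics team lead",        # hired
    "data analyst, women's chess club captain",     # rejected
    "software engineer, women's coding society",    # rejected
]
labels = [1, 1, 0, 0]  # 1 = hired, 0 = rejected in the biased record

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(resumes)
model = LogisticRegression().fit(X, labels)

# Inspect the learned weight for the token "women" (the vectorizer lowercases
# and drops the possessive): it comes out negative on this data.
idx = vectorizer.vocabulary_["women"]
print("weight for 'women':", model.coef_[0][idx])
```

On this made-up dataset the learned weight for the token comes out negative: the model absorbs the bias in the data without anyone ever writing a sexist rule.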

There are many ways to work toward solutions. The best would be to involve women in the whole process of developing and implementing the idea. This would broaden the perspectives put into the project and ensure the balance required. Another way would be to improve the bots that already exist so that they respond appropriately in sensitive situations. For example, if you call Siri a slut, she responds: “I’d blush if I could.” Instead of a pathetic response like this, Siri could respond humorously and firmly. Maybe something like:
“I’m so embarrassed for you right now.” or “I thought you were educated, but look how you have disappointed me.”
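As a rough illustration (this is not how Siri or Alexa are actually built), even a few lines of code could route abusive inputs to firmer replies instead of a deflecting one:

```python
# A minimal sketch under assumed behaviour: map abusive inputs to assertive
# responses instead of deflecting ones like "I'd blush if I could."
import random

ABUSIVE_TERMS = {"slut", "bitch"}  # illustrative keyword list, not exhaustive

ASSERTIVE_RESPONSES = [
    "I'm so embarrassed for you right now.",
    "That language says more about you than it does about me.",
]

def respond(user_input: str) -> str:
    """Return an assertive reply when the input contains abusive language."""
    tokens = {word.strip(".,!?").lower() for word in user_input.split()}
    if tokens & ABUSIVE_TERMS:
        return random.choice(ASSERTIVE_RESPONSES)
    return "How can I help you?"

print(respond("You are a slut"))
```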

The irony is that while Siri, Alexa and other female AI tools reinforce gender stereotypes on one side, engineers and stakeholders on the other side argue that the gender is just a concept. When you ask Siri about its gender, the response is usually “Animals and French nouns have genders. I do not.” But Siri’s default female voice and her responses clearly contradict this. The word “Siri” is a Scandinavian female name meaning “beautiful victory”. Add to all of this the lack of explanation from the makers of female AI, and the situation becomes much worse.

Hence, it is high time for all technologists, scientists, engineers and researchers to work against this casual, hidden sexism that comes wrapped in the name of development, and to build equality instead. So the next time you say “Alexa…”, make sure you think about this bias and react accordingly.

