What’s up with the gender bias for personal assistants?
With the rise of automated personal assistants — Alexa, Cortana, and countless others — developers face a big question: just how ‘human’ should these robotic assistants be? The more human we make them, the more important it seems to give them names, personalities and — more worryingly — genders.
But robotic assistants don’t have gender. Strip them of the names and voices added by their human creators, and there is nothing left that requires a chatbot to be ‘he’ or ‘she’, other than our own assumptions. Yet many chatbots do have names, and a disproportionate number of them are sold to us as ‘female.’
There are plenty of examples of individuals projecting personality and gender onto robots. More than 80% of people name their Roombas, and subsequently refer to them as ‘he’ or ‘she’ — looking on them fondly, the way you would a family pet.
When it comes to personal assistants, some would argue that giving your chatbot a name is important. It makes the AI seem friendlier and therefore perhaps easier for users to interact with. Asking ‘Hi Cortana, could you schedule an appointment for me?’ feels like asking for quick help from a friendly PA. Contrast that with: ‘OK Google, schedule an appointment’, which sounds like an order given to a machine.
Recently we wrote about Google’s rather unusual training programme — it is feeding its AI thousands of romance novels to encourage it to use the more ‘natural’ language patterns found in the books. But although it wants to humanise artificial intelligence, Google draws the line at giving its assistant a gender, or even a name.
“We always wanted to make [Google’s personal assistant] feel like you were the agent, and it was more like a superpower that you had and a tool that you used,” Google creative director Jonathan Jarvis told Business Insider. “If you create this personified assistant, that feels like a different relationship.”
Although it’s bucking the trend of giving personality to products, Google might in fact have the right idea.
Gender bias in personal assistants
Whenever we program personality into machines, to a certain extent we’re being led by our own assumptions. Amazon’s Alexa, Microsoft’s Cortana, Amy the meeting scheduler, and Vi the personal trainer: all are AI bots, designed to help you with various tasks. None of them is actually female, yet they are all presented as women, with female voices and female names. While there are some stand-out exceptions, the general rule for new robotic assistants seems to be that they have to be female.
Why is this? Creating overtly ‘female’ personal assistants causes a number of problems — not least the fact that Alexa and Cortana are apparently sexually harassed by some of their users. Many people would rightly point out that in a society that still hasn’t done away with casual sexism, giving people an automated, smiling female chatbot on which to project their misogyny is likely to have a broader impact on women as a whole. Wouldn’t it be better to simply program chatbots to be gender neutral?
“If a bot is doing its job properly then there is no need to sell it as a blonde, smiling woman,” explains Rainbird chairman James Duez in a recent BBC article discussing gendered chatbots. Many other robotics experts agree. Dr Kate Devlin, senior lecturer in computing and AI at Goldsmiths, University of London, has frequently explained why gender should not simply be assumed in robotics — rather, we should question why we feel the need to call robots either ‘he’ or — more commonly — ‘she.’
“In the gendering of robots, and the sexualised personification of machines, digital sexual identity is too often presumed, but to date little-considered,” she explained in a recent article.
Besides the overt problem of having multiple ‘female’ bots programmed to smile and serve, there are plenty of more subtle ways in which gendered AI affects our interactions. Our ingrained biases are woven into new technology via its programming. For instance, ‘female’ chatbots are often programmed to ‘flirt’ with customers — it’s hard to imagine a tech company programming a ‘male’ chatbot to do the same. Part of this problem stems from the makeup of the tech industry: there are far more male programmers than female ones. Until there is better representation of women in tech, the robotic assistants created to help all of us will be built mostly by men.
Your personal assistant should be helpful. They should be compliant. They should do what you say. But should they also be female? In creating more chatbots, are technology companies simply catering to our desire for robotic assistants with personality, or are they reinforcing our biases about gender, and the roles that women play?