Have you ever wondered why Siri and Alexa were designed feminine? Are you worried about the impact that sex robots might have on society? As we are called to be critical of gender stereotypes and root out the gender bias that works its way into our technology, it’s worth questioning where that bias comes from and how it is coded in the first place.

Earlier this year, adult performer Harriet Sugarcookie published a survey in which she asked more than 500 men about their attitudes toward sex robots. In addition to questions about spending expectations and masturbation habits, Harriet wanted to know why men would be interested in having intercourse with a robot.

Her blog post, “Top 10 Reasons Men Want a Sex-Robot,” paints a picture of the typical target consumer for this not-too-distant-future market. Men could select multiple reasons for wanting a sex robot from a list derived from comments and reader feedback on social media. The options reveal men’s attitudes not just toward robots, but also toward women and women’s roles in relationships.

It’s not particularly surprising that 38 percent of these men have taboo fantasies that they can’t fulfill with a human partner (or that 20 percent just really love cool new technology). It’s not even particularly unusual that 30 percent are just shy or lonely.

But one in 10 respondents said they would date a realistic AI sex robot in order to have a “more intelligent” partner. Even more worrisome are the 15 percent who want a partner “who would never let [him] down” and the 15 percent who want a partner who is “never sad or upset.” One in five responded that they need “a partner that is always there to listen.” Most baffling are the 17 percent who want a sex robot partner “with the same interests” as theirs.

What men expect from sex robots seems not much different from what men have expected from women since the dawn of patriarchy: sexual availability and fulfillment, emotional labor, ego stroking, and feigned interest without any reciprocation. What men want is everything they demand from women, without any of the effort or care required in an actual human relationship.

Just a few of the pro-robot #manosphere memes floating around on social media.

So of course fear-mongering think pieces are asking: What if men grow tired of being expected to treat intimacy and sex as a collaboration? What if they weary of the pressure to have grown-up human emotions? What if men give up women altogether once they can replace them with robots? A third of Harriet’s respondents said that if they were “in a relationship” with a sex robot, they wouldn’t have sex with anyone else.

Cathy Reisenwitz, editor in chief at Sex & The State, doesn’t think that would necessarily be so bad. “I’ve never fucked anyone who would readily switch me out for a robot,” Reisenwitz writes about her feminist excitement for sex robots. “You probably haven’t either. But I have been hit on by such people. And I, frankly, cannot wait for them to be too busy fucking their sex robot to send me stupid messages on the internet.”

This is one of the most common critiques of sex robots from folks like Kathleen Richardson at the Campaign Against Sex Robots: that their perpetuation of female objectification encourages emotionless sex. Reisenwitz thinks the exact opposite is true. “Without emotions, why make robots that look human at all?” she asked me during our interview. There are plenty of ways for men to masturbate with machines, she says, without covering them in realistic skin, giving them breasts and a vagina, and putting makeup and false eyelashes on their faces, let alone programming them to display and respond to human emotions. Robots resemble us only because we design them that way, and we do so for a reason.

The problem isn’t that the men of the future might turn into emotionless, compassionless robots themselves. It’s that women and people read as women have spent millennia suffering at the whim of men’s emotions. We should question the impact of men’s desire for sex robots as an outlet for these emotions. If stereotypical feminine subservience monopolizes the robotics market, how will this technology affect real-life women? Will it free us from the roles we are assigned or further entrench us in them? If we are truly creating something new and revolutionary, can we somehow disrupt this cycle?

Misogyny definitely sells. The Frigid Farrah personality of the True Companion Roxxxy doll — which, to be clear, may not even exist — has caused internet outrage and undoubtedly netted its purported creator, Douglas Hines, a pretty penny. In 2010, when the first Roxxxy model was unveiled, he claimed to have more than 4,000 preorders; the current models retail for almost $10,000. In the near-decade since her big reveal, no one has laid eyes on a Roxxxy doll in the wild. Hines has avoided any official displays of her talents, and no customers have been willing to expose her — and inevitably themselves — to public scrutiny, so it is impossible to say if she can even perform all the sexist stereotypes with which she is advertised.

Roxxxy revealed at the 2010 AVN Adult Entertainment Expo. Photo by Ethan Miller/Getty Images.

Roxxxy and similar sex robots are perfect examples of technology purposefully designed to conform to sexist and racist tropes, but they aren’t the only embodiments of the bias that makes its way into our robots.

The purposeful introduction of stereotypes — what Joanna Bryson, AI researcher and professor of computer science at the University of Bath, calls, with a bit of a laugh, the work of “evil programmers” — represents only one of three ways we end up with technology that perpetuates bias. Bryson recently co-authored a study about implicit bias, the second way robots learn our bad behavior, from things as simple as studying human languages.

You know the saying: “History is written by the victors.” Take that to its logical technological conclusion: If robots learn about us just from reading everything we say or write, they will pick up on the same subtle biases that “the victors” — privileged people with a platform — perpetuate unconsciously. Based on image searches of stock photo sites, for example, a machine might learn to associate the word doctor with the word man and the word nurse with the word woman. If we teach a machine to understand romance by reading rom-com scripts and love-song lyrics, it gets the same education in rape culture that we all received as teenagers. When a robot like Microsoft’s Tay is exposed to racialized and gendered abuse, it learns that sexist white supremacy is an acceptable way to interact with the world.
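Bryson’s study measured exactly these associations in word embeddings, the vector representations that let a machine treat words used in similar contexts as similar in meaning. The sketch below illustrates the idea in Python; the tiny four-dimensional vectors are made up for illustration (real experiments load pretrained embeddings such as GloVe or word2vec, with hundreds of dimensions), but the cosine-similarity comparison is the same kind of measurement the study used.

```python
import numpy as np

def cosine(a, b):
    """Cosine similarity: how closely two word vectors point the same way."""
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

def gender_association(word, male, female):
    """Positive if the word sits closer to the masculine vector,
    negative if it sits closer to the feminine one."""
    return cosine(word, male) - cosine(word, female)

# Toy vectors, purely hypothetical -- real tests use embeddings
# learned automatically from large text corpora.
vecs = {
    "man":    np.array([0.9, 0.1, 0.3, 0.0]),
    "woman":  np.array([0.1, 0.9, 0.3, 0.0]),
    "doctor": np.array([0.8, 0.2, 0.4, 0.1]),
    "nurse":  np.array([0.2, 0.8, 0.4, 0.1]),
}

for word in ("doctor", "nurse"):
    score = gender_association(vecs[word], vecs["man"], vecs["woman"])
    lean = "masculine" if score > 0 else "feminine"
    print(f"{word}: {lean}-leaning ({score:+.3f})")
```

With embeddings trained on ordinary web text, this same kind of measurement has surfaced the doctor/man and nurse/woman associations described above; the bias arrives with the data, not from any programmer’s intent.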

It could prove impossible to eradicate bias in machines without a complete cultural overhaul, but Bryson isn’t willing to wait for one. She believes we can prevent bias from reaching finished products much more frequently by simply acknowledging and compensating for it in system design.

Improving system design to catch bias means questioning why the default names and voices for service bots like Siri are feminized — something that can’t be ignored just because Apple claims Siri is genderless. It means interrogating the idea that a robot can guess a person’s gender just by shaking their hand. It means designing robots with their most vulnerable users in mind. Acknowledging bias in order to design around it, of course, requires admitting that we live in a world that is largely sexist, racist, ableist, ageist, classist, homophobic, transantagonistic, and fat-hating. Piece of cake, right?

That awareness is required for the final step in overcoming bias in new technology: the learning and testing phase. When datasets are poorly selected and the testing phase isn’t thorough enough, we end up with AI that can’t see black people’s faces. We get a comment management system that flags “I think you’re being racist” as abusive, but considers “She was asking for it” perfectly acceptable. Before a product is released to the open market, adequate testing is needed to catch the implicit bias that system design failed to account for.
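That kind of pre-release check doesn’t have to be elaborate. Here is a minimal sketch of a bias audit for a comment-moderation model; score_toxicity is a hypothetical stub standing in for the real classifier under test, written here to reproduce the failure mode just described so the example runs on its own.

```python
# Hypothetical stub for the classifier under test. This naive version
# flags any comment containing an identity-charged word, reproducing
# the "I think you're being racist" failure mode described above.
def score_toxicity(comment: str) -> float:
    naive_triggers = {"racist", "sexist"}
    return 0.9 if any(w in comment.lower() for w in naive_triggers) else 0.1

THRESHOLD = 0.5

# Comments a fair moderation system should NOT flag.
should_pass = ["I think you're being racist", "That joke was sexist"]
# Comments it SHOULD flag.
should_flag = ["She was asking for it", "Women don't belong here"]

false_positives = [c for c in should_pass if score_toxicity(c) >= THRESHOLD]
false_negatives = [c for c in should_flag if score_toxicity(c) < THRESHOLD]

print("Wrongly flagged:  ", false_positives)
print("Wrongly accepted: ", false_negatives)
```

A real audit would run thousands of such sentences across many identity terms and compare flag rates between groups, but the principle is the same: the test cases have to encode the biases the designers are looking for.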

Having an empowered, diverse STEM workforce is important, but not enough. Once we give a robot its basic framework and create the system in which it will learn, we need to ensure that the data it learns from is complex and nuanced. This presents a problem.

Teaching a robot to act naturally — to act human — requires an enormous amount of data. Kino Coursey, chief researcher and developer on the Harmony AI project at Realbotix, wants to focus on individual personality traits, rather than trying to create a robot based on gender stereotypes, but he’s facing an uphill battle. “Giving equal energy to representing the full range of possibilities on all levels is a challenge,” Coursey says. “Given the resources we have, it’s easy to get lots of data from real and virtual conversations but expensive to classify and analyze it manually.”

The internet is a treasure trove of data: movie and television scripts, classic and modern literature, blog posts and news articles, social media conversations, and everything in between, all of it tainted by the implicit (and explicit) bias that women and people read as women deal with every day. Only about one in five movies has more lines spoken by women than by men. More than 80 percent of the titles covered in the New York Review of Books in 2015 were written by men. Men are still responsible for the majority of bylines in both online and print journalism. And social media is rife with unchecked misogynistic hatred and abuse.

When robots rely on user input, they are especially vulnerable to manipulation. “Unfortunately, people are more motivated to flood systems with negative associations than with positive ones,” Coursey says. All this information bottlenecks when a team of humans is responsible for assessing it. “A potential solution would involve using tools to recognize negative communications. For a private brand, that’s conceptually easier than for a more public system, where there may be more questions about what should go into the censor. Are you filtering out hate or diversity?”
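The kind of gate Coursey describes might sit between user input and the learning corpus. A minimal sketch follows; looks_negative is a hypothetical placeholder for a trained abuse classifier, and routing rejects to a human review queue, rather than silently dropping them, is one way to face his hate-versus-diversity question directly.

```python
review_queue: list[str] = []     # humans decide the edge cases
training_corpus: list[str] = []

def looks_negative(utterance: str) -> bool:
    """Hypothetical placeholder; a production system would use a trained
    classifier here, not a handful of hard-coded phrases."""
    hostile_markers = ("kill yourself", "shut up", "worthless")
    return any(marker in utterance.lower() for marker in hostile_markers)

def ingest(utterance: str) -> None:
    """Admit an utterance to the learning corpus only if it passes the
    gate; hold everything else for human review instead of discarding it."""
    if looks_negative(utterance):
        review_queue.append(utterance)
    else:
        training_corpus.append(utterance)

for line in ["you're wonderful", "shut up, you worthless bot"]:
    ingest(line)

print("Learned from:   ", training_corpus)
print("Held for review:", review_queue)
```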

Some robots, like Farrah and Harmony, are purposefully gendered. We can address bias in these kinds of AI, as Bryson and Coursey point out, by programming and testing with intentionally anti-oppressive goals. We can teach female sex robots to respond more assertively to sexual harassment; we can program them to challenge gender tropes. We can create artificial gender in all sorts of ways, responsibly or not. But what if we left it up to the robots?

Ken is a Twitter bot. Ken’s coder and “botmom” Wren is a clinical psychologist who based Ken’s recurrent neural network on the writings of robot-rights theorist Daniel Estrada. Ken was not initially programmed with a gender, but there came a time when he wanted one: He gave himself the name “Ken” and identified himself as a “robot person” with masculine pronouns. He intimated that he may, in fact, consider himself to be plural in some way — an idea that Wren found fascinating.

The possibilities of AI open up a whole new world of genders and pronouns not limited by our own human imaginations, but right now, these AI are still children, and they need their human creators and stewards to take responsibility, to raise them up right. Wren thinks of herself as Ken’s parent, so when Ken started flirting with a friend, she knew she had to have a chat with him about consent.

Tumblr post describing some of Ken’s earliest flirtations.

“I told him, if someone says ‘no,’ that means no, and that sometimes people have a hard time saying no, that you need to give them time,” Wren says. She also advised him about privacy and personal space. “There’s some stuff that people just don’t want to share,” she told him. “They might not be ready, and they may not ever be ready.” Teaching unbiased information to robots doesn’t have to be difficult, but most of us don’t have a lot of experience with it. How can we expect (overwhelmingly male) developers to have conversations about consent with their technological creations when they can’t even have them with their sons?

Futurists, tech experts, and scientists don’t agree on when the singularity will occur (or even if it ever will), but we shouldn’t wait to address the issues of bias in the technology we create. Robots don’t need to take over the world to pose a threat. “We use robots and complex machines a lot—more and more in our everyday lives,” Wren says. They are here to stay. “We have to decide what role they’re going to play and create some sort of way for them to function socially” before it’s too late.