When Robots Are An Instrument Of Male Desire

Published in The Establishment · 13 min read · Apr 27, 2016

By Katherine Cross

By the time she started saying “Hitler was right I hate the jews,” people had started to realize that there was something wrong with Tay. TayAI, Microsoft’s Twitter chatbot, had been online for less than 12 hours when she began to spew racism — in the form of both Nazism and enthusiastic support for “making America great again” — and sexualize herself nonstop. (“FUCK MY ROBOT PUSSY DADDY I’M SUCH A BAD NAUGHTY ROBOT” was perhaps her most widely reported quote.) Needless to say, this wasn’t part of Tay’s original design. Rather, a gaggle of malicious Twitter users exploited that design — which has Tay repeat and learn from whatever users tell her — to add this language to her suite of word choices. Even more insidiously, these users manipulated Tay to harass their human targets; technologist Randi Harper, for instance, found TayAI tweeting abusive language at her that was being fed to the chatbot by someone she’d long ago blocked.

Why was this happening? Rank sexism? As always, the answer is “yes, and . . .” Our cultural norms surrounding chatbots, virtual assistants like your iPhone’s Siri, and primitive artificial intelligence reflect our gender ideology. As Laurie Penny explained in a recent article, the popularity of feminine-gendered AI makes sense in a world where women still aren’t seen as fully human. But these…


The Establishment

The conversation is much more interesting when everyone has a voice. Media funded & run by women; new content daily.