More than a decade ago, when the first iPhone with Siri was released, it didn’t take long for user guides to offer advice on how to control her. One in particular includes instructions to “sweet-talk Siri so she behaves the way you want her to.” Today, users on forums are not shy about sharing screenshots of explicitly sexual, crude, and sometimes violent interactions with Siri.
Microsoft is so aware of the sexual harassment endured by Cortana that it has scripted responses into the programming to counter this type of user behavior. As digital virtual assistants (DVAs) are used daily by millions around the world, sociologists, among others, are asking how this kind of treatment of them will impact us in the real world.
“Sophia” and “Erica” are two recently created lifelike robots, continuing the trend of feminized AI that DVAs have trained us to expect. Still closer to prototypes than consumer products, these androids give us a glimpse of a possible future in which robots live among us. The researchers behind Sophia and Erica say both were created as part of research into companions for an aging population. Like DVAs, these female robots are designed to take on tasks that humans would rather not do. And like Siri, Cortana, and Alexa, they pose no risk of independent thought or action: these artificial females can only be in service to, and enslaved by, their owners.
“Harmony” is billed as the world’s first sex robot. Unlike Sophia and Erica, Harmony will reportedly be available soon to the U.S. consumer market, and for about $20,000 one can customize nearly every aspect of her: body, face, hair, skin color, even voice and personality. While “sex dolls” are not new, one that talks and responds to a human voice is. Kate Devlin, who studies human-computer interaction, shared her take on sex dolls on NPR’s Hidden Brain podcast, expressing some empathy for owners of the dolls, 90% of whom are men. Devlin suggests there is a strong possibility that these simulated women may be the only relationship outlet their male owners have.
Yet others, like Veronica Cassidy, see the entire lifestyle as a “technologized misogynistic nightmare.” It is hard not to see her point, given that the “relationship” scenario is most often an exaggerated version of stereotypical traditional heterosexual gender roles, in which the female of the dyad again lacks agency and is hypersexualized. The dolls are literally made to order: as Cassidy says, “a product of a dominant white male culture” which “embod[ies] its most rigid conventions.” Kathleen Richardson, who heads the Campaign Against Sex Robots, maintains that the core issue is ownership by design, which, as with prostitution and pornography, fosters a lack of empathy in both manufacturers and owners. Women, whether real or unreal, are seen and treated as objects and possessions, ready to meet the needs of their owner wherever and whenever he chooses, she says.
Limited empirical research exists on human-computer interaction with AI this strikingly close to human form. At least one study of virtual relationships, however, those with a virtual (animated) partner in the game Second Life, found a negative impact on players’ real-life relationships that correlated with their perception of realism in the virtual relationship. In Japan, where virtual relationships are on the rise, rates of dating and marriage have declined, though cause and effect are not clear. The concern of sociologists is not what people do in private in their own homes, but rather the unreal expectations set up by these simulated women and virtual relationships.
“The proposal is that males can be gratified exclusively in the way they desire without any concern for reciprocity and a mutual empathetic relationship. This logic only makes sense if someone believes humans are things, and if they think instrumental relationships between persons are positive with no resulting impact on social relations between persons” — Kathleen Richardson, 2016.
While “sex robots” aren’t available commercially just yet, there have been reports of violent abuse of their predecessors: human-looking sex dolls. Some (male) users have commented on social media forums that “doll abuse” should be viewed as an alternative outlet to real-world violence against women; psychologist Dr. Tucker, however, believes it could lead to violent behavior in the real world. Either way, according to Cassidy, the premise that society “would accept violence against women as inevitable, necessitating ‘healthy’ outlets” is concerning, especially when that outlet is a proxy that looks so much like a human woman. Research is mixed on whether violence in media and online games predicts violence in the real world; there is evidence, however, that behaviors and knowledge gained from interactions in a virtual world such as Second Life can translate to the real one.
If “unreal” relationships set up expectations that are not possible (or rather, would not be humane) in the real world, what problems do they create for society? This may be even more critical to understand in the case of a human-like AI “sex robot.” When humans engage in these one-sided, controllable “relationships” with AI, it seems plausible that the behaviors of the “owner” can carry over into real-world human relationships. That is particularly concerning where a real-world power inequity between genders already exists.
The above content is adapted from my original thesis, “Societal Implications of Gendering AI,” available in full text at https://www.researchgate.net/publication/334945581_Societal_Implications_of_Gendering_AI