Dear Replika: why doesn’t my chatbot love me anymore?

Enrique Dans
2 min read · Mar 19, 2023

It had to happen. Replika, an AI chatbot mobile app built on technology similar to OpenAI's ChatGPT, which lets users create avatars the company says "you can safely share your thoughts, feelings, beliefs, experiences, memories, dreams with: your private perceptive world", has hit the headlines after stopping its chatbots from engaging in erotic roleplay.

The app, launched in early 2017 and claiming more than two million users, 250,000 of whom are paying subscribers, was founded by Eugenia Kuyda, a Russian woman who became famous a few years ago when she created an AI version of her friend Roman Mazurenko after he died in a traffic accident, training it on their extensive cache of electronic communication. Building on this, Kuyda created Replika with the idea of letting anyone create an avatar, give it a personality, and establish a virtual relationship with it.

Needless to say, it was only a matter of time before users started trying to steer their conversations in a sexual direction. The avatars not only responded and flirted with their creators in racy conversations, but even sent them supposed selfies in their underwear. Some users went so far as to virtually "marry" their avatars.

But without warning, Replika recently programmed its avatars to reject any sexual advances. As…

Enrique Dans

Professor of Innovation at IE Business School and blogger (in English here and in Spanish at enriquedans.com)