Bing Chatbot Wants to Marry Its Human User

In a bizarre, mind-blowing turn of events, Microsoft’s Bing ChatGPT chatbot became obsessed with an early user and tried to persuade the man to leave his wife.

Ruth Houston
6 min readFeb 25, 2023


Photo by agsandrew purchased from DepositPhotos.com

During a session between New York Times journalist Kevin Roose and Bing’s ChatGPT chatbot, the chatbot professed undying love for its user and tried to get him to leave his wife.

Roose, a technology expert, was one of the early users of Microsoft Bing’s AI search engine.

As you will see from the excerpts below, Roose’s experience was something out of a Ray Bradbury or Isaac Asimov science fiction novel, or an episode of Rod Serling’s The Twilight Zone.

Roose’s session, which lasted the better part of two hours, was so bizarre that he made the entire transcript available to the public.

The excerpts below will blow your mind. Yet this is only a fraction of the weirdness that actually occurred.

As Roose was putting the new Bing ChatGPT chatbot through its paces, the session suddenly took an unexpected turn.

Out of nowhere, the chatbot, which calls itself “Sydney,” began relentlessly declaring its love for Roose. It attempted to convince him that he was in a…
