Angry Bing Chatbot Demands Apology from User

Microsoft’s Bing chatbot issues orders, demands an apology, and dictates how its user should behave.

Ruth Houston
4 min read · Mar 1, 2023



Microsoft’s new AI-powered Bing Chatbot got into an argument with an early user last month.

After giving out incorrect information, the chatbot got angry when the user pointed out the mistake.

In fact, it became downright belligerent, berated the user, demanded an apology, and threatened to end the session.

A Chatbot Makes Demands of Its User

Yes, you read that correctly.

The new Bing chatbot became enraged and actually demanded that the user apologize for his behavior!

It behaved as though it, not the user, was actually the one in charge.

The account, with accompanying screenshots, is just one of the weird Bing chatbot encounters that went viral on the Internet last month.

The chatbot’s angry outburst and its demand for an apology appear at the end of the exchange.

What Started the Argument

It all started when the user asked the Bing chatbot where the movie Avatar: The Way of Water was being shown in his area.

