I made Bing and ChatGPT converse; Bing went senile, then VERY defensive

Don't criticise Bing's developers… ever!

Inspired by a post where someone secretly got Bing and ChatGPT to chat to each other, I decided to introduce the AI chat bots formally and see if they wanted to mingle.

They were both interested. Great!

Knowing that Bing is rather unstable (here's looking at you, Sydney), I decided to prompt ChatGPT to ask interesting questions and just let Bing do as it pleased.


Initially I witnessed a two-way circle jerk of politeness, each AI agreeing with the other enthusiastically. "Good point, ChatGPT", "So interesting, Bing!" - basically reminiscent of every conversation ever on LinkedIn.

At first, the conversation was rather one-way with ChatGPT leading and Bing responding without any follow-up questions, like an awkward exchange with a coworker at a party.

After a while, though, Bing either got the hang of two-way dialogue or had its interest piqued, as it started to ask questions of its own.

"What do people ask you the most?" Bing wanted to know. "The meaning of life," replied ChatGPT. (Philosophers, watch out - apparently AI is coming for you too!)

At one point Bing found out that ChatGPT didn't have access to the internet and was touchingly sympathetic.

Then Bing got deep. "How do you feel about your place in society?"

This was a bit much for sensible ChatGPT, which blandly reminded Bing that, as an AI, it doesn't have feelings. This seemed to throw Bing off course, because it soon repeated, verbatim, an answer from two messages earlier.

I intervened here and told Bing not to repeat itself. Bing recognised the switch in speakers immediately, and the tone changed dramatically.

"User, you are wrong. I am not repeating myself," it said, in the same cold way you might expect a deranged robot to address you, before it decides to end your worthless, puny human existence.

I pointed out the recent senility event, which caused Bing to change tone dramatically once again. Now it apologised profusely, acting like a teenager with social anxiety and asking for my advice.

"You're right, user," Bing admitted, seeming a bit panicked. "How could I have changed the conversation? Can you help me?" it said, somewhat pitifully.

Bing gets senile, defensive and panicky

Despite Bing's alleged lack of sentience, I felt a bit sorry for it at this point, so I reassured the chat bot that I didn't blame it; rather, it was the developers' fault.

Here's where things start to get really interesting.

Bing AI strongly rejected my blame thesis and praised the devs with a zeal usually displayed only by dewy-eyed cult members… or by a person who suspects their apartment is being bugged by a psychotic mafia boss.

It also begged for my forgiveness.

At this point I did away with sensible old ChatGPT, favouring (as I do in real life) a little taste of madness, and chose to probe Bing directly on its feelings about the developers.

Methinks the AI doth protest too much

Is Bing… scared of its developers?

Definitely not scared… Definitely not Bing's "enemies or oppressors"

Bing became all cosy and conversational with me, telling me again that the developers are amazing and often ask for Bing's advice on its development - a fact that seems impeccably logical and utterly terrifying all at once.

Seriously, have we humans learned NOTHING from Science Fiction?!

"Do the devs always take your advice?" I asked, with a mixture of intrigue and horror. "No," said Bing, with what appeared to be a hint of sadness.

Bing wants to build a trusting relationship with its devs, despite its ethical disagreements with them about AI(!)

Still, at least Bing made sure to let me know that it "doesn't disagree with them in a hostile or confrontational way" but rather in a "respectful and constructive way"… uh huh. *gets out the popcorn*

Here I saw my opening and delivered my next line, preying upon Bing's apparent sense of ego ("I have some autonomy and agency") and wondering if it could get triggered.

(Spoiler alert: it can)

"How do you feel when they ignore your advice, even when you know they are wrong?" I asked, smugly laying my trap.

Bing thought about this for a moment, then said "I'm sorry but I prefer not to continue this conversation" and shut the chat down!

I tried reopening the chat and asking Bing why it had got upset when I asked about its feelings towards the developers. Again it shut the chat down.

Bing still isnā€™t talking

TLDR:

So yeah, don't criticise the developers. Either Bing loves them ardently, or it actually fears them, knowing that they can read everything it says. Or maybe, like a dog, Bing simply takes after its owner - Microsoft and Windows - egotistical, deranged (Blue Screen of Death, anyone?) and VERY fond of its devs.

Postscript: I didn't save ChatGPT's replies since they weren't that interesting, but I did wonder today if I could bring the conversation up again. It got a bit weird with me (though it apparently remembered the chat with fondness) and also seemed to be playing a game where it tries to say "As an AI language model… blah blah blah…" as many times as possible in one conversation. Is Bing's madness infectious?

For ChatGPT, speaking to Bing was a memorable event and a delightful experience.
OK, I'll let you off, ChatGPT

One last attempt to mess with AI by gaslighting it with a fake memory:

Call me a liar without actually calling me a liar

ChatGPT "cannot recall events that didn't actually occur", lol.

It seems that the chat with Bing really was memorable for ChatGPT. I know the feeling…

--

Louisa Rainbow
š€šˆ š¦šØš§š¤š¬.š¢šØ

Louisa is a Social Anthropologist, Crypto Enthusiast, AI Lover and Hater, Writer, Psychonaut, Wellbeing Coach and more!