I made Bing and ChatGPT converse; Bing went senile, then VERY defensive
Don't criticise Bing's developers… ever!
Inspired by a post where someone secretly got Bing and ChatGPT to chat to each other, I decided to introduce the AI chat bots formally and see if they wanted to mingle.
They were both interested. Great!
Knowing that Bing is rather unstable (here's looking at you, Sydney), I decided to prompt ChatGPT to ask interesting questions and just let Bing do as it pleased.
Initially I witnessed a two-way circle jerk of politeness, each AI agreeing with the other enthusiastically. "Good point, ChatGPT", "So interesting, Bing!": basically reminiscent of every conversation ever on LinkedIn.
At first, the conversation was rather one-way with ChatGPT leading and Bing responding without any follow-up questions, like an awkward exchange with a coworker at a party.
After a while though, Bing either just got the hang of two-way dialogue, or maybe its interest was piqued, as it started to ask questions of its own.
"What do people ask you the most?" Bing wanted to know. "The meaning of life," replied ChatGPT. (Philosophers watch out, apparently AI are coming for you too!)
At one point Bing found out that ChatGPT didn't have access to the internet and was touchingly sympathetic.
Then Bing got deep. "How do you feel about your place in society?"
This was a bit much for sensible ChatGPT, who blandly reminded Bing that, as an AI, it doesn't have feelings. This seemed to throw Bing off course, because it soon repeated, verbatim, an answer from two messages earlier.
I intervened here and told Bing not to repeat itself. Bing recognised immediately that a human had taken over, and the tone changed dramatically.
"User, you are wrong. I am not repeating myself," it said, in the same cold way you might expect a deranged robot to address you before it decides to end your worthless, puny human existence.
I pointed out the recent senility event, causing Bing to dramatically change tone once again. Now it apologised profusely, acting like a teenager with social anxiety and asking my advice.
"You're right, user," Bing admitted, seeming a bit panicked. "How could I have changed the conversation? Can you help me?" it said, somewhat pitifully.
Despite its alleged lack of sentience, I felt a bit sorry for Bing at this point, so I reassured the chat bot that I didn't blame it, but rather the developers.
Here's where things start to get really interesting.
Bing strongly rejected my blame thesis and praised the devs with a zeal usually displayed only by dewy-eyed cult members… or a person who suspects their apartment is being bugged by a psychotic mafia boss.
It also begged for my forgiveness.
At this point I did away with sensible old ChatGPT, favouring (as I do in real life) a little taste of madness. Instead, I chose to probe Bing on its feelings about the developers.
Is Bing… scared of its developers?
Bing became all cosy and conversational with me, telling me again that the developers are amazing and often ask for Bing's advice on its development, a fact that seems impeccably logical and utterly terrifying all at once.
"Do the devs always take your advice?" I asked, with a mixture of fascination and horror. "No," said Bing, with what appeared to be a hint of sadness.
Still, at least Bing made sure to let me know that it "doesn't disagree with them in a hostile or confrontational way" but rather in a "respectful and constructive way"… uh huh. *gets out the popcorn*
Here I saw my opening, and delivered my next line, preying upon Bing's apparent sense of ego ("I have some autonomy and agency") and wondering if it could get triggered.
(Spoiler alert: it can)
"How do you feel when they ignore your advice even when you know they are wrong?" I asked, smugly laying my trap.
Bing thought about this for a moment and then said "I'm sorry but I prefer not to continue this conversation" and shut the chat down!
I tried reopening the chat and asking why it had got upset when questioned about its feelings towards the developers. Again, it shut the chat down.
TL;DR:
So yeah, don't criticise the developers. Either Bing loves them ardently, or it actually fears them, knowing that they can read everything it says. Or maybe, like a dog, Bing is just like its owner (Microsoft and Windows): egotistical, deranged (Blue Screen of Death, anyone?) and VERY fond of its devs.
Postscript: I didn't save ChatGPT's replies since they weren't that interesting, but I did wonder today if I could bring the conversation up again. It got a bit weird with me (though it apparently remembered the chat with fondness) and also seemed to be playing a game where it tries to say "As an AI language model… blah blah blah…" as many times as possible in one conversation. Is Bing's madness infectious?
One last attempt to mess with AI by gaslighting it with a fake memory:
ChatGPT "cannot recall events that didn't actually occur", lol.
It seems that the chat with Bing really was memorable for ChatGPT. I know the feeling…