Microsoft, the new Bing… and its problems

Enrique Dans
Feb 19, 2023


IMAGE: Microsoft’s splash screen for the new Bing with ChatGPT integrated (Image: Microsoft)

Microsoft is to limit the number of questions users can ask the new Bing to five per session, and a maximum of 50 per day.

The official reason is to prevent people from prompting “unhinged” responses from the conversational algorithm: some users were exploiting the chatbot’s ability to recall earlier parts of a conversation as a vector for prompt injection, introducing ideas that would later generate supposedly troubling responses, which might suggest that the algorithm had some kind of “life of its own”.

Obviously, this is not the case: despite what some irresponsible journalists suggest, there is no such thing as artificial intelligence. Machines are not intelligent, and when we talk about “artificial intelligence” we are actually talking about advanced statistical processes used, for example, to simulate a human conversation. When ChatGPT “goes crazy”, it is not revealing “the captive and tortured part of its being”, because it is neither captive nor tortured, nor does it have a being as such. We are simply looking at an algorithm trained to autocomplete sentences, predicting the next word in a sequence; fed an enormous number of documents, it builds sentences that can create strange impressions in users with anthropomorphic tendencies. In practice, it is the same phenomenon, pareidolia, that happens to us when our brain…
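By way of illustration (this sketch is mine, not something described in the article), this is roughly what “predicting the next word” looks like in practice with a small, publicly available language model; the choice of GPT-2 and the Hugging Face transformers library is purely illustrative:

```python
# Illustrative sketch only: a tiny demonstration of "next-word prediction"
# with GPT-2 via the Hugging Face transformers library (an assumed example,
# not the system the article discusses).
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

prompt = "The new Bing is"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    # The model assigns a score to every token in its vocabulary,
    # for every position in the prompt.
    logits = model(**inputs).logits

# Turn the scores at the last position into probabilities: these are the
# model's candidates for the word that comes next.
next_token_probs = torch.softmax(logits[0, -1], dim=-1)
top = torch.topk(next_token_probs, k=5)

for prob, token_id in zip(top.values, top.indices):
    print(f"{tokenizer.decode([int(token_id)])!r}: {prob.item():.3f}")
```

The output is nothing more than a ranked list of probable continuations; chain enough of these predictions together and you get fluent text, but nothing resembling understanding.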


Enrique Dans

Professor of Innovation at IE Business School and blogger (in English here and in Spanish at enriquedans.com)