ChatGPT is Lying. Now what?

Helge Skrivervik
mindset3.org
5 min read · Feb 19, 2023


Image © stock.adobe.com

How would you know if it’s lying? Most of us wouldn’t. Is that a problem? Only if you (blindly) believe what it’s throwing at you.

Which most people seem to do. At least if we’re to believe the media buzz, overflowing with ‘I asked ChatGPT about this and here’s what I got’ stories. From dating profiles to programming projects. Some fascinating because the results are really good, surprising, even impressive. Some scary because the user seems really naive — with an attitude almost like ‘ChatGPT said it, so it must be true’.

It isn’t, and that’s a big problem. Not so much with ChatGPT, which is doing what it was programmed to do, but with us. Our expectations are totally off. We really need to cool down our relationship (more like blind enthusiasm?) with these bots, build an understanding of what they can and cannot do, and then reset our expectations. Maybe simply ‘make thinking great again’ (see also Chatbots aren’t the Problem, We are…).

When you get into a car, you don’t expect it to fly. You don’t have to be an academic to know the basic laws of nature implicitly. Not so with ChatGPT: most of us have no idea what its limitations are. In a sense, it’s built to deceive. You start off with a few simple questions, get really impressed, continue with more, get even more impressed, and you’re hooked. By now you are ready to believe just about anything.
