Beyond the Hype: Why We Should Cut the ChatGPT-Confused Lawyer Some Slack

Paul Galvin
3 min read · May 28, 2023


Can You Pick the Alligator Hiding Out at the Crocodile Convention?

There’s a story going around at the moment about a lawyer getting dragged pretty badly for incorrectly using ChatGPT in some legal filings. You can read about it here: https://www.theverge.com/2023/5/27/23739913/chatgpt-ai-lawsuit-avianca-airlines-chatbot-research

That lawyer doesn’t need me to defend him, and what he did is ultimately indefensible if the facts are as they seem. Accountability in life is important! But I think we should save our schadenfreude for better targets.

Some of you reading this are deep into ChatGPT and generative AI at this point. Some of you may only be ChatGPT-curious and dabbling. Many of you — and I think the vast majority of people at this point — lack the kind of experience you need to vet this stuff properly.

People are busy. I don’t know a lot of lawyers, but the ones I know are busier than the average office worker. And it’s not just lawyers. Everyone is busy. You, me, your colleagues.

At the same time, most folks aren’t reading or learning about ChatGPT in a systematic, organized manner. I think most folks pick up bits and pieces here and there. They read article after article and watch video after video about how awesome it is. The hype is extraordinary and relentless.

This is a perilous combo — everyone’s slowly getting pulled into a kind of ChatGPT gestalt. Even if we don’t approach our learning deliberately, we nonetheless learn by osmosis. Everyone’s picking up how important and helpful ChatGPT can be, but at the same time, they aren’t learning it well (osmosis is slow and imperfect).

You, dear reader, probably understand what “hallucination” means in the context of ChatGPT, but do our very busy colleagues, customers, and even family members know that? The word itself, “hallucination,” is weird. It’s not strong enough. Better to say that ChatGPT can be a very, very convincing liar. It invents facts (“hallucinates”) and presents them to you calmly and with great authority. Worse, it mixes these lies into a collection of truthful facts.

We all need to learn how to spot the alligator hiding out at the crocodile convention.

I know ChatGPT hallucinates. You know it does. I even understand why it does. But did that hyped-up, super-busy lawyer? The answer, it turns out, is no. That’s why I think we should cut him some slack. A lot of people have heard the word “hallucination,” but they don’t realize what it really means until they sit down and ask ChatGPT for help on some task. And for the purposes of this conversation, we’re all lawyers — we’re all busy trying to solve whatever problems come across our desks. Even if we personally avoid making mistakes, it’s inevitable that people in our circles will.

So let’s set the schadenfreude aside and figure out how we can prevent this from happening to us, our colleagues, and anyone else inside our circle of trust.


Paul Galvin

Author and Practice Director @ Neudesic/IBM. Father, Not-Great-But-Always-Trying Zen Buddhist. “Yet Another TypeScript Book” (https://amzn.to/2ABntAX).