The Fate of the Lawyer Who Cited Fake Cases from ChatGPT

Thomas Smith · Published in The Generator · 4 min read · Mar 4, 2025


Illustration via Ideogram

When attorney Thad Guyer filed a legal brief in an employment law case, it seemed like the kind of thing a lawyer would do thousands of times over their career.

Only this time was different. Guyer was about to land himself in hot water and risk his entire career.

Guyer had used ChatGPT to help him and his client prepare the brief. Not understanding how these models work, he had given the brief a cursory read-through, but hadn’t thoroughly checked it.

The brief cited a variety of legal cases. It turned out that many of them were misquoted or totally irrelevant — they were blatant hallucinations dreamed up by the LLM.

The response was immediate and serious. Guyer faced sanctions from the court, as well as the possibility of being disbarred and losing his license to practice law.

Now, we’ve finally learned about Guyer’s fate. A federal judge, Thomas Cullen, reviewed the case and decided whether sanctions should proceed.



Written by Thomas Smith

CEO of Gado Images | Content Consultant | Covers tech, food, AI & photography | http://bayareatelegraph.com & https://aiautomateit.com | tom@gadoimages.com
