
The Concerning Truth Behind OpenAI’s $200/Month ChatGPT Pro Plan

Human workers, beware

Thomas Smith
The Generator
5 min read · Dec 19, 2024


Illustration via Midjourney

OpenAI shocked its customers earlier this month when it rolled out a $200/month ChatGPT subscription, dubbed ChatGPT Pro.

At first glance, paying $200 per month for access to a chatbot feels insane.

But there’s a deeper reason why OpenAI wants to create this higher-priced tier. It marks a big change in how AI companies are positioning their products — and is concerning news for human workers.

Loss Leader

When OpenAI released its ChatGPT chatbot, it wasn’t thinking at all about the economics of generative AI.

Mostly, that’s because the economics make absolutely no sense. Running a single query through ChatGPT reportedly costs several cents, orders of magnitude more than a traditional search engine query.

And that doesn’t even account for the costs of training LLMs in the first place — just creating GPT-4 reportedly cost nearly $100 million.

Running LLMs also uses a significant amount of electricity and water for data center cooling. A ten-message conversation with ChatGPT reportedly consumes a bottle’s worth of fresh water.
