The Concerning Truth Behind OpenAI’s $200/Month ChatGPT Pro Plan
Human workers, beware
OpenAI shocked its customers earlier this month when it rolled out a $200/month ChatGPT subscription, dubbed ChatGPT Pro.
At first glance, paying $200 per month for access to a chatbot feels insane.
But there’s a deeper reason why OpenAI wants to create this higher-priced tier. It marks a big change in how AI companies are positioning their products — and is concerning news for human workers.
Loss Leader
When OpenAI released its ChatGPT chatbot, it wasn’t thinking at all about the economics of generative AI.
Mostly, that’s because the economics make absolutely no sense. Running a single query through ChatGPT reportedly costs several cents, which is orders of magnitude more than a traditional search engine query.
And that doesn’t even account for the costs of training LLMs in the first place — just creating GPT-4 reportedly cost nearly $100 million.
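To see why the standard $20/month ChatGPT Plus tier looks like a loss leader, here’s a rough back-of-the-envelope sketch in Python. The per-query cost uses the article’s “several cents” figure; the monthly query volume per heavy user is purely an illustrative assumption, not a reported number.

```python
# Back-of-the-envelope math: why a chatbot subscription can be a loss leader.
# The per-query cost reflects the "several cents" figure cited above;
# the queries-per-user volume is a hypothetical, illustrative assumption.

COST_PER_QUERY = 0.03            # dollars, "several cents" per ChatGPT query
QUERIES_PER_USER_MONTH = 1000    # assumed heavy-user volume (hypothetical)
PLUS_PRICE = 20                  # dollars/month, ChatGPT Plus
PRO_PRICE = 200                  # dollars/month, ChatGPT Pro

inference_cost = COST_PER_QUERY * QUERIES_PER_USER_MONTH
print(f"Monthly inference cost per heavy user: ${inference_cost:.0f}")
print(f"Margin on Plus: ${PLUS_PRICE - inference_cost:.0f}")
print(f"Margin on Pro:  ${PRO_PRICE - inference_cost:.0f}")
```

Under those assumptions, a heavy user costs about $30 a month to serve, which puts the $20 plan underwater while leaving room under the $200 plan, and that is before training costs are amortized at all.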
Running LLMs also uses a significant amount of electricity and water for data center cooling. A ten-message conversation with ChatGPT reportedly consumes a bottle’s worth of fresh water.