Five Reasons Why ChatGPT will NOT take over your job

Afarin Bellisario
Published in The Counterview
Dec 27, 2022 · 3 min read

ChatGPT is all the rage these days. Articles by reporters in major newspapers and opinion pieces by economic gurus tout the demise of writing and writers (and coders) for good. Meanwhile, public fascination with the tool, which is still in beta, has overwhelmed the system, causing delays and malfunctions. But if you are a writer or a coder, rest assured: your job is secure, at least for now.

Here are five reasons why:

  1. ChatGPT is unreliable. Even the website of its developer, OpenAI, admits that it “occasionally generates incorrect information.” It actually does worse: when the program doesn’t know the answer to a query, it makes one up without admitting insufficient knowledge.
  2. It doesn’t know the difference between fact and fiction: ask the program to create a story about anything (say, aliens invading a Texas town) and it spews out a fictitious account that can contain harmful or biased statements, as the OpenAI website also admits.
  3. It is inconsistent, responding differently when the same query is worded differently: using the program properly requires stating, and sometimes parsing, the query in a specific manner, as explored in a number of papers, including one by Ben Dickson at VentureBeat.
  4. Its knowledge of the world is limited: the model is trained on selected databases and human expert responses, with a cutoff in 2021. Unlike a search engine, it doesn’t constantly interrogate the internet for new information.
  5. It consumes lots of energy, and the operating cost will be astronomical. An AI model such as ChatGPT, with billions of parameters, requires a great deal of processing power and memory to run, which makes it very energy-intensive. The program is also expensive to train and operate. The cost of training so far has exceeded $1B, and the system is still in its infancy, as the OpenAI CEO has admitted; it will cost many more billions to become fully operational. As for operations, TechCrunch estimates it would cost $87,000 per year to run just one instance of the program on cloud-based servers such as AWS, and many instances are needed to support the expected usage. That does not include the cost of software maintenance and upgrades.

The bottom line is that ChatGPT, even once it matures, will require humans to use it properly: to formulate the queries and check the output. While powerful, without a human operator it is good for little except generating misinformation on an industrial scale!

It is not a search engine, either, although it can sometimes be interfaced with one. So please don’t short Google stock just yet.

DALL-E’s version of ChatGPT

ChatGPT is a tool. The mistake is to overstate its capabilities and assign extraordinary power to it; we can’t look to it for solutions it can’t provide. Humans have always used instruments to help perform their tasks, and these days the tools are electronic. The typewriter and the word processor didn’t reduce the quality of writing; neither should smart tools. We now use spell checkers and editing programs instead of dictionaries and thesauruses, and more and more these programs use AI to do their job. That doesn’t make them authors. Grammarly corrects our grammar quickly, but it does not produce the next great book. The best smart tools “suggest” alternatives or assist with tasks; only humans make the final choice. The trick is to be the master of the tool, not a slave to it.
