Will ChatGPT replace you?

Theo Seeds
11 min read · Jan 30, 2023


Imagine: the year is 2005, you’re 18 years old, and you’re headed off to college for the first time.

Your school makes you pick a major right away. You’re a practical person, so you decide to study accounting.

“It’s not glamorous,” you say. “But it’s stable work, I’ll get paid well, and I’ll get to have a good life. After all, people are always gonna need people to keep track of their money. Right?”

Wrong.

Here’s the problem: services like QuickBooks and TurboTax are making it way too easy for people to do everything an accountant does, all by themselves. When you can do your taxes in an hour, why hire an accountant?

Today, a lot of CPAs are calling it quits and switching industries. They got replaced by software.

Garry Kasparov, the first man ever to be replaced by software.

Getting replaced by software

Unless you’ve been living under a rock, you’ve probably heard about ChatGPT — the most advanced artificial intelligence that’s ever been released to the public.

You can use it for almost anything. You can tell it to teach you how to play “Go”. Or you can tell it to compose a sonnet about butterflies in iambic pentameter. Or you can tell it to write code. Or you can tell it to write an article kinda like this one.

Did I use ChatGPT to write this article?

ChatGPT can write essays good enough to pass a 12th-grade English class, with minor tweaks by the student. Schools are freaking out because they can’t tell an AI essay from a normal essay. Some schools are even making kids write the first drafts of their essays in person.

And ChatGPT just made it to an interview round. One company ran a hiring audit: they had ChatGPT write a résumé and cover letter, then applied with them. Guess who got called in for an interview?

If ChatGPT can write quality 12th-grade essays and corporate-quality job applications, what else can it do?

Some people say that ChatGPT is the first harbinger of the apocalypse. The AI revolution is coming, and we’re all gonna lose our jobs.

Other people say that ChatGPT is no big deal. It’s just a tool, and a human will always be better than an AI. We have nothing to worry about.

I don’t think either of them is right. And here’s why.

How does ChatGPT work?

ChatGPT is basically a super-powered autocomplete. It was trained on a gigantic slice of the internet, and when you ask it a question, it generates an answer by predicting, word after word, what a good response would look like based on everything it has read. (It isn’t searching the web in real time; it’s drawing on what it absorbed during training.)

In a sense, ChatGPT goes one step further than Google. When you ask Google a question, Google finds a bunch of websites that are related to your query and then lists them for you. When you ask ChatGPT a question, it skips the list and hands you a ready-made answer stitched together from everything it has seen.

That’s all ChatGPT does. It doesn’t actually have ideas of its own. Every single thing ChatGPT spits out is a synthesis of stuff that’s already on the internet.
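If you’re curious what “predicting the next word” actually means, here’s a deliberately tiny sketch in Python. To be clear, this is nothing like the real thing: ChatGPT runs a neural network with billions of parameters, while this toy example just counts which word tends to follow which. It only exists to illustrate the idea.

```python
from collections import defaultdict, Counter

# A toy "language model": count which word tends to follow which word.
# Real models like ChatGPT learn far richer patterns with neural networks;
# this is only meant to illustrate the idea of next-word prediction.
training_text = (
    "the cat sat on the mat "
    "the dog sat on the rug "
    "the cat chased the dog"
)

counts = defaultdict(Counter)
words = training_text.split()
for current_word, next_word in zip(words, words[1:]):
    counts[current_word][next_word] += 1

def predict_next(word):
    """Return the word most often seen after `word` in the training text."""
    if word not in counts:
        return None
    return counts[word].most_common(1)[0][0]

# Generate a short "completion" starting from a prompt word.
word = "the"
sentence = [word]
for _ in range(4):
    word = predict_next(word)
    if word is None:
        break
    sentence.append(word)

print(" ".join(sentence))  # prints: the cat sat on the
```

The real model does something vastly more sophisticated, but the spirit is the same: it produces the kind of text that tends to follow your prompt, based on what it has already seen.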

What does that mean? Well, it means ChatGPT can help you solve problems the way everyone else would solve them. It can do lots of things at a novice level, a few things at an intermediate level, and basically nothing at an expert level.

If that’s hard to picture, chess is a great analogy. At the lower levels of chess, players don’t really win games. They only lose games. They make a bad move, they lose their queen, and then they get crushed. So when you’re new at chess, you win by not making mistakes.

But at the upper levels of chess, players win games. They see an incredible combination of moves that their opponent doesn’t see — like a brilliant queen sacrifice, or a 6-move pattern that leads to forced checkmate.

Same thing with painting. When you’re first starting out, you should just focus on copying what you see. But once you’re good, you can start breaking the rules.

Could an AI have made this?

Same thing with just about everything. Most people can’t be brilliant when they’re first getting started. They can’t come up with new, amazing ideas. All they can do is follow the rules.

However, once you reach the upper levels of something, you can start breaking the rules. That’s because you understand the rules well enough to see the exceptions.

This is how human society makes breakthroughs: it’s how you get cubism, iPhones, and general relativity. People have to understand what they’re doing well enough to break the rules and create something new.

ChatGPT can spit out things other humans already know. But it doesn’t really understand anything — it just copies what other people say. It can’t use the information it spits out to come to conclusions — only a human can.

Following the herd.

Imagine you’re in Scotland. It’s the 10th century A.D., and you’ve decided to make a living herding sheep.

How do you start? Well, sheep evolved to follow a leader. If you’re a sheep, you look for the alpha sheep and follow it around.

A human can hijack this system by becoming the sheep’s leader. If you do that, the sheep will start following you around. Then you can herd them pretty much however you want.

You can do the same thing with horses, cows, pigs, and just about every other animal we’ve domesticated. You can’t do it with zebras, which is why we don’t have zebras running in the Kentucky Derby.

Here’s a deep, dark secret about human nature. No one will ever tell you this because it’s taboo, but we all know it’s true: people are herd animals, too.

This is how you get stuff like Nazism. People in Weimar Germany were upset, so they wanted a strong leader. Hitler came in and told everyone what they wanted to hear, showed strength, and sold everyone on his weird ideology. Then everyone started copying Hitler, buying into what Hitler wanted to do, and the Third Reich was born.

After World War II, the psychologist Stanley Milgram wanted to understand why ordinary people would follow a leader as far as the Germans had followed Hitler. So he put on a lab coat to make himself look credible, and he told ordinary volunteers to give a stranger what they believed were life-threatening electric shocks. Over half of them did it.

Milgram’s point was that a whole lot of people will do whatever authority figures tell them to do. In other words, humans are herd animals.

That’s why most people try to dress like their favorite celebrities instead of developing their own style. When clothing companies want people to wear their clothes, they pay somebody famous to wear them. Some celebrities cut out the middleman and make their own clothes.

This is also why most businesses just try to copy their competitors. Android is basically a rip-off of the iPhone: the user interface is almost exactly the same. Tesla added a touchscreen to its cars, and now every car company is adding touchscreens to theirs. TikTok became more popular than Instagram, and now Instagram is trying to become a video platform.

The sad truth is, independent thinkers aren’t that common in our world.

Who’s gonna get fired?

Remember the story about accountants, from the beginning of this article?

Some accountants are switching careers. But other accountants are charging even more than they used to — and getting away with it.

Why? Because they don’t call themselves “accountants” anymore. Instead, they brand themselves as “fractional CFOs”.

That means they don’t just keep your books and do your taxes. They also consult with you on business problems. Instead of just tallying your income and spending, they tell you what spending you should cut — and what income you should try to beef up.

In other words, they do more than QuickBooks does. They’ve become independent, creative thinkers who can give useful business advice.

Right now there’s a class of people who get paid to summarize information and copy others. If you’re an assistant to a congressman, your job is to read through big, dense bills and then summarize them. If you’re an SEO intern, your job is to research topics on the internet and then vomit everything you learned into SEO-friendly articles.

These people make money by following directions and doing what their bosses tell them to do. Whenever they’re in doubt, they’re supposed to do what other people in their field do. They’re not supposed to have new ideas.

A lot of those people will lose their jobs. ChatGPT can follow basic orders too, and it’s a lot less expensive than hiring a full-time employee. Why would you pay a human to do in a day what an algorithm can do in less than a second?

But the experts — the people who bring new ideas to their field, and who don’t follow the herd — will not get replaced.

So, which one are you?

Here’s a test you can use to help figure that out. Try to tell the difference between something a human made and something an AI made.

You can see an example of this in the recent chess cheating scandal. Hans Niemann was accused of cheating in part because his play looked fishy: the argument was that some of his moves were ones no human would find, only an engine would.

And it wasn’t the tournament organizers who sounded the alarm. It was Magnus Carlsen, Niemann’s opponent and the strongest player alive. Because Magnus understands chess so deeply, he’s exactly the kind of person who can tell when a move doesn’t look human.

Similarly, if you’re gonna tell the difference between a human with original ideas and an AI, you have to be able to spot an original idea. And if you can’t spot an original idea, you probably can’t have an original idea.

Let’s start right now. Who wrote this article — me, or ChatGPT? And if ChatGPT played a role in writing this article, how much? (The answer is at the end of the article.)

ChatGPT and society

For about a million years, human beings lived by hunting and gathering. Everything in their lives revolved around it. They learned all the hunter-gatherer skills they needed from a young age, and they got really good at them.

Then, about 10,000 years ago, everything changed. Some people in the Fertile Crescent figured out they could put seeds in the ground, water them a bit, and eventually turn them into bread.

After that, the early Mesopotamians retrained themselves. Instead of learning to hunt mammoths from a young age, or to tell the difference between poisonous mushrooms and delicious ones, they learned how to work the fields and the best ways to cultivate wheat.

After a few generations, the Mesopotamians were great farmers, but they weren’t so good at hunting and gathering anymore.

Then, a couple of hundred years ago, some guys in Britain came up with a thing called the steam engine. Once that happened, farming was out, and factory work was in. Within a generation or two, people had retrained themselves to work in factories en masse.

In the near future, you’re gonna have to have independent thoughts. You’re going to have to see the world in ways that a machine can’t, which means not just summarizing information but making sense of information.

If you’re afraid that you’re not an original thinker, there’s good news. The human brain is incredibly plastic. If you play lots of chess growing up, for example, your brain adapts so you get better and better at thinking through chess problems.

The same thing applies to creativity — probably. Creativity and independent thinking can be taught. I think.

So my prediction is this: the same way people in early Mesopotamia realized they needed to teach their kids farming instead of hunting, and the same way people in 1800s Great Britain realized they needed to teach their kids factory work instead of farming, people today will realize that they need to teach their kids to think for themselves. And 60 years from now, the world will be full of independent thinkers doing all kinds of awesome stuff.

This might not happen right away. The shift to a creativity economy will probably take a generation or two, because governments and big corporations tend to move pretty slowly. They’ll be hesitant to fire their human employees and replace them with AIs until they absolutely have to.

But eventually, every single human being on Earth will need to be able to think creatively. Otherwise, they won’t be useful.

How to think for yourself.

I like to finish my articles by giving my readers some advice on how to improve themselves, based on the message of the article.

Unfortunately, I haven’t developed a training program for creativity. And I don’t know of anyone who has.

But here’s what I would do if I were trying to make myself an independent thinker from scratch: ask myself questions I don’t know how to answer.

Think about philosophical questions. Write short stories. Create strategies for solving difficult problems. Think about big-picture stuff, like “how will ChatGPT change the world?”, and try to come up with answers.

Spend 30 minutes a day thinking about this stuff. Analyze your thought process. Ask yourself if you’re asking the right questions.

Do it until your brain hurts. This will exercise your creative muscles, the same way crunches exercise your abs.

Don’t copy what anyone else says about these questions. Challenge your brain to have original thoughts.

(By the way, in case it isn’t clear, I wrote this article without any help from ChatGPT.)

Hi! My name’s Theo and I post weekly articles about whatever was on my mind the week before — usually related to social science, human nature, the inner workings of society, and long-term trends in the world.

If that’s your thing, and you enjoyed this article, you might like some of my other posts, too. You can find those by scrolling down and/or clicking on my profile.

Or, if you wanna see my stories in your feed, hit the big green “follow” button — and these posts will start popping up when you visit the Medium homepage.

What do you think is gonna happen with ChatGPT? Is it gonna kill us all? Is it gonna be obsolete in a year? Feel free to share your thoughts in the comments. I read them all and if you say something interesting, you might change my perspective.

Happy trails!
