Why I don’t use ChatGPT anymore

Kenny Wolf · Published in Geek Talk · 5 min read · Apr 25, 2024

I work as a Junior Software Engineer and came into IT as a career changer.

For this reason, I'm now pursuing a part-time bachelor's degree in computer science to build up a solid foundation in IT.

For the last few months, I had been using ChatGPT in both areas to support (or even replace) me.

But some time ago I decided to stop using ChatGPT, and what I realized was frightening. In this article, I share my learnings and my reasons why I don’t want to use ChatGPT or other AI tools much (at least for now).

Rejection

When ChatGPT first came out and the hype was huge, I was a bit skeptical.

Many people thought it would be the death knell for software developers. I kept my distance and observed the whole thing from the sidelines. After a while, I saw what ChatGPT was capable of, and what it wasn't.

Creating a code snippet for a regex went quite well.
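To give a concrete example of what I mean, here is a minimal sketch of the kind of "little thing" it handled well: a pattern that matches an ISO date. This is my own reconstruction in C using POSIX regex, not the exact snippet I asked for back then; the pattern and test string are just for illustration.

```c
/* Illustrative sketch: compile a small regex and test a string against it.
   Pattern and input are my own example, not the original prompt output. */
#include <regex.h>
#include <stdio.h>

int main(void)
{
    regex_t re;
    const char *pattern = "^[0-9]{4}-[0-9]{2}-[0-9]{2}$"; /* YYYY-MM-DD */

    if (regcomp(&re, pattern, REG_EXTENDED | REG_NOSUB) != 0) {
        fprintf(stderr, "failed to compile regex\n");
        return 1;
    }

    const char *input = "2024-04-25";
    printf("%s %s\n", input,
           regexec(&re, input, 0, NULL, 0) == 0 ? "matches" : "does not match");

    regfree(&re);
    return 0;
}
```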

But integrating the Giphy API with a shuffle mode (like in Slack) into a CMS backend for editors? No chance. It's also clear why: like any other model, it has to learn from existing data. In other words, an AI can only "create" things it has already seen in some form.

And in my case, there wasn’t a tutorial, blog post, Stack Overflow entry or any other reference to my use case on the internet (the developer documentation from Giphy was poor at the time).

So I didn’t think much of it and didn’t incorporate it any further into my everyday life as a software developer or student.

Recognition

ChatGPT was good for the little things like regex.

Not just for regex, but for other things too. When I started my studies, I struggled with a few topics (assembly and C programming). When I was desperate, I opened the OpenAI website and asked how a hex dump in C works.
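To show what that question actually looked like, here is a minimal sketch of the kind of hex dump program I was trying to understand: read a file and print each byte as hex plus printable ASCII. It's my own reconstruction, not the code ChatGPT gave me; the formatting choices (16 bytes per row, offsets, an ASCII column) are just one common way to do it.

```c
/* Minimal hex dump sketch: offset, 16 hex bytes per line, ASCII column. */
#include <stdio.h>
#include <ctype.h>

int main(int argc, char *argv[])
{
    if (argc < 2) {
        fprintf(stderr, "usage: %s <file>\n", argv[0]);
        return 1;
    }

    FILE *fp = fopen(argv[1], "rb");
    if (!fp) {
        perror("fopen");
        return 1;
    }

    unsigned char buf[16];
    size_t n, offset = 0;

    while ((n = fread(buf, 1, sizeof buf, fp)) > 0) {
        printf("%08zx  ", offset);
        for (size_t i = 0; i < 16; i++) {
            if (i < n)
                printf("%02x ", buf[i]);
            else
                printf("   ");            /* pad the last, shorter line */
        }
        printf(" |");
        for (size_t i = 0; i < n; i++)
            putchar(isprint(buf[i]) ? buf[i] : '.');
        printf("|\n");
        offset += n;
    }

    fclose(fp);
    return 0;
}
```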

Like a gateway drug, this was the starting signal for an unhealthy relationship.

At first I used it more as a better Google, to get things explained to me. But over time I let it generate the solutions themselves, so that I only had to customize them. Eventually I got to the point where I was throwing every task into the prompt, hoping the ideal solution would come out and I would have little to no customizing left to do.

At the time, I didn’t realize how dependent I was on it and what it was doing to me.

Dependence

This assistant has helped me a lot.

But it also caused me a lot of frustration: countless prompt attempts with outputs that were far from the solution. I spent so much time on it, and in the end the solution was still wrong.

The worst thing about it all was that I didn’t really know what I was doing.

As I said, I'm still very new to IT (nearly two years). I lack a lot of the knowledge and experience that I'm only now building up. Then I came across the video "Why I'm no longer using Copilot" from "dreams of code", and I finally understood what was going on with me.

Realization

In the beginning, I used GPT to fill my knowledge gaps.

But then it became more and more a “replacement” for my brain power. I realized how dependent I was and that I had switched off my brain. So I decided to stop using GPT for a while.

What happened was very interesting.

As already mentioned, my brain switched off more and more and went into passive mode.

I also realized that the work I was delivering was okayish at best. This was not what I wanted to deliver, because I want to become a competent software engineer.

What I also realized is that it took away the “eureka” moments.

There's no better feeling than working on a problem for a long time and finally solving it yourself. But I no longer had that feeling. It was more like ticking off a task. And when I finally got there after many prompt attempts, I was simply glad to have a solution.

I didn’t learn anything from the task.

I also (unconsciously) no longer wanted to learn; I just wanted to get the tasks done. But that's the wrong approach. Both at work and at university, I'm there to learn and grow. And there is simply no shortcut to becoming a good engineer.

Zombies

I saw this "zombie effect" the most at university.

Those who relied on AI tools had, to a certain extent, stunted their own thinking. Independent and critical thinking is difficult in passive mode.

Recently, I received a code snippet from a fellow student for a collaborative project. I wasn’t quite sure about one part and asked what exactly it did.

He opened the OpenAI window and read me word for word what GPT had written…

What about Copilot?

I have been working as a software developer for almost two years now (8 months internship, then employment).

In all this time, I have not used tools like Copilot to help me write code directly. At the very beginning, when I had only been a trainee for a few weeks, I saw one of my seniors using Copilot. He was just testing how good it was.

I saw how good the autocompletion was and wanted to set it up myself.

But my senior talked me out of it (luckily!). He said I was still too inexperienced and that Copilot could teach me bad habits, as it was still prone to errors.

He also said: you should only use these AI tools once you know what you're doing.

A few months later, a trainee started with us who used Copilot. Whenever I helped the new trainee with something, we worked in pair programming style. And there was one scene I’ll never forget.

I told him to give Approach X a try to solve a problem. He started writing something in the IDE, waited for Copilot to suggest something, and hit Tab to accept the suggestion. I told him that wasn't what we should be doing.

He deleted the whole thing, started writing something else… waited and tabbed again.

Again I said that we didn't need that and had to do it differently. He was completely lost, tried writing something else, and waited for the autocompletion again.

Luckily I listened to my senior.

Summary

Don’t get me wrong. I’m not demonizing all AI tools.

I firmly believe that AI tools can increasingly support us in our everyday lives so that we can get more done.

But I’m still at the beginning of my IT career. And at this stage, it’s important that I find things out for myself and learn to think like an engineer.

That’s why I personally won’t be using AI tools until I have a certain level of seniority and know what I’m doing (some would say you’ll never know what you’re doing).

Until then, I will only use AI tools as a better Google, like Leo, which is already integrated into the Brave browser.
