The fear of AI taking your job: an alternate point of view.

Over the last six months, I have gone through a spectrum of emotions on how AI might impact me as a professional software engineer.

In our industry, there are clear and undeniable ways that AI is now affecting how we work.

Many others write about which tools are impactful; that is not the focus of this article. For this article's sake, let's say that 95% of these tools are simply a thin software layer on top of various OpenAI APIs, used for tasks like writing unit tests or building parts of a UI.

On my new podcast, "Stackability Ai," we discuss what a hype cycle is, and I explain how I have used a framework from that episode to reframe my thinking and turn fear into inspiration.

Let's zoom in on this one use case and explore what it would take for AI to profoundly impact the livelihood of all software engineers. But first, we need to understand the current state of general-purpose AI.

Enter generative pre-trained transformers (GPT)

GPT architecture enables a smart auto-completion mechanism based on statistical patterns learned from training data. As you provide input, it predicts the next token in the sequence with exceptional accuracy.
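To make the next-token idea concrete, here is a deliberately tiny sketch. This is my own toy example, not how a real transformer works: a GPT learns its probabilities with attention layers over billions of tokens, whereas this just counts word pairs in a ten-word corpus and picks the statistically most likely follower.

```python
from collections import Counter, defaultdict

# A tiny "training set" of tokens (here, whole words for simplicity).
corpus = "the cat sat on the mat and the cat slept".split()

# Count how often each word follows each other word (bigram statistics).
bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def predict_next(word):
    """Return the most statistically likely next word, or None if unseen."""
    followers = bigrams[word]
    return followers.most_common(1)[0][0] if followers else None

print(predict_next("the"))  # "cat" follows "the" twice, "mat" only once
```

The point of the toy: the model has no understanding of cats or mats, only counts. Scale the same idea up by many orders of magnitude, with far richer context than a single previous word, and you get the "smart auto-completion" described above.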

This is an important concept, because you are now armed with the basics of the technology that is supposedly coming for your job.

Next: at what point would this AI be able to do my job? For a software engineer, that could mean any of the following, among many examples:

  1. Successfully architect and/or build an entire full-stack greenfield project via requirements of varying detail.
  2. Maintain an existing large production system composed of legacy code and ensure that the system still works without regressions once put into production.
  3. Maintain/build a reliable release pipeline and any other parts of my infrastructure responsible for testing, building, and shipping my code.
  4. Ensure that no private data is compromised or makes its way into the wrong hands.

There are hundreds of other functions in a job in this field, but I paint in broad strokes to make my point to a general audience.

What value can generative AI provide for software engineers?

Here are a few examples:

  1. It's an excellent co-pilot. It can write my unit tests and small parts of my code for me (UI components, SQL statements, parts of a service layer, etc.), thus accelerating my project's development.
  2. It can analyze parts of a codebase, summarize them for me, and document the changes along the way.
  3. It can help me learn new things. Because I already know the fundamentals of what I am doing, bite-sized examples of other languages or concepts come easier, and I can use them immediately.

What generative AI cannot do for engineers

  1. As of this writing, it cannot manage a development team.
  2. As of this writing, it cannot produce an entire complex system with accuracy (however, AutoGPT is trying, and it is very cool indeed).
  3. As of this writing, it cannot produce anything without minor hallucinations along the way. This goes for text, images, and code.

Now that you have a very basic mental model of the current state of general-purpose AI and some examples of day-to-day functions it could do, we can get into the meat of my thinking.

Until some AI can legitimately think for itself and (here is the kicker…) self-replicate, learn from its shortcomings, and reach the point of an intelligence explosion, our job killer is just a super-optimized, efficient chatbot.

Would you still be worried about this taking your job as a software engineer?

I might say yes under the following circumstances:

  1. You are not a constant self-learner, are very junior in your role, and have not yet had the chance to demonstrate ample value to your organization.
  2. You are limited to the stack your organization uses. It's more important than ever to be as full-stack as possible. You don't need to be an expert across the whole stack, but you should understand the fundamental concepts of every technology used within your team.
  3. You might not understand the "why" behind what your team is working on; therefore, you are increasingly disposable and risk being automated away.
  4. You are not honing your communication skills regularly and are not constantly working on mental models to communicate effectively within your team.

Again, the list can go on, but I hope the point is clear.

Conclusion

I would love to read this post a short time from now to compare it against the current state of affairs.

Until then, my point of view is this: until these systems are truly intelligent, are more than input/output machines, and can self-replicate and self-improve without human input, we will be okay from a job standpoint (at least for us engineers). I will analyze other industries in future posts.

That being said, some jobs will undeniably be replaced, as value and profit margins are the number one focus of business. But I don't believe it will be a bloodbath, and hey, it's fun to now have an excuse to keep stacking your skills!

Lastly, don't forget we still need many highly paid individuals to prop up our economy and spend money at scale to fuel everyday life, unless we move away from a capitalist society here in the US (which I am certainly not suggesting as a good idea).

This is a separate rabbit hole, but I don't think it would be good for humanity if white-collar jobs were suddenly replaced, and it's hard for me to see the powers that be disagreeing on this.

I hope you enjoyed this article. Before you go, I will leave you with a short YouTube talk I came across recently that visually illustrates some of these concepts and is fun to watch.

Disclaimer: No part of this article was written by AI besides the leading image above.
