AI is here to stay, and so is your job

Nick Partie
Published in b8125-fall2023
4 min read · Nov 19, 2023

“AI could replace equivalent of 300 million jobs” (BBC), “Will AI replace your job? New study reveals the professions most at-risk by 2030” (NBC), and “AI, Automation likely to destroy or change jobs faster than believed” (USA Today). All of these recent headlines reference the peril ahead as a result of advances in computing and generative AI. Since the release of OpenAI’s ChatGPT less than a year ago, you would be hard-pressed to open any newspaper that doesn’t make some reference to the advances (and often fears) swirling around generative AI. OpenAI’s revenue was just $28 million last year. According to Sam Altman at the launch of GPT-4 Turbo, OpenAI is now generating more than $100 million per month. One of its key GPU suppliers, NVIDIA, is expected to grow earnings 221.6% in the current fiscal year. Alphabet’s latest earnings call mentioned AI 50 times, followed by Meta with 49 references and Microsoft with 46. According to PwC, AI is predicted to contribute $15.7 trillion to the global economy by 2030.

Despite all this fear and hype, I am here to tell you that generative AI will not take over your job. More likely, someone who becomes generative-AI savvy will take it, or else government regulation, slower-than-expected adoption, and the historical precedent of technological advances creating more jobs than they destroy will keep you on the payroll.

To put this into historical context, it’s important to consider the advances in digitization over time.

1) Origins in Computing (20th Century):

- Early computing machines like ENIAC (1946) marked the beginning of a shift from analog to digital data processing.

2) Emergence of the Internet (1960s-1980s):

- The creation of the ARPANET in the 1960s, a precursor to the modern internet, played a crucial role in connecting computers and facilitating the exchange of digital information.

3) Personal Computing (1970s-1980s):

- The introduction of personal computers, such as the Apple II and IBM PC, brought computing power to individuals and businesses. This led to the creation of digital documents, databases, and software, marking a significant shift from paper-based systems.

4) Rise of the World Wide Web (1990s):

- The invention of the World Wide Web by Tim Berners-Lee in the early 1990s revolutionized the way information was accessed and shared.

5) E-commerce and Digital Business (1990s-2000s):

- The late 20th century witnessed the rise of e-commerce, with companies like Amazon and eBay pioneering online retail.

6) Mobile Revolution (2000s-Present):

- The proliferation of smartphones and mobile devices further accelerated digitization.

7) Cloud Computing (2000s-Present):

- The advent of cloud computing allowed for the storage and processing of vast amounts of data remotely. This enabled businesses and individuals to access resources and applications over the internet, reducing the reliance on local infrastructure.

8) Big Data and Analytics (2010s-Present):

- The increasing volume of digital data generated by various sources led to the emergence of big data analytics. Organizations began leveraging advanced analytics tools to extract valuable insights from large datasets, informing decision-making processes.

9) Industry 4.0 and Internet of Things (IoT):

- The concept of Industry 4.0 emphasizes the integration of digital technologies into manufacturing processes. IoT devices, interconnected through the internet, enable the collection and exchange of data in real-time, fostering automation and efficiency.

10) COVID-19 Pandemic (2020):

- The global pandemic underscored the importance of digital technologies in maintaining business continuity, enabling remote work, and ensuring access to essential services. It accelerated the adoption of digital tools and highlighted the resilience of digitized systems.

In the mid-20th century and beyond, as computers began to be integrated into workplaces, there was both excitement about the potential for increased efficiency and concern about job displacement. Here we are today, and the push for digitization is as alive as it has ever been, with continuous advancements in artificial intelligence, machine learning, and other emerging technologies shaping the digital landscape.

All this considered, it is rare for the U.S. Congress to align on just about anything these days, yet regulation of big tech appears to be one of those subjects. Recently, the Biden Administration released an Executive Order on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence. While the order may lack teeth, it shows how big a priority this is, and it will likely be the first of many to follow. Regulation of this kind will help ensure that rapid job displacement and unchecked levels of unemployment do not occur.

As for the pace of adoption, flashy technological advances reliably catch the eyes of market makers and publishers, yet just as reliably fall short of expectations. We have been five years away from autonomous vehicles becoming mainstream for the past 20 years. EVs are an even more extreme example of the slow adoption that often follows a major hype spike.

Finally, even if generative AI proliferates as quickly as forecasts suggest, unfettered by regulation, historical precedent suggests that yet another wave in the digitization timeline will ultimately result in more jobs, not fewer. The World Economic Forum estimates that by 2025, technology will create at least 12 million more jobs than it destroys, a sign that in the long run, automation will be a net positive for society. So while it is absolutely remarkable to see how quickly this technology is advancing, it is important to put it into the context of the historical digitization journey we have been on for quite some time now, and to recognize that societal checks often keep the negative externalities in check.
