Elon Musk, CEO of Tesla, has called artificial intelligence (AI) an existential threat, citing it as the most likely cause of a third world war.
And yet, while automation has always demanded social and economic adjustment to the disruption it brings, technological change has been the dominant driver of growth within and across countries throughout history.
The Road So Far
Economic growth during the post-WWII era, especially after the 1960s, surged because of discoveries in information technology (IT), such as the internet and personal computers. These advances have fundamentally altered organizational structures, firm and worker productivity, and innovation. However, the rise of IT has also amplified inequality.
The dominant economic paradigm for understanding these trends is the “skill-biased technical change” framework: the idea that technological developments raise the relative compensation of skilled workers. For example, the fraction of individuals earning a college degree has grown from 15% in 1970 to over 35% by 2015.
Over the same period, the wedge in hourly wages between college and non-college workers has grown from 56% to nearly 200%. Indeed, increasing computerization is rapidly making many jobs obsolete, forcing displaced workers to find new careers, retrain, or both.
Increasing automation has hit labor force participation among non-college workers the hardest, even prompting many entry-level jobs to require a college degree simply as a way of screening candidates. But college-educated workers have also seen their relative earnings growth slow.
In my own research, I find that only those who hold a college degree and work in IT-intensive occupations have continued to experience sustained growth in their earnings and employment opportunities.
In fact, even John Cryan, CEO of Deutsche Bank, has commented that bankers are at risk.
Learning in Perpetuity
Despite the surge in automation, “soft skills” have become more important, not less.
Employment growth has been strongest in occupations with high demands for both social and cognitive skills.
As technological requirements grow, mastering not only technical content but also communication and leadership skills will become a necessity for all workers, at least to some degree. In fact, Kassey Vilches recently wrote about the growing importance of learning through “deep work” (a term coined by Cal Newport) in the emerging economy.
For example, given the nature of product innovation, a software engineer cannot write code that is disconnected from organizational aims, nor independently of the brand and content that marketing is trying to promote in pursuit of those aims.
In this sense, learning is no longer a luxury but a necessity.
Individuals must increasingly become expert learners, launching into new and uncertain situations with a genuine desire to learn and problem-solve, in order to remain competitive.
While technology will continue making some tasks obsolete, and the composition of tasks in the economy will evolve over time, people will not become obsolete if (and only if) they continue to learn and upskill.
Indeed, throughout every period of technological discovery, ingenuity and hard work have paid dividends, both financially and socially, and that payoff is precisely what creates the incentive for further discoveries in the feedback process we call “the economy”.
Becoming Indispensable in Spite of AI
How are you going to make yourself valuable so that increasing automation does not displace your job?
While I cannot speak for you, let me share two examples from my own world.
First, my responsibility as a researcher is to produce not only publications that are marketable to a (possibly narrow) audience, but also ideas that have the potential to positively impact individuals and organizations.
The beauty of such a role is that it demands both creativity and logistical efficiency to execute an idea from start to finish. Details matter, but the story also matters.
AI is changing the way that researchers develop and test hypotheses. For example, scientists have already programmed software that can analyze data, test hypotheses, and draw conclusions from start to finish. While such a story might frighten some researchers, these full-cycle discoveries remain modest in scale, and hypothesis development and creativity remain essential to the research process.
In fact, AI has a lot to offer towards the social science agenda of causal inference and program evaluation.
Researchers often make ad hoc assumptions, such as filling in missing data with an average or imposing parametric restrictions to estimate a counterfactual. AI offers an opportunity to train algorithms, guided by data and human judgment, to reliably extract and flexibly learn from the relevant features of the data.
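To make the contrast concrete, here is a minimal sketch in plain Python (all names and numbers are invented for illustration) comparing the ad hoc approach, filling a missing wage with the overall mean, against a simple model-based fill that exploits a correlated feature:

```python
def mean_impute(values):
    """Ad hoc fix: replace every missing entry with the overall mean."""
    observed = [v for v in values if v is not None]
    mean = sum(observed) / len(observed)
    return [mean if v is None else v for v in values]

def regression_impute(x, y):
    """Model-based fix: fill missing y using a least-squares fit on a
    correlated feature x, estimated from the complete cases."""
    pairs = [(a, b) for a, b in zip(x, y) if b is not None]
    n = len(pairs)
    mx = sum(a for a, _ in pairs) / n
    my = sum(b for _, b in pairs) / n
    slope = (sum((a - mx) * (b - my) for a, b in pairs)
             / sum((a - mx) ** 2 for a, _ in pairs))
    intercept = my - slope * mx
    return [intercept + slope * a if b is None else b
            for a, b in zip(x, y)]

# Toy data: years of schooling and hourly wages, one wage unobserved.
education = [12, 14, 16, 16, 18]
wages = [30, 38, None, 46, 54]

print(mean_impute(wages))                    # [30, 38, 42.0, 46, 54]
print(regression_impute(education, wages))   # [30, 38, 46.0, 46, 54]
```

A learned imputation exploits structure in the data (here, the education-wage relationship) rather than flattening every gap to the same average; production work would use far richer models, but the principle is the same.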
Second, my responsibility as an academic is not only to convey material, but also to invest in and multiply the talents of those around me.
Faculty have an incredible privilege to interact with a wide audience, ranging from students to practitioners to policymakers, and to instruct from a position of mutual trust and respect. To make every interaction count, faculty can play a unique role in multiplying the talents of others by empowering them to make decisions and grow their human capital. Liz Wiseman calls these leaders “multipliers”: people who draw together talent to make everyone better.
Technology is transforming the way that faculty can interact with students and track their progress. For example, using state-of-the-art tools, Arizona State University (ASU) has deployed EdPlus, arguably the largest accredited online learning degree platform, for instructing learners across a variety of disciplines.
Georgia State University and ASU are using these platforms and algorithms not only to track mistakes and adjust the curriculum, giving students additional support in areas that need more practice, but also to continuously assess how students are doing overall and pinpoint those who might be heading for trouble, so that educators can intervene with help when necessary.
The development of online learning platforms has forced educational institutions to think more critically about how they plan to add value to each student’s life and convey material more efficiently and effectively.
What Will You Do About it?
The response to AI is not to retreat, but to learn more. While I’ve laid out two examples of how I plan to leverage AI, how you do so in your line of work over the course of your career will inevitably differ.
If you liked this post, follow me here and give a clap! Feel free to check out my academic webpage at www.christosmakridis.com.