Over the past five years, including while writing The Innovation Delusion with Andy Russell, I have been reading a number of works that paint a similar picture of technology and the economy in the United States. Yet these works often aren't in conversation with one another, so I decided to write this post, structured as seven interrelated theses, synthesizing the picture as I see it today in the hope that it will be useful to others.
1. The 1950s-1970s — An Extraordinary Time
Many current discussions of technological and economic change start from the period of rapid economic growth and technological development that took place between roughly 1950 and 1970. To some degree, this is unfair. As historian Marc Levinson put it in the title of his book on the period, it was An Extraordinary Time. Levinson's book is the best account we have so far of how multiple factors led to the exceptional increase in growth between World War II and the 1970s, and how multiple factors led to its decline into stagnation. The point, though, is that, with brief exceptions, economic and especially technology-centered productivity growth have been slow since about 1970.
2. “Innovation” Was Meant to Be Our Savior, but It Has Failed Us
In the 1960s, just as this boom period was nearing its end, economists and policymakers fixated on technological change, or "innovation," as the primary driver of economic growth. (Godin, Vinsel/Russell, Wisnioski) Since that time, we have heard more and more talk about "innovation," or what Andy Russell and I call innovation-speak.
But there is an irony. Robert Gordon and others argue that technological change, and thus technology-induced productivity growth, has actually been slower since 1970 than in the preceding period. (Gordon; for a helpful summary of this view, check out the EconTalk episode with Patrick Collison) The significant innovation that has happened since 1970 has been concentrated in a narrow band of technologies, especially Information and Communication Technologies, including computers, the Internet, and smartphones, and as we'll see, these technologies have not paid off economically in the way that many people predicted. Contrast these post-1970 changes with technologies introduced between 1870 and 1970: automobiles, mass production, electricity, telephones, aviation, modern sanitation, the modern chemical, pharmaceutical, and petroleum industries, electronics, and . . . computers.
Since 1970, there has been more innovation-speak than actual innovation. The chatter and promises have not delivered.
3. Meanwhile, Globalization — Especially the Rise of China — Has Crushed US Manufacturing
Meanwhile, globalization has eaten away at US manufacturing, particularly since Chinese export-focused production came online in the 1980s and 1990s. Economist David Autor and co-authors have given what they see as a conservative estimate that the so-called "China Shock" was responsible for 25% of the decline in US manufacturing employment between 1990 and 2007 and 45% of the decline between 2000 and 2007. Clearly, this shift hit some American communities harder than others. Others have argued that the ramp-up in global manufacturing capacity since World War II has led to a state of near-constant overproduction, cutting into profitability and growth internationally. (Brenner, Benanav) Nothing has arisen to replace manufacturing as a source of high-quality jobs, especially for people without college degrees.
The consequences of these changes for the US working class have been profound. Anne Case and Angus Deaton have demonstrated increased deaths from suicide, alcoholism, drug addiction, and other "deaths of despair" among non-college-educated whites. And if anything, deindustrialization has been harder on black communities, which always had a higher percentage of working-class members and which have faced persistent joblessness since the 1970s. (Wilson, Taylor, Jain and Hammond)
4. Current Technological Change Isn’t . . . Changing this Picture Much
We live in an era of extreme technology hype, especially about digital systems, but current technological change appears to be doing nothing to alter the basic economic picture I have outlined above. Most current change appears to be focused on a) entertainment and fucking around on the Internet (Facebook, Netflix, YouTube, etc.) and b) increasing the convenience of consumption (Amazon, DoorDash, etc.). Moreover, many celebrated digital firms, including Uber, Lyft, and DoorDash, skirt labor law and categorize workers as temporary contractors (as opposed to employees), further undermining job quality in the nation. (Hyman, Dubal)
Meanwhile, many of these companies — Uber again, Lyft again, DoorDash again, WeWork, the list goes on — are not profitable. Indeed, the profitability of startups under ten years old has declined since the 1980s. Technology scholar Jeffrey Funk has demonstrated that, of 130 "unicorn" startups (firms valued at more than $1 billion) in the USA in 2019, only 18% were profitable, as opposed to 80% of such firms in the 1980s. Today's firms are unprofitable for lots of reasons, but one of them is probably that the technologies they are offering simply aren't that revolutionary. Yet these hyped-up companies remain juiced by venture capital and by money from Saudi Arabia, China, Japan, and other foreign nations.
Furthermore, technologies that were recently promised to create significant economic change — most centrally "AI" and "robots" — are turning out to be puffery and bullshit. (I always put "AI" in scare quotes because it is often more about marketing than technological reality. As Yarden Katz and others have argued, companies, including Google, rebranded their efforts as "AI" when the term became hot around 2017–2018.) Examining 40 "AI" firms, Jeffrey Funk has estimated that it will take decades for them to have any marked effect on productivity by, for example, increasing the efficiency of offices. (See also Benanav) Recent reports predict that "AI" will not lead to significant near-term changes in employment. (Keystone, MIT) Some researchers suggest that we may be entering a new "AI Winter," a period of decreased funding in the area, or at least an "AI Autumn," as exuberance for the technology fades and expectations come back to earth. Moreover, Susan Houseman has shown that we have been overestimating the effect of automation on manufacturing employment for decades.
Claims that we are living in a "Second Machine Age" or a "Fourth Industrial Revolution" have turned out to be stale flatulence. There's no reason to believe that the near-future economy will be much different from the one we have today, or that current technological change will get us out of our current quagmire of lousy jobs and underemployment.
5. Also, Our Current Way of Doing Things Is Unsustainable
Add to all of this that our current ways of doing things are unsustainable. Our systems of production, transportation, and consumption spew too many greenhouse gases, speeding along global climate change, and also have other dire environmental consequences. Moreover, we have spent the last half century building infrastructure that we have not maintained and have no realistic plan to do so. (Marohn, Vinsel/Russell) And now the COVID-19 pandemic has done God knows what to the long-term health of the economy.
6. The Mainstream of Neither US Political Party Has Answers — In Fact, They Aren’t Even Trying
We can imagine various ways of responding to this overall picture, including retooling the economy and creating jobs programs in the name of environmental improvement (aka the “Green New Deal”) or . . . like . . . socialism in pursuit of a post-scarcity society. (Benanav) I am a lifelong Democratic Party voter, so I think policies like universal health care, increasing federal R&D funding for science and engineering, and making public university education cheap or even free would help.
But the simple reality is that neither of the two primary US political parties is really tackling the heart of the matter of our economic reality. Instead, we get business-as-usual platitudes about economic growth, jobs, and "innovation," even though our economic conditions explain at least part of the rise of both right- and left-wing populism. (I am nearly sent over the edge when I hear yet another person say that "education" is the primary solution to this set of problems. My wife also has to hide the pills if I hear anyone say that the answer is "startup incubators," "innovation campuses," or "entrepreneurship centers.") There's nothing on offer from the parties that really grapples with secular stagnation, persistent low wages, the evaporation of good jobs, and the fact that we aren't living in the mid-20th century anymore.
7. A Future
Reading science fiction is a shitty way to think about public policy. Indeed, Granger Morgan and David Keith have argued that detailed scenarios lead us to underestimate uncertainty and overestimate our confidence: when we buy into specific narratives about the future, we come to expect it to unfold in just that way.
But there's one novel that has stuck with me over the past five years as I've read the works cited above. Part of William Gibson's The Peripheral describes a near future in which people live in a constant state of unemployment and the only available jobs involve making illegal drugs. Sure, they have much better virtual reality systems than we have today, but — to simplify a bit — most of American society lives in a van down by the river. There is no hope for these people, just as there is no hope for many who live in our nation today.
I think about Gibson’s book when I examine and reexamine the picture I’ve drawn in this post and as I watch a pandemic devastate our economy. And I wonder to myself, what future do we want?