America Isn’t Out Of Ideas. In Fact, A New Era Of Innovation Is About To Begin
Is America out of ideas? Greg Ip of The Wall Street Journal seems to think so. In a recent article, he points out, correctly, that total factor productivity growth has “steadily fallen” since its peak in the 1950s and '60s. According to Ip, rising research costs and greater regulatory burdens have reduced our ability to innovate.
“Outside of personal technology, improvements in everyday life have been incremental, not revolutionary,” he writes. “Houses, appliances and cars look much like they did a generation ago. Airplanes fly no faster than in the 1960s. None of the 20 most-prescribed drugs in the U.S. came to market in the past decade.”
This is not a new argument. In fact, economist Robert Gordon makes many of the same points in his book, The Rise and Fall of American Growth. Still, while the issues that both Ip and Gordon raise are very real, they are only part of the story. Innovation is not about the past but the future, and we may very well be entering a new era of accelerated innovation.
Innovation Is Never A Single Event
We tend to see innovations as events. Thomas Edison invents the light bulb. Alexander Fleming discovers penicillin. Steve Jobs launches the Macintosh. A single spark leads to transformational change and our lives are improved. However, that’s not how innovation really happens.
To see why, let’s look at each of those examples in turn. Edison built his first power plant in 1882, but electricity didn’t begin to affect productivity until the 1920s. Alexander Fleming discovered penicillin in 1928, but the drug didn’t become commercially available until 1945. The ideas that led to the Macintosh were first presented in 1968, but computers didn’t impact productivity growth until the late 1990s.
The truth is that innovation is never a single event. First, scientists need to discover new phenomena, like Fleming’s early work with antibiotics. Then those phenomena need to be engineered into a viable solution to an important problem. Finally, the new technology needs to be adopted by the marketplace. This process of discovery, engineering and transformation usually takes about 30 years.
And that’s the problem with the arguments of both Ip and Gordon: they focus their analysis only on the final stage of transformation, when the process is already underway but hasn’t yet begun to show up in economic statistics. If we really want to understand what lies ahead, we need to start with discovery.
How America Became Exceptional
America wasn’t always a technological superpower. In fact, at the turn of the 20th century, the United States was largely a scientific backwater. The dearth of knowledge was so great that promising young scientists often went to Europe to complete their doctorates.
Three factors turned the tide. First, the economic boom of the 1920s led to new endowments for scientific institutions, such as the Institute for Advanced Study in Princeton. Second, the rise of fascism in the 1930s created a wave of immigration that included many top scientists. Third, Vannevar Bush published his report, Science, The Endless Frontier, which led to government funding of basic research. That’s how America became exceptional.
As Bush explained in the report:
Basic research leads to new knowledge. It provides scientific capital. It creates the fund from which the practical applications of knowledge must be drawn. New products and new processes do not appear full-grown. They are founded on new principles and new conceptions, which in turn are painstakingly developed by research in the purest realms of science.
Take a look at any significant invention today and much of the core technology began with a government grant. Early computers were funded by the military. DARPA created the Internet. Most blockbuster medicines are the result of research performed at the NIH. The shale gas boom came from a program at the Department of Energy. Google itself grew out of research funded by a National Science Foundation grant.
So while economic metrics like total factor productivity are important, they can also be very misleading. To understand the whole picture, you also need to look at nascent technologies and evaluate their potential impact.
The Next Wave: Genomics, Nanotechnology and Robotics
Today, three emerging technologies have the potential for profound impact over the next decade: genomics, nanotechnology and robotics. Each is transformational on its own, but taken together they may very well lead to a new era of prosperity, much as electricity and the internal combustion engine did in the last century.
To understand the impact that these technologies can have, consider the case of solar energy, which relies on nanotechnology. Since 2009, the price of solar panels has dropped by 70%. That’s made them competitive with fossil fuels, but not transformative. Now consider that solar costs fall by about 20% for every doubling of volume and you can see the potential for the future.
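The compounding behind that learning-curve claim is easy to miss in prose. Here is a minimal sketch of the arithmetic, assuming a constant 20% cost decline per doubling of cumulative volume; the starting price of 1.0 is a normalized, illustrative figure, not a number from the article.

```python
def projected_price(start_price: float, doublings: int,
                    learning_rate: float = 0.20) -> float:
    """Price after a given number of volume doublings,
    assuming costs fall by `learning_rate` with each doubling."""
    return start_price * (1 - learning_rate) ** doublings

# Starting from a normalized price of 1.0, four doublings of
# cumulative volume cut the price by nearly 60%:
for d in range(5):
    print(d, round(projected_price(1.0, d), 3))
```

Because the decline compounds, a technology that looks merely competitive today can become dramatically cheaper after a few more doublings, which is the point the paragraph above is making.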
And that’s the fundamental flaw in Ip’s analysis. You can’t judge the future by looking at the innovations of the past. Yes, conventional cancer therapies like chemotherapy and radiation aren’t improving very fast, but cancer immunotherapy has become a “fourth pillar of treatment.” Moore’s law is indeed slowing down, but new architectures such as quantum computing and neuromorphic chips have the potential to be thousands of times more powerful.
So the problem is not necessarily that we don’t have enough innovation, but that we need a plan to maximize the transformational impacts of the innovations we create.
Getting From Innovation To Transformation
In his article, Ip cites two primary causes for the lull in productivity: the rising cost of research and increased regulation. Neither rings true. Research costs may indeed be rising, but, as he also notes, the rate at which patents are granted is accelerating. And while regulation does have the potential to slow innovation, sluggish productivity is a global phenomenon, so it’s hard to see how US legislation can be to blame.
What we need is to refocus our energies on solving grand challenges, rather than merely disrupting markets. To start, we should restore government funding of basic science to where it was in the early 1970s. This would cost less than 0.5% of GDP, but it can have outsized impacts. To take just one example, the $3.8 billion invested in the Human Genome Project had generated nearly $800 billion of economic activity as of 2011.
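It is worth making the scale of that return explicit. A quick back-of-the-envelope calculation, using only the two figures cited above, gives the multiple on the original investment:

```python
# Figures cited above: $3.8B invested in the Human Genome Project,
# roughly $800B of economic activity generated as of 2011.
investment_bn = 3.8
activity_bn = 800.0

# Economic activity generated per dollar invested:
multiple = activity_bn / investment_bn
print(f"~${multiple:.0f} of activity per dollar invested")
```

That works out to roughly $210 of economic activity for every dollar of public investment, which is the sense in which basic-research funding can have outsized impacts.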
Another promising avenue is to better integrate public and private efforts through consortiums, such as the JCESR program at Argonne National Laboratory, which is building next-generation batteries, and the National Network for Manufacturing Innovation, which is setting up manufacturing hubs around the country. We also need more efforts like the Innovation Corps program, which trains government grantees in lean startup methods.
Clearly, America is not out of ideas. In fact, we may very well be on the brink of a new era of innovation and prosperity. What we really need is more ideas about how to translate our scientific prowess into tangible economic impacts.
An earlier version of this article first appeared on Inc.com.