Nate Silver: The Productivity Paradox

We face danger whenever information growth outpaces our understanding of how to process it. The last forty years of human history imply that it can still take a long time to translate information into useful knowledge, and that if we are not careful, we may take a step back in the meantime.

The term “information age” is not particularly new. It started to come into more widespread use in the late 1970s. The related term “computer age” was used earlier still, starting in about 1970. It was at around this time that computers began to be used more commonly in laboratories and academic settings, even if they had not yet become common as home appliances. This time it did not take three hundred years before the growth in information technology began to produce tangible benefits to human society. But it did take fifteen to twenty.

The 1970s were the high point for “vast amounts of theory applied to extremely small amounts of data,” as Paul Krugman put it to me. We had begun to use computers to produce models of the world, but it took us some time to recognize how crude and assumption-laden they were, and that the precision computers were capable of was no substitute for predictive accuracy. In fields ranging from economics to epidemiology, this was an era in which bold predictions were made, and just as often failed. In 1971, for instance, it was claimed that we would be able to predict earthquakes within a decade, a problem that we are no closer to solving forty years later.

Instead, the computer boom of the 1970s and 1980s produced a temporary decline in economic and scientific productivity. Economists termed this the productivity paradox. “You can see the computer age everywhere but in the productivity statistics,” wrote the economist Robert Solow in 1987. The United States experienced four distinct recessions between 1969 and 1982. The late 1980s were a stronger period for our economy, but less so for countries elsewhere in the world.

Scientific progress is harder to measure than economic progress. But one mark of it is the number of patents produced, especially relative to the investment in research and development. If it has become cheaper to produce a new invention, this suggests that we are using our information wisely and are forging it into knowledge. If it is becoming more expensive, this suggests that we are mistaking noise for signal and wasting our time on false leads.

In the 1960s the United States spent about $1.5 million (adjusted for inflation) per patent application by an American inventor. That figure rose rather than fell at the dawn of the information age, however, doubling to a peak of about $3 million in 1986.
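The metric behind these figures is simple: inflation-adjusted R&D spending divided by the number of patent applications. A minimal sketch of that calculation follows; the spending and application counts below are illustrative placeholders chosen to reproduce the per-patent figures in the text, not the actual NSF or USPTO series.

```python
def cost_per_patent(rd_spending_dollars: float, patent_applications: int) -> float:
    """Inflation-adjusted R&D spending divided by patent applications."""
    return rd_spending_dollars / patent_applications

# Hypothetical snapshots (spending in inflation-adjusted dollars),
# scaled to match the roughly $1.5M -> $3M per-application trajectory:
snapshots = {
    "1960s": (150e9, 100_000),
    "1986":  (300e9, 100_000),
}
for period, (spending, applications) in snapshots.items():
    millions = cost_per_patent(spending, applications) / 1e6
    print(f"{period}: ${millions:.1f}M per patent application")
```

A rising ratio, as in the 1970s and 1980s, is what the text reads as a sign of research effort being spent on false leads.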

As we came to more realistic views of what that new technology could accomplish for us, our research productivity began to improve again in the 1990s. We wandered up fewer blind alleys; computers began to improve our everyday lives and help our economy. Stories of prediction are often those of long-term progress but short-term regress. Many things that seem predictable over the long run foil our best-laid plans in the meanwhile.

Notes and Bibliography

—Google Books Ngram Viewer.
—Susan Hough, Predicting the Unpredictable: The Tumultuous Science of Earthquake Prediction (Princeton: Princeton University Press, Kindle edition, 2009), locations 862–869.
—Robert M. Solow, “We’d Better Watch Out,” New York Times Book Review, July 12, 1987.
—“U.S. Business Cycle Expansions and Contractions,” National Bureau of Economic Research.
—I use the number of patent applications rather than patent grants for this metric because patent grants can be slowed by bureaucratic backlog. One of the few genuinely bipartisan accomplishments of the 112th Congress was the passage of the America Invents Act in September 2011, which cleared the Senate by an 89–9 majority and sped patent applications.
—For figures on U.S. research and development spending, see “U.S. and International Research and Development: Funds and Alliances,” National Science Foundation.
—For patent applications, see “U.S. Patent Statistics Chart Calendar Years 1963–2011,” U.S. Patent and Trademark Office.


Nate Silver is the author of The Signal and the Noise: Why So Many Predictions Fail—but Some Don’t.