Brief Reflections on the 4th Industrial Revolution and the Future of Technological Progress

This post is by Joel Mokyr, the Robert H. Strotz Professor of Economics at Northwestern University

Knowledge and Progress

“Useful Knowledge,” as the concept was defined in the Age of Enlightenment, consists of two largely separable sets of knowledge: propositional knowledge (“knowledge of ‘what’”) and prescriptive knowledge (“knowledge of ‘how’”).¹ Propositional knowledge contains what eighteenth-century writers would call natural philosophy, and what we would call science, but it contains a lot more. Contrary both to those who argue that the Industrial Revolution before 1850 was more-or-less independent of any scientific knowledge and to those who have argued that it was largely driven by scientific knowledge, the relationship between the two varied from industry to industry, and it is better seen as one of co-evolution rather than causation. As I have argued in Mokyr (2002), it was neither the growth of science alone nor that of technology that was at the base of the Industrial Revolution and the origins of the “Great Enrichment,” but the subtle and complex interaction between them. The crude linear model, in which scientific progress drives technological progress and propositional knowledge drives prescriptive knowledge (technology), is no longer taken seriously. But this does not mean that science was unimportant in the cultural background of the Industrial Revolution (Mokyr, 2016, ch. 15).

Co-evolution means mutual and bi-directional effects. Technology stimulated and abetted science, and not just the other way around. It did so through two main channels.² First, inventions, even when made without much of a scientific understanding, focused scientists’ attention on a particular area by making them wonder how and why something worked, and thus helped to set the research agenda. Interestingly enough, a rough division of labor between Britain and the Continent emerged, in which British (and later American) mechanics and engineers discovered things that worked, and French theoreticians and German chemists uncovered the underlying science. The most famous example is the celebrated essay by Sadi Carnot ([1824], 1986) on steam power, which pioneered thermodynamics and was, by his own admission, inspired by watching a steam engine of a type pioneered by Arthur Woolf at work.³ Less well-known, but equally illustrative, is how the Wright brothers’ Kitty Hawk flight in 1903 inspired theoretical physics. Kitty Hawk showed that heavier-than-air flight was possible, and stimulated work that eventually crystallized into modern formal aerodynamics.⁴

Second, technology affected science through more powerful instrumentation. Human senses limit our ability to make very accurate measurements, to observe extremely small items, and to overcome optical and other sensory illusions. Moreover, the human mind has limited computational ability. Technology helps us overcome the limitations that evolution has placed on us, and learn of natural phenomena we were not meant to see or hear — what Derek Price (1984) has called “artificial revelation.” Much of the seventeenth-century Scientific Revolution was made possible by better instruments and tools: the great trio of the telescope, the microscope, and the barometer. In the eighteenth century, new equipment — such as Laplace’s calorimeter and Volta’s eudiometer — enabled the insights that created modern post-Lavoisierian chemistry. A combination of improved microscopy and better lab techniques made the discovery of germ theory possible, arguably one of the greatest advances in medicine of all time (Bracegirdle, 1993, pp. 112–114).

In the twentieth century, examples that demonstrate the impact of better instruments and scientific techniques multiply. One of the greatest heroes of modern science is X-ray crystallography. This technique has been central to uncovering the structure and function of many biological molecules, including vitamins, drugs, and proteins. Its most famous application was no doubt the discovery of the structure of the DNA molecule, but it has also been instrumental in twenty-nine other Nobel Prize-winning projects (International Union of Crystallography, 2017).

Needless to say, scientific progress was not the result of instruments alone. Institutions, alongside instruments, support the process of innovation. Two big hurdles have always impeded, and often thwarted, these processes. One is the well-known appropriability problem in knowledge creation. The other is the resistance of conservative elements and vested interests. European society created an institutional system that overcame these obstacles. How and why that system came about in early modern Europe is discussed at length in Mokyr (2016). But technological progress in a world of artisanal tinkering and learning-by-doing alone, without any insights from the sphere of formal, codified knowledge (theoretical or experimental), would eventually have run into diminishing returns. The economy would have settled into a stationary state, rather than entering the self-reinforcing dynamics of ever-accelerating technological progress.

What makes an Industrial Revolution?

The first Industrial Revolution (1760–1830) is and will always be “the” Industrial Revolution. It stands as a true watershed in human history, as the beginning of modern growth and the end of the Malthusian regime. While some episodes of growth undeniably occurred before, they tended to be slow, ephemeral, and reversible. The growth that started during the Industrial Revolution was fast, sustained, and turned out to be amazingly resilient. Yet there is also agreement that this was followed by a second Industrial Revolution, which started in the last third of the nineteenth century, and continued more-or-less until 1914. A third Industrial Revolution has been increasingly associated with the ICT advances of the 1980s: above all, personal computers, the internet, cellular phones, and satellite communications.

When discussing Industrial Revolutions, scholars have often singled out the appearance of one or a few new techniques that connect to hundreds of uses and create major technological externalities beyond their own sector. In the first Industrial Revolution, this so-called General Purpose Technology (GPT) was steam power. Steam power found uses in coal mines, transportation, manufacturing, milling, metalworking, and more. In the second Industrial Revolution, the GPT often singled out is electricity, along with mass-produced steel and interchangeable-parts engineering. The third Industrial Revolution depended on electronics, above all microprocessors, which have been widely regarded as a GPT similar to electricity. By that logic, most scholars seem to think that AI may be the most likely candidate to carry the 4th Industrial Revolution.

GPTs rarely consist of wholly new ideas. Rather, they are based on ideas that all of a sudden experience a critical breakthrough, opening new opportunities on a massive scale. Savery and Newcomen experimented with steam power in the early eighteenth century, but its widespread use became possible only with the improvements introduced by Watt and his colleagues. Electricity was a subject of much experimentation in the eighteenth and early nineteenth centuries. Many physicists slowly advanced their understanding of its laws, and used it for limited purposes (such as the telegraph). Steel had been known for many centuries, but not until the Bessemer process (1856) was its cost low enough for general use. Many of the economic effects of GPTs take years, even decades, to become fully operational. Their central roles are often recognized only in retrospect. This may well be the case for AI, which has been on our collective minds since the 1950s. However, only in recent years has AI become a potential economic game-changer: not just through its direct effect on production technology, but through its ability to generate new techniques and improve existing ones (Cockburn, Henderson and Stern, 2018).

Yet it could be misleading to focus on AI alone as the paradigmatic invention driving the 4th Industrial Revolution. Its potential to connect to a multitude of uses, from entertainment to medicine to manufacturing, seems very promising and possibly revolutionary. However, this seems difficult to predict at this point.⁵ Its most promising applications, besides entertainment, seem to be in transportation and medicine, yet it is far too early to assess its impact. On the other hand, as in the late nineteenth century, other advances in a host of areas — ranging from nanotechnology and genetic engineering, to manageable nuclear fusion and 3-D printing — may turn out to be just as epochal, or conceivably more so. Some of the more promising technological avenues may turn out to be dead ends or socially unacceptable, whereas others may be underestimated at this time. What is critical to stress is that much of the new technology will not only affect productivity directly, but will also change the way science is practiced and the way the physical environment around us is understood.

Such better understanding may open technological avenues that cannot be imagined any more than, say, the effects of quantum mechanics could be imagined in 1900, when Planck first presented his famous postulate to the German Physical Society. New knowledge, and the disruptions it implies, are a source of considerable fear. Much as in the first and second Industrial Revolutions, it is inevitable that the new technology will disrupt and displace significant numbers of people. Etzioni points out that “our response [to radical innovations] has been driven by not knowing what impact the new technology will have on our sense of self and our livelihoods. And when we don’t know, our fearful minds fill in the details.” Yet those fears should not be allowed to obscure what every undergraduate taking a course in economic history already knows. The material conditions of life for most of human history were, by our standards, appalling. What has liberated humans from hunger, cold, disease, and insecurity is better, more useful knowledge. The process is still ongoing, and much work still remains, including dealing with the unanticipated consequences of earlier technological breakthroughs (think: pollution of our oceans and climate change). The technology promised by the fourth Industrial Revolution may continue this path toward a better material life for more and more people — that is, if our politicians will let it.

How Likely is a Fourth Industrial Revolution?

If this model can be extrapolated out of the historical sample space (a big “if”), there is reason to believe that the fourth Industrial Revolution (or perhaps the fifth) will be truly transformative. The case I am making here — that technological progress has not been exhausted and will not slow down — does not depend on one area of technology or another. It is based on the observation that technology pulls itself up by the bootstraps by giving scientific researchers vastly more powerful tools to work with. Some of those tools have been known in more rudimentary form for centuries, while others are radical innovations that have no clear-cut precursors. Of the traditional tools, the microscope is the most prominent one, as it is basic to the ubiquitous tendency of modern technology toward miniaturization: to operate at smaller and smaller (nanoscopic) levels. The Betzig-Hell super-resolved fluorescence microscope, whose developers were awarded the Nobel Prize in Chemistry, is to Leeuwenhoek’s microscope as a thermonuclear device is to a fourteenth-century fire bomb. More or less the same can be said for telescopy, where the revolutionary Hubble telescope is soon to be replaced by the much more advanced James Webb Space Telescope.

Freeman Dyson has remarked that if the twentieth century was the century of physics, the twenty-first century will be the century of biology.⁶ Recent developments in molecular biology imply revolutionary changes in humans’ ability to manipulate other living beings. Of those, the one that stands out is the decline in the cost of sequencing genomes at a rate that makes Moore’s Law look sluggish by comparison.⁷ Especially promising is the ability to edit base pairs in a genetic sequence, thanks to recent improvements in CRISPR-Cas9 techniques (Belluz and Irfan, 2017). Another is synthetic biology, which allows for the manufacture of organic products without the intermediation of living organisms. The idea of cell-free production of proteins has been around for about a decade (Zhang, 2009), but only recently has its full potential become known to the public, even if widespread application is still years away (The Economist, 2017a).

Ecclesiastes notwithstanding, there is much under the sun that is entirely new. The two most powerful scientific tools that have become available only in the past decades, and that represent complete breaks with the past, are fast computing (including virtually unlimited data storage and search techniques) and laser technology. Both, of course, have found innumerable direct applications in the economy, in the production of both capital and consumer goods. But their full long-run impact on productivity is underestimated by concentrating on annual total factor productivity (TFP) growth, because those figures (among their other faults) fail to account for the indirect effect of those techniques on research that may lead to technological advances in an entirely different area in the future.
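To see why, it helps to recall what the TFP statistic actually measures. In the standard growth-accounting framework (a textbook identity, not anything specific to this essay), TFP growth is a residual: the part of output growth not explained by the measured growth of capital and labor inputs:

```latex
% Standard growth-accounting (Solow residual) definition of measured TFP growth.
% Y = output, K = capital, L = labor, \alpha = capital's share of income.
\[
  \underbrace{\frac{\dot{A}}{A}}_{\text{TFP growth}}
  \;=\;
  \frac{\dot{Y}}{Y} \;-\; \alpha\,\frac{\dot{K}}{K} \;-\; (1-\alpha)\,\frac{\dot{L}}{L}.
\]
```

A research tool that raises output only indirectly, by enabling discoveries that pay off years later and in other sectors, barely registers in this year-by-year residual, which is precisely the point being made here.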

Above all, the impact of fast computers on science has gone far beyond large-scale calculations and standard statistical analysis: a new era of data science has arrived, in which models are replaced by powerful mega-data-crunching machines that detect patterns human minds could not have dreamed up and cannot fathom. Such deep learning models engage in purely inductive data-mining using artificial neural networks. Much as the James Webb is to Galileo’s first telescope, so the huge data banks of today’s mega-crunchers are to Carl Linnaeus’s notebooks.⁸
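To make the phrase “purely inductive data-mining” concrete, here is a minimal sketch (my own illustration, not drawn from the essay) of a tiny artificial neural network that recovers a hidden regularity from raw data without ever being told the generating law:

```python
# Minimal illustrative sketch: a small neural network fitted by gradient descent.
# The data are generated from a sine curve plus noise, but the network is never
# told this; it infers the pattern purely inductively from the observations.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "data bank": 500 noisy observations of an unknown relationship.
x = rng.uniform(-3, 3, size=(500, 1))
y = np.sin(x) + 0.1 * rng.standard_normal(x.shape)

# One hidden layer of 32 tanh units, trained by full-batch gradient descent.
W1 = 0.5 * rng.standard_normal((1, 32)); b1 = np.zeros(32)
W2 = 0.5 * rng.standard_normal((32, 1)); b2 = np.zeros(1)
lr = 0.1

for _ in range(5000):
    h = np.tanh(x @ W1 + b1)          # hidden-layer activations
    pred = h @ W2 + b2                # network's prediction
    err = pred - y                    # prediction error
    # Backpropagate the mean-squared-error loss.
    gW2 = h.T @ err / len(x);  gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h**2)
    gW1 = x.T @ dh / len(x);   gb1 = dh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

# The fitted network reproduces the regularity it was never told about:
x_test = np.array([[0.0], [1.57], [-1.57]])
print(np.tanh(x_test @ W1 + b1) @ W2 + b2)   # roughly [0, 1, -1]
```

Production-scale deep learning systems differ mainly in scale (many more layers, parameters, and observations), but the epistemic point is the same: the patterns are extracted from the data rather than deduced from a prior model.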

But computers can do more than crunch data: they also simulate, and by so doing, they can approximate the solutions of fiendishly complex equations, allowing scientists to study hitherto poorly understood physiological and physical processes, design new materials in silico, and simulate mathematical models of natural processes that have so far defied attempts at closed-form solution. Such simulations have spawned entirely new “computational” fields of research, in which simulation and large-scale data processing are strongly complementary in areas of high complexity. Historically, some scientists may have dreamed about such a tool, but only in the most recent decades have they had the power to do this at a level that will inevitably and vastly augment our technological capabilities.
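As a schematic illustration of this kind of simulation (my own toy example, not one from the essay), consider the swing of a pendulum at large angles, whose equation of motion has no elementary closed-form solution but can be traced out numerically by stepping the dynamics forward in small time increments:

```python
import math

def simulate_pendulum(theta0, omega0=0.0, g=9.81, length=1.0,
                      dt=1e-4, t_end=10.0):
    """Numerically integrate theta'' = -(g/length)*sin(theta) in small time steps."""
    theta, omega = theta0, omega0
    samples = []
    for i in range(int(t_end / dt)):
        alpha = -(g / length) * math.sin(theta)   # angular acceleration
        omega += alpha * dt                       # update angular velocity
        theta += omega * dt                       # update angle (semi-implicit Euler)
        if i % 10000 == 0:                        # record one sample per second
            samples.append((i * dt, theta))
    return samples

# Start at 120 degrees, far outside the small-angle regime where a simple
# formula exists; the simulation still traces the motion step by step.
for t, theta in simulate_pendulum(math.radians(120)):
    print(f"t = {t:4.1f} s, theta = {math.degrees(theta):7.2f} deg")
```

Real computational science replaces this toy loop with far more sophisticated solvers and vastly larger systems, but the logic is the same: where no closed-form solution exists, the computer approximates the behavior by brute numerical force.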

Laser technology is an equally revolutionary scientific tool. When the first lasers were developed, it was said that their inventors regarded them as a technique “in search of an application.” But by the 1980s, lasers were already being used to cool micro samples to extraordinarily low temperatures, leading to significant advances in physics. Nowadays, the deployment of lasers in science has a dazzling range.⁹ LIDAR (light detection and ranging) is a laser-based surveying technique that creates highly detailed three-dimensional images used in geology, seismology, remote sensing, and atmospheric physics.¹⁰ But lasers are also a mechanical tool that can ablate (remove) materials for analysis: virtually any type of solid sample can be ablated, with no sample-size requirements and no sample-preparation procedures. In another area, laser microdissection combines powerful microscopes and lasers to isolate and procure subpopulations of tissue and single cells. Among the laser’s many other uses, laser interferometers have been used to detect the gravitational waves Einstein postulated, one of the holiest grails in modern physics.

Much as the new instruments and tools of the seventeenth century ushered in the Scientific Revolution and the age of steam, high-powered computers and lasers will lead to technological advances that cannot be imagined any more than Galileo could foresee the locomotive. If the recent economic history of technology teaches us anything, it is that mindlessly extrapolating from the past is a poor guide to the future. After millennia of very slow and reversible growth, in the past two centuries the world has taken off on a path of unprecedented economic expansion, driven by useful knowledge and human ingenuity. As our ability to understand natural phenomena expands, so will our ability to harness nature to our needs. The old claim that “everything that can be invented has been invented already” has itself become a cliché. However, the question as to whether the positive feedback mechanism driving technological progress will eventually run into diminishing returns and slow down is — and will probably remain — unresolved.

Footnotes

¹ Michael Polanyi points out that the difference boils down to observing that propositional knowledge can be “right or wrong,” whereas prescriptive knowledge is “action [that] can only be successful or unsuccessful.” (1962, p. 175).

² The nexus from technology to science was formulated in an especially compelling fashion by the late Nathan Rosenberg (1982).

³ Carnot’s now famous book was at first ignored in France. However, it found its way second-hand — and through translation — into Britain, where there was considerably more interest in his work among the builders of gigantic steam engines, such as William Fairbairn in Manchester and Robert Napier in Glasgow (Smith, 1990, p. 329).

⁴ Fifteen years after Kitty Hawk, the German theoretical physicist Ludwig Prandtl published his great work on how wings could be scientifically — rather than empirically — designed, and the lift and drag precisely calculated (Constant, 1980, p. 105; Vincenti, 1990, pp. 120–25). Prandtl’s work, alongside that of his peers, led to improvements in the design of wings and to much improved airplane streamlining (Lienhard, 2008).

⁵ As Oren Etzioni, CEO of the Allen Institute for AI, put it, “We’re at a very early stage in AI research. Our current software programs cannot even read elementary school textbooks, nor pass science tests for fourth-graders. Our AI efforts today lack basic common-sense knowledge (gravity pulls objects toward earth), and cannot understand without ambiguity seemingly simple sentences such as: ‘I threw a ball at the window and it broke.’” (Etzioni, 2014).

⁶ Dyson (2015, pp. 2–3) suggests a future in which the tools of genetic engineering become available to individual breeders: “there will be do-it-yourself kits for gardeners … also kits for lovers of pigeons and parrots…breeders of dogs and cats will have their kits too…domesticated biotechnology … will give us an explosion of diversity of new living creatures … new lineages will proliferate to replace those that monoculture farming and deforestation have destroyed.”

⁷ The sequencing cost per genome has declined from $95 million in 2001 to about $1,250 in 2015 (NIH, 2015).
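A back-of-the-envelope calculation (mine, not the NIH’s) shows how this outpaces Moore’s Law:

```latex
% Implied halving time of sequencing costs, 2001-2015.
\[
  \frac{\$95{,}000{,}000}{\$1{,}250} = 76{,}000 \approx 2^{16.2},
  \qquad
  \frac{14 \text{ years} \times 12}{16.2} \approx 10 \text{ months per halving}.
\]
```

That is roughly twice as fast as the 18-to-24-month doubling interval usually associated with Moore’s Law.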

⁸ Ernest Rutherford’s possibly apocryphal quip that “all science is either physics or stamp collecting” has been interpreted as a put-down of inductive science. But in fact, the accumulation of facts — and the search for patterns and regularities — has always been an integral part of any serious investigation of nature.

⁹ This technique was awarded the Nobel Prize in Physics in 1997. One of the winners was Steven Chu, who later served as Secretary of Energy in the Obama administration.

¹⁰ LIDAR has recently helped uncover further archaeological information about the Maya civilization in what is now Guatemala, showing it to have been far more advanced and sophisticated than hitherto suspected.
