The computer revolution has not yet started.

I was fascinated by Paul Krugman’s review in the Times today of Robert Gordon’s new book, “The Rise and Fall of American Growth”. As a child of the late ’70s, and a techno-optimist, I’ve often wondered about these questions: After the creation of engines, electricity, chemicals, miracle drugs, and modern means of communication; after the transition from an agricultural to a post-industrial society, is there any truly meaningful technological progress left? Are we current-day technologists just tinkering at the edges of the industrial world created by our forebears, gradually developing new video games and Bluetooth-connected cars but doomed to mere incremental improvement?

Gordon points out that a middle-class 1940s lifestyle was very similar to our present condition, just sans television and internet. He goes further, arguing that while computers have given us things of value like social networks and desktop publishing, they have not fundamentally altered our material lives the way the “Great Inventions” did, taking us from oil lamps and death-by-infection to electricity, telephones, and 75-year lifespans.

Of course, I can’t know the future. But one thing jumped out at me in Gordon’s observations about the rapid improvements in quality of life that the “Great Inventions” delivered: he noticed a lag between the first inventions, the steam engine for example, and the broad material impact of those inventions on people’s everyday lives. The inventions came mostly in the 19th century, while the improvements took place between 1920 and 1970.

Applying that same lag between invention and improvement to the world of computers leads to a surprising conclusion. Most of the foundational computer inventions took place in the 1960s and 1970s, implying that the corresponding improvements will happen between the 2010s and the 2060s. In other words, if the computer revolution proceeds at the same pace as the industrial, pharmaceutical, and communications revolutions before it, it has not even started yet.
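
To make that arithmetic concrete, here is a minimal sketch of the lag calculation. The decade endpoints and the 50-to-90-year lag window are my own illustrative assumptions read off Gordon’s dates, not figures from the book:

```python
# Back-of-the-envelope lag arithmetic (assumed numbers, not Gordon's own).
# The Great Inventions arrived mostly in the 19th century but delivered their
# broad impact between 1920 and 1970, suggesting a rough invention-to-impact
# lag on the order of 50 to 90 years.
LAG_RANGE = (50, 90)                 # assumed lag in years
COMPUTER_INVENTIONS = (1960, 1970)   # decades of the core computer inventions

impact_start = COMPUTER_INVENTIONS[0] + LAG_RANGE[0]  # 1960 + 50 = 2010
impact_end = COMPUTER_INVENTIONS[1] + LAG_RANGE[1]    # 1970 + 90 = 2060
print(f"Expected broad material impact: the {impact_start}s to the {impact_end}s")
```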

This kind of historical rule risks overstepping into a Hegelian faux “Science of History”, complete with ambitious mathematical formulas for progress. But if this theory of progress holds, we should look for signs of computers impacting material life in new ways: ways that computers might change our material world, and not just the virtual one.

I’d mention a number of leading indicators:

1) Mundane tasks: the emergence of well-defined, narrow-scope artificial intelligence solutions, such as self-driving cars or auto-sorting email.

2) Impossible tasks: software tools that enable something that didn’t really make sense before, such as passable machine translation between human languages.

3) Computer-designed drugs: the emergence of programmable biology at the atomic scale, a decades-old dream made incarnate with the design of new proteins in computers, promising more new drugs for more ailments at a lower price.

4) Computer-discovered biology: algorithms and cheap storage have enabled so-called “next-generation” DNA sequencing, a technology that is impacting everything from sustainable chemistry to cancer diagnosis.

Krugman points to some of these possibilities in his review, and some of the coming improvements may not come from computers at all, such as the CRISPR gene-editing revolution in biology. But following Gordon’s logic, we would expect the computer revolution to be, at best, in its adolescence. From my vantage point in computational biology, it’s really just about to start.
