The System of the World

In the history of human technology, the last two decades have been unprecedented. During my research career in astrophysics, the size of the computational problems I worked on increased by about a factor of a million. On the hardware side, the ASCI Red supercomputer installed in my home state of New Mexico was the fastest in the world in 1996. It was the first machine to run the LINPACK benchmark at over a teraflop (a trillion floating-point operations per second), and it cost a staggering $70 million in today’s dollars.

Representing Descartes Labs at the annual supercomputing conference last week, we showed that the same benchmark can be run today, at the same performance, by anyone using a couple of Google Cloud Compute Engine nodes, for a cost of about $1 an hour. This represents a price reduction of more than a factor of 2,000 in 20 years.
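As a quick sanity check on that factor, here is the back-of-the-envelope arithmetic. The three-year service life used to amortize ASCI Red’s price is my assumption, not a figure from the talk, and ignoring its power, cooling, and staff costs only makes the comparison conservative:

```python
# Rough arithmetic behind the factor-of-2,000 price reduction.
ASCI_RED_COST = 70e6                # dollars, in today's money
SERVICE_LIFE_HOURS = 3 * 365 * 24   # assumed three-year machine lifetime
CLOUD_COST_PER_HOUR = 1.0           # dollars/hour for the same benchmark today

asci_red_per_hour = ASCI_RED_COST / SERVICE_LIFE_HOURS
print(f"ASCI Red: ~${asci_red_per_hour:,.0f} per hour")                  # ~$2,664
print(f"Price ratio: ~{asci_red_per_hour / CLOUD_COST_PER_HOUR:,.0f}x")  # ~2,664x
```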

These increases in computer performance are usually too intangible for most people to appreciate. Imagine if the same factor were applied to cars: you could buy a nice vehicle for $10. Rockets? Staying in a space station would be comparable in cost to a high-end hotel suite on Earth. One place you do have everyday contact with these technological factors is probably in your pocket right now. A modern smartphone equals or exceeds a computing, graphics, camera, and communication system that would have cost a million dollars 20 years ago. Not to mention there are 2 billion of them, and they can all be connected to each other and to the Internet.

This extreme scale of improvement in computing technology raises the question: why haven’t we seen it driving similar advances in other industries? The full complexity of this topic is beyond the scope of the current discussion, but we can examine some relevant factors. An important one is that, at a fixed cost, these huge performance increases have come from using more processors, not from making each processor faster. In the words of René Descartes, “Divide each difficulty into as many parts as is feasible and necessary to resolve it.” Unfortunately, moving from serial to parallel computation forces essential changes to the software underlying almost any task, so realizing the hardware gains depends on developing software that can exploit parallelism. “Make the code work in parallel” is easy to say, but hard to do.
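To make that concrete, here is a minimal sketch in Python (my choice of language; the post doesn’t use one) of a computation done serially and then in parallel. Even for this toy problem, the parallel version has to partition the work, manage worker processes, and combine the partial results:

```python
# Serial vs. parallel sum of squares: a toy example of the software
# changes that parallel hardware demands.
from multiprocessing import Pool

def serial_sum(n):
    # The serial version is one line of straightforward code.
    return sum(i * i for i in range(n))

def partial_sum(bounds):
    # Each worker process handles one chunk of the range.
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

def parallel_sum(n, workers=4):
    # The parallel version must split the work into chunks...
    step = n // workers
    chunks = [(w * step, n if w == workers - 1 else (w + 1) * step)
              for w in range(workers)]
    # ...farm the chunks out to worker processes...
    with Pool(workers) as pool:
        partials = pool.map(partial_sum, chunks)
    # ...and combine the partial results.
    return sum(partials)

if __name__ == "__main__":
    assert serial_sum(1_000_000) == parallel_sum(1_000_000)
```

The point is not the speedup on a toy sum, but that the partitioning, the process management, and the final combination are all code a serial program never needed.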

As Erik Brynjolfsson has pointed out, the flow of information speeds up dramatically in highly automated parts of a process, only to hit bottlenecks elsewhere, particularly where humans are involved and processes haven’t been updated. This limit pops up everywhere, from programming to analysis to management. For a programmer, productivity is ultimately measured by the output of the hardware running their software. If the hardware is very expensive, the cost of the software may not be a limiting factor for the combined system, and increases in hardware performance translate directly into better system performance. In today’s world, however, computing is essentially free compared to a programmer’s time: one dollar will buy 6,000 trillion mathematical operations in a cloud computing system, but only about one minute of programming labor.
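To put that exchange rate in a single number (taking the post’s figures at face value; the implied labor rate of about $60 an hour is inferred, not stated), each second of a programmer’s time costs as much as roughly 100 trillion operations:

```python
# The compute-versus-labor exchange rate, from the figures above.
OPS_PER_DOLLAR = 6_000e12        # 6,000 trillion operations per dollar
LABOR_SECONDS_PER_DOLLAR = 60    # one minute of programming per dollar

ops_per_labor_second = OPS_PER_DOLLAR / LABOR_SECONDS_PER_DOLLAR
print(f"{ops_per_labor_second:.0e} operations per programmer-second")  # 1e+14
```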

In broad outline, this gives us a context for understanding the rapid growth in US productivity from 1994 to 2004 and its subsequent decline. Making single-processor computers faster made combined human-computer business systems more efficient until 2004, when single-processor performance stalled and overall system costs became dominated by human factors. Some systems continued to prosper, most obviously those associated with the World Wide Web, which could be naturally adapted to parallel systems.

Which brings us to the biggest opportunity of the modern era. There is an enormous gap between available computational capabilities and the application of these capabilities to valuable services (outside of a few well-known examples such as advertising enabled by web search or social media). For the right problems, organizations that can work with petabytes of data in a massively parallel cloud platform and efficiently apply decades of research in computer science, mathematics and the fundamental physical sciences can suddenly unlock the unrealized value of computing that has been growing in many industries for over a decade.

My colleagues and I presented some specific examples of this opportunity in our paper “Data-Intensive Supercomputing in the Cloud: Global Analytics for Satellite Imagery.” A typical image analyst will spend most of their time locating images, downloading them, getting them into the right reference system, cleaning up the data (removing actual clouds, for example), and only then looking at each image. In contrast, Descartes Labs has built a platform that enables machine learning techniques to be applied to normalized imagery from multiple satellite constellations in the cloud, making the image analyst many times more productive. These same tools are used internally at Descartes Labs to produce our global commodity crop production forecasts, as well as a suite of other services enabled by our forecasting platform.
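To give a feel for the shape of such a platform, here is a purely hypothetical sketch; none of these names come from the Descartes Labs API, and the stubs stand in for the real search, normalization, and inference steps:

```python
# Hypothetical sketch of the workflow described above; every function
# is a stand-in, not the Descartes Labs API.
from typing import Iterator, List

def search_catalog(region: str) -> List[str]:
    """Stand-in for locating imagery over a region of interest."""
    return ["scene-001", "scene-002"]

def normalize(scene: str) -> str:
    """Stand-in for reprojection to a common reference system
    and cleanup such as cloud removal."""
    return f"{scene}:reprojected,cloud-masked"

def run_model(raster: str) -> str:
    """Stand-in for machine-learning inference on normalized imagery."""
    return f"prediction for {raster}"

def analyze(region: str) -> Iterator[str]:
    # The analyst writes only this loop; the locating, downloading,
    # and cleanup work that used to consume most of their time is
    # handled by the platform.
    for scene in search_catalog(region):
        yield run_model(normalize(scene))

print(list(analyze("example-region")))
```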

In 1687, Isaac Newton published his masterwork, Principia, laying the groundwork for modern science and engineering. He positioned physics as “The System of the World” (his title for the third book of Principia) and made physics the lens through which we viewed almost everything. In the following 300 years, scientists thoroughly explored physics (having been trained as a physicist, I am bound to lump other sciences such as chemistry and biology into physics). Engineers transformed the resulting discoveries into myriad wondrous inventions. Scientists and engineers have been so successful that further practical advances based on a better understanding of fundamental physics are likely to proceed at a much slower pace than in the past. SpaceX has reinvigorated rocket science, but the thrust-to-weight ratio of its Merlin rocket engine isn’t quite a factor of 2 better than that of the F-1 engine used in the Saturn V rocket 50 years ago. No matter how well Elon Musk can rearrange atoms, we already know the names of all the atoms he might use.

Growth from better understanding physics may have plateaued, but we are in a new era. Physical laws of nature set the boundaries for the Industrial Age, but the Information Age has different constraints. Software is replacing physics as the system of the world. Like physics before it, software opens new domains for science and engineering to explore. It is our lens to see the planet and how it changes every day.