The repeal of Moore’s Law is good news for software developers.

Ken VanBree
Mar 15, 2016 · 4 min read


This week’s Economist makes it official: we have entered the post-Moore’s Law era. There were rumors to that effect a couple of years ago, but now the facts are clear. Transistors, the underlying technology that has fueled the electronic revolution, are no longer getting cheaper. In 1965 Gordon Moore, who would later co-found Intel, postulated what became known as Moore’s Law. Simply stated, it says that the price of a transistor in the latest silicon process will be cut in half every two years. This exponential decrease in price has driven the computer revolution ever since. However, since 2012 we have seen no decrease in the price of transistors, and in 2015 the price actually increased. This change will have a profound effect on our overall productivity, especially as it relates to software.
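
To make that exponential concrete, here is a rough sketch in Python of what a halve-every-two-years cost curve looks like, and what happens when it flattens out; the starting cost and the 2012 cutoff are placeholder assumptions for illustration, not industry figures.

```python
# Rough sketch of the cost curve described above. The starting cost and the
# 2012 flattening point are placeholder assumptions, not measured figures.
def cost_per_transistor(year, start_year=1965, start_cost=1.0, flat_year=2012):
    """Relative cost under a 'halve every two years' model that flattens at flat_year."""
    halvings = (min(year, flat_year) - start_year) / 2.0
    return start_cost / (2 ** halvings)

for y in (1965, 1975, 1990, 2005, 2012, 2016):
    print(y, cost_per_transistor(y))
```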

Software development is a very labor-intensive process. Large and complex bodies of code can take years to develop. We have always been able to expect that existing code will run faster on the next generation of computer hardware. We have also been able to expect that today’s software would solve larger and larger problems on tomorrow’s hardware, thanks to advances in computer architecture and memory size that in turn were fueled by the ever-declining price of transistors. The repeal of Moore’s Law cancels both of these expectations. From now on we will need to expend greater programming effort in order to improve the speed and capacity of our software. This is good news for software professionals.

Don’t get me wrong: computational power will continue to improve, but we can no longer count on an underlying drop in transistor size and price to fuel that growth. Improvements will come from new ways of crowding computer chips together and innovative ways of reducing power consumption and getting the heat out. In The Economist’s Technology Quarterly of March 12, 2016, Dr. Bruno Michel, IBM’s head of advanced micro-integration, points out that when Watson beat a human Jeopardy champion in 2011, it required 80 thousand watts of power to do so. The biological processor (a.k.a. the brain) of the human champion, on the other hand, required only 20 watts. That leads me to wonder how Watson would do on an equal power basis if it had to compete against 4,000 human Jeopardy champions simultaneously.
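
The arithmetic behind that last question is a single division; here it is as a quick sanity check (the 80-kilowatt figure is the one cited in The Economist, and the 20-watt figure is the usual rough estimate for a human brain).

```python
# Quick check of the power comparison above.
watson_power_watts = 80000  # power cited for Watson's 2011 Jeopardy run
brain_power_watts = 20      # rough power budget of a human brain

print(watson_power_watts // brain_power_watts)  # 4000 human champions on the same power budget
```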

Since many people have never even seen a transistor, it is hard to visualize how changes in a transistor’s size relate to advances in the speed and power consumption of electronic devices. Let me give an example from the evolution of disk drives to help illustrate those changes. Twenty-five years ago I was managing a group that developed CAD technology for use in designing computer chips. Our data was stored on disks in a computer room adjacent to our offices. The disks were each the size of a washing machine and stored about a Gigabyte of data on rotating magnetic platters that looked like large layer-cakes. On Sunday, Sept. 30, 1990, I had gone into the office to generate some slides for a presentation the next day. When I got there an alarm was going off. It turned out that the air conditioning unit for the computer room had failed, and the temperature in the room was approaching 85 degrees Fahrenheit. If it got much hotter the rotating disks would shut down, and there was a good chance that the heads would adhere to the platters after shut-down and destroy the data. I called our computer room operator and a crisis was averted, but we nearly lost the disks that held twelve years of the group’s development effort.

Exactly twenty-five years later, on Sept. 30, 2015, I nearly lost a Terabyte (1000 Gigabytes) of customer data. The data was stored on a USB-powered drive that fit in my shirt pocket. I had taken the disk to a client in order to install some software. On the way back to the office I stopped at a store to pick up some supplies and decided to leave the disk in my car. The next day I looked for the data and remembered where I had left it. I searched my car and panicked when I couldn’t find the disk. After several anxious moments I spotted the disk in its black microfiber carrying bag, barely visible on the black floor-mat of the car. If anyone had told me in 1990 that the data contained in 1000 washing-machine-sized disk drives would one day fit in my shirt pocket, I doubt I would have believed them.
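
Treating those two anecdotes as rough data points, a few lines of Python show the growth rate they imply; the figures are approximate, so the result is only a ballpark.

```python
import math

# Rough growth-rate estimate from the two disk-drive anecdotes above:
# ~1 Gigabyte per washing-machine-sized drive in 1990, ~1 Terabyte in a pocket drive in 2015.
growth_factor = 1000              # 1 Terabyte / 1 Gigabyte
years = 2015 - 1990

doubling_period = years / math.log2(growth_factor)
print(round(doubling_period, 1))  # capacity per drive doubled roughly every 2.5 years
```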

Without the underlying exponential size and cost reduction provided by Moore’s Law, future density and power improvements to computers will need to come from new packaging and cooling techniques. These technologies do not advance at exponential rates. It will take much more brainpower over the next 25 years to develop innovative ways to shrink the hardware and improve the software than it did in the years since 1990. I wonder what the computers of 2040 will look like. Will 1000 of today’s refrigerator-sized cloud-storage server racks fit into a cabinet the size of a microwave oven? Will we use the excess heat from its cooling fans to heat our homes in the winter? Will Watson’s computer hardware shrink to the size of a human brain and require only 20 watts of power? Will it be powered and cooled by what Dr. Michel of IBM has called “electronic blood”? Will a future Watson be able to sit next to us on the couch and yell out answers to Jeopardy questions while simultaneously beating us at chess, like Uncle Ned used to do? I hesitate to predict. What I will predict is that most of the improvement in computational performance and density will come from millions of 20-watt biological processors that will understand and evolve computer hardware and software. I know for certain that many of those biological processors already exist and are training as we speak, in schools, universities, and companies around the world, to shape the evolution of computer hardware and software in the post-Moore’s Law era.


Ken VanBree

A technologist at heart who is looking for ways to make the US economy work for all Americans while providing a future for America’s children.