What the End of Moore’s Law Means for the Future of Computing

Rohan Kar
3 min read · Jul 26, 2016


It’s hard to believe how far we have come in information technology over the past 50 years: from massive, room-sized monoliths to incredibly fast devices that fit in our pockets or on our wrists. We attribute much of this progress to advances in building integrated circuits (ICs), popularly known as computer chips. It’s also remarkable to think how accurately Gordon Moore prophesied all of this back in 1965.

“In brief, Moore’s Law predicts that computing chips will shrink by half in size and cost every 18 to 24 months. For the past 50 years it has been astoundingly correct.” –Kevin Kelly, What Technology Wants

In its 50 years of existence Moore’s Law has become the cornerstone of modern computing.
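The cadence Kelly describes can be made concrete with a little arithmetic. The sketch below is a hypothetical illustration (not anything from the article): it projects transistor counts under an idealized Moore’s Law, doubling every two years from the roughly 2,300 transistors of Intel’s 4004 (1971); the starting point and doubling period are assumptions chosen for illustration.

```python
# Idealized Moore's Law: transistor counts double every
# `doubling_period` years. Baseline: Intel 4004 (~2,300
# transistors, 1971) -- illustrative assumptions only.
def transistors(year, base_year=1971, base_count=2300, doubling_period=2):
    """Projected transistor count for a given year."""
    doublings = (year - base_year) / doubling_period
    return base_count * 2 ** doublings

for year in (1971, 1991, 2011):
    print(year, int(transistors(year)))
```

Twenty years means ten doublings, i.e. a roughly thousandfold increase per two decades, which is why the curve looks so dramatic on Kurzweil-style log plots.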

Why Does this Matter Now?

The semiconductor industry has long been a highly organized one, coordinating its innovation and economics to match the path set by Moore’s Law. But for the first time in over 50 years, it has chosen not to follow that roadmap.

According to the International Technology Roadmap for Semiconductors (ITRS), recently released by the Semiconductor Industry Association (SIA), 2021 is the year when it will no longer be feasible to keep shrinking the transistors used in microprocessors.

But according to some, like Ray Kurzweil, the end of Moore’s Law merely marks the end of exponential growth in one of five paradigms of computing, and it is not the first paradigm to run its course. In other words, there is room for a sixth, new paradigm of computing.

A chart from Ray Kurzweil’s book “The Singularity Is Near” (2005) shows the exponential progress of computing across five technologies: electromechanical, relay, vacuum tube, discrete transistor, and integrated circuit computing elements.

Yes, Moore’s Law is coming to an end, but only in the sense that we can no longer keep shrinking transistors on a flat, two-dimensional integrated circuit.

What Now?

Now we wait, while the laws of thermodynamics and quantum mechanics exhaust the remaining possibilities of silicon chip technology.

The truth is, it’s uncertain what will become the definitive sixth paradigm of computing. Protein computers, DNA computers, molecular computers, quantum computers, optical computers: take your pick. Several, like 3D molecular computing and quantum computing, have shown promising results, but the jury is still out.

What is certain is that it will be nearly impossible to maintain the exponential growth in computing set by the earlier five paradigms unless we make a breakthrough as significant as the silicon chip.

Until that breakthrough happens, the industry, and we as humans, will organically shift focus toward applications and technologies that depend less on raw instructions per second and more on pervasiveness, parallel processing, and dispersed computing, as in the Internet of Things. There is no bigger cliche than “necessity is the mother of invention,” and you are most likely to see it at work in the way we create the next generation of gadgets, apps, and even spaceships.



Product guy. Grad student @UCBerkeley. Past PM intern @Twitter. Studies experimentation, growth analytics, and human-computer interaction.