How is Moore’s Law becoming irrelevant in the age of Quantum Computing?

Rose’s Law replaces Moore’s Law as computing power embarks on a new exponential growth trajectory

Published in Technicity · 6 min read · May 25, 2019


Growing up in the late ’80s and early ’90s was an exciting time: household computing was becoming the new trend as personal computers grew smaller and cheaper, with intuitive graphical interfaces and added capabilities. Then came the Internet, which really kick-started the growth of the tech economy. Those early computer systems were built on microchips (invented in 1958), each packing a certain number of transistors per square inch.

In 1965, Gordon Moore, who would go on to co-found Intel, observed that the number of transistors per square inch on a microchip had been doubling roughly every year, a projection he revised in 1975 to a doubling about every two years. The observation became famously known as “Moore’s Law”: computing power roughly doubles on a fixed cadence while the cost per transistor keeps falling.
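The compounding behind Moore’s Law can be sketched in a few lines of Python. This is purely illustrative: the two-year doubling period follows Moore’s 1975 revision, and the starting transistor count is a rough, hypothetical figure, not historical data.

```python
# A minimal sketch of Moore's Law as compound doubling.
# Assumptions: transistor count doubles every 24 months (Moore's 1975
# revision); the starting count below is illustrative, not a real chip spec.

def projected_transistors(start_count: int, years: float,
                          doubling_period_years: float = 2.0) -> int:
    """Project a transistor count forward under a fixed doubling cadence."""
    return round(start_count * 2 ** (years / doubling_period_years))

# Example: a chip with ~2,300 transistors (early-1970s class),
# projected 20 years ahead under two-year doubling:
print(projected_transistors(2300, 20))  # 2300 * 2**10 = 2355200
```

Ten doublings in twenty years multiply the count by 1,024, which is why even a modest-sounding doubling cadence produces such dramatic growth over a few decades.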

Intel has relied on Moore’s Law to fuel chip innovation for the past 50 years, but recent revolutionary developments in computing technology point towards an end to the transistor-based era.

