Are you familiar with Moore’s Law?

Understanding exponential growth in computing through Gordon Moore’s prediction.

Gabriel Quintas
5 min read · Apr 8, 2016

Have you ever wondered how technology has evolved to its current state? Why, in recent years more than ever, has innovation been leading us to unimaginable places we only dreamed about, at such a fast pace? Much of this can be explained by, or at least related to, something called Moore’s Law.

Moore’s Law was first introduced by Gordon Moore, Intel’s co-founder, in 1965. In his famous article, Moore published a graph whose analysis is really geeky and specific to the semiconductor industry: it showed that the optimal number of transistors inside integrated circuits was doubling approximately every two years, and it predicted that this pattern would repeat itself in the following years. But what did his prediction really mean?

Well, transistors are miniature electronic components made of semiconductor material that can either amplify or switch electronic signals. As transistors got smaller, they also got cheaper, and many more of them fit inside a single integrated circuit, increasing the circuit’s density and efficiency. Thus, what Moore’s rule of thumb really described was this: device complexity, meaning a denser circuit packing more components at a reduced cost, was doubling every two years. In other words, establishing a shrinking cycle for transistors helped develop better, faster and cheaper computers.
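To make that rule of thumb concrete, here is a minimal Python sketch of the doubling curve. The starting point, roughly 2,300 transistors in 1971 (the Intel 4004), is historical, but the function name and parameters are mine, and real chips only loosely track such a clean curve.

```python
# A minimal sketch of Moore's rule of thumb: transistor counts double
# roughly every two years. Starting values are historical (Intel 4004),
# but the smooth curve itself is an idealization.
def transistors(year, base_year=1971, base_count=2300, doubling_period=2):
    """Estimate transistor count, assuming a clean two-year doubling."""
    return base_count * 2 ** ((year - base_year) / doubling_period)

for year in (1971, 1981, 1991, 2001, 2011):
    print(f"{year}: ~{transistors(year):,.0f} transistors")
```

Run it and you land near 2.4 billion transistors by 2011, which is in the same ballpark as real high-end chips of that era.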

Since semiconductors are the best basic materials for advanced electronics and communications (they are used to fabricate chips for every kind of electronic device, including computers, mobile phones, televisions and video game consoles), their development played a major role in creating and improving anything computer-driven and, consequently, software-driven. Intel has released chips that fit twice as many transistors into the same space roughly every two years, aiming to follow that exponential curve.

Moore’s prediction was indeed right. It’s amazing to see how, over the years, we’ve been able to turn supercomputers into consumer products. And what’s even more awe-inspiring, this evolution is a reflection of a much longer-term and more important trend: it is becoming cost-effective to embed a computer in almost anything.

A well-known graph tracks how much computational power a thousand dollars could buy over 110 years, extending back before the semiconductor industry even existed. Over that century, no event managed to slow the accumulated growth of computing capacity, not even the world wars.

Moore’s Law is all around us. The exponential growth in computing power through the decades of the information age has extended to other industries as well. In healthcare and education, for example, industries previously perceived as impenetrable to technological disruption, the structural changes have been huge.

While in computing an exponential growth in efficiency expanded the industry’s capacity, another important exponential curve appeared in storage. In this case, though, it was exponential decay in cost. To put it in context, in 2000 a software startup would have had to spend $10-20 million building a server farm just to do anything at scale. Now, AWS (Amazon Web Services) provides large storage and computing capacity at a much more affordable price.
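To see what that decay curve looks like, here is a small sketch mirroring the doubling example above, just with halving instead. The starting price and halving period are illustrative assumptions, not measured figures.

```python
# A rough sketch of exponential decay in storage prices. The starting
# cost and halving period below are illustrative assumptions, not data.
def cost_per_gb(year, base_year=2000, base_cost=10.0, halving_period=2):
    """Estimate storage cost per GB, assuming the price halves periodically."""
    return base_cost * 0.5 ** ((year - base_year) / halving_period)

for year in (2000, 2005, 2010, 2015):
    print(f"{year}: ~${cost_per_gb(year):.2f}/GB")
```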

Data storage transformations are directly related to the emergence of cloud computing, a broad term covering a wide range of services. Cloud computing enables convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be provisioned and released with minimal management effort. Long story short, end users can now tap into parts of these bulk resources faster and more easily than ever before, with little interaction with the service provider.
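As a hedged illustration of that on-demand model, here is what provisioning storage can look like with AWS’s Python SDK, boto3. The bucket and object names are hypothetical, and valid AWS credentials are assumed to be configured.

```python
import boto3

# The client picks up credentials from the environment or ~/.aws config.
s3 = boto3.client("s3")

# Provision storage on demand; no server farm required.
# (Outside us-east-1, create_bucket also needs a CreateBucketConfiguration.)
s3.create_bucket(Bucket="my-hypothetical-startup-data")

# Store an object; you pay only for what you actually use.
s3.put_object(
    Bucket="my-hypothetical-startup-data",
    Key="hello.txt",
    Body=b"Hello from the cloud!",
)
```

That is the whole point: a few lines replace what used to require racks of hardware and a team to run them.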

So, what happens when you combine cheaper computing power with cheaper data storage that can aggregate tons of services and data through the cloud? You get big data, which machine learning depends on, which in turn enables deep learning. Ever heard of it? That’s a subject for another article.

Unfortunately, there’s a limit to where Moore’s Law can lead us: a physical one. Intel has put the brakes on new chip launches by stretching the two-year gap between successive generations of chips and new, smaller transistors. With the latest chips already built on a 14-nanometer process, it will be hard to keep the shrinking cycle going. But don’t worry: that doesn’t necessarily mean our devices will stop improving.
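A quick back-of-envelope calculation shows why the shrinking cycle hits a wall. Silicon’s lattice constant is about 0.543 nm, so a 14 nm feature spans only a couple dozen unit cells of the crystal; the snippet below is a rough estimate, not a process spec.

```python
# Back-of-envelope: how many silicon unit cells fit across a 14 nm feature?
feature_size_nm = 14.0   # Intel's smallest process node at the time
si_lattice_nm = 0.543    # silicon's lattice constant, ~0.543 nm

cells_across = feature_size_nm / si_lattice_nm
print(f"~{cells_across:.0f} unit cells across a {feature_size_nm:.0f} nm feature")
```

At roughly 26 unit cells across, there simply isn’t much room left to halve again and again before you are counting individual atoms.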

Although the growth of computing power is starting to slow down, Moore’s Law has already given rise to several new computing and software platforms that are currently under development and will soon be produced at scale, changing the way we interact with our world. Many of these innovations are already part of our lives, such as fintech products and services, edtech platforms and wearables. Others are promising technologies for the near future: virtual reality may become the dominant entertainment platform, augmented reality might disrupt mobile, self-driving cars will change mobility forever, and artificial intelligence will push our exponential curve to a whole new level.

Interested and want to know more about this game-changing environment of new technologies? I strongly recommend checking out Stanford’s new class, The Industrialist’s Dilemma, which discusses the transformation from an industrial economy to a digital one. Stop being just a spectator and dive into the tech rollercoaster. All the cool kids are already doing it, so why don’t you?

