Moore’s Law is dead… Is it really?

Daniel Olmedo
6 min read · Jan 25, 2023


Last September, Nvidia’s CEO Jen-Hsun Huang gave a press conference in response to the wave of criticism over the price spike of Nvidia GPUs, which defied consumers’ expectations that prices would fall after the recent decline in GPU demand. In case you are not familiar with this topic, let’s put it this way:

There is an ongoing global chip shortage which, combined with the enormous demand for GPUs that manufacturers have experienced in recent years, has caused GPU prices to rise exorbitantly.

What has caused such demand for GPUs? Did everyone simply wake up one day and feel a sudden urge to purchase GPUs because they had discovered their inner gamer? Although I am tempted to say yes, that is not the reason. As a matter of fact, the answer has a lot to do with cryptos, specifically Ethereum, and how the hardware used for playing video games is also used for “mining” them.

But in order to understand this connection between GPU demand and cryptos, let’s briefly explain how a blockchain works. A blockchain is a decentralized network whose members (nodes) verify every operation. Every time a transaction is performed inside the network, it is validated by having some of these members, also known as “miners”, solve complex cryptographic puzzles. In return, miners get paid in Ether, Ethereum’s currency, as a reward for solving the transaction “puzzle”.
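The “puzzle” in proof-of-work mining boils down to brute-force hashing: try nonce after nonce until a hash meets a difficulty target. A minimal sketch in Python (heavily simplified — real Ethereum mining used the memory-hard Ethash algorithm, not plain SHA-256, but the brute-force search is the same idea):

```python
import hashlib

def mine(block_data: str, difficulty: int) -> int:
    """Find a nonce whose SHA-256 hash starts with `difficulty` hex zeros.

    Simplified proof-of-work sketch; real Ethereum mining used Ethash,
    but the trial-and-error search shown here is the core idea.
    """
    prefix = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(prefix):
            return nonce
        nonce += 1

# Finding the nonce takes thousands of attempts; checking it takes one hash.
nonce = mine("transaction batch", difficulty=4)
print(nonce)
```

Note the asymmetry: the search is expensive, but any node can verify the winning nonce with a single hash. Raising the difficulty by one hex digit multiplies the expected work by 16, which is exactly the kind of embarrassingly parallel number-crunching GPUs excel at.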

So what do cryptos have to do with GPUs? GPUs are faster than any other piece of consumer hardware at massively parallel numerical operations and, by extension, at cryptographic puzzles. Miners have been purchasing consumer GPUs en masse to build GPU farms and generate profits by mining cryptos. This practice became so popular that it ended up exhausting the supply of GPUs on the market, which, on top of the ongoing chip shortage, caused prices to soar.

In a parallel effort, Ethereum spent years working on a more energy-efficient way to validate transactions, since the Ethereum network consumed as much energy as an entire country. As a result, on September 15th “The Merge” went live, switching Ethereum from proof-of-work to proof-of-stake. GPUs were no longer useful for validating transactions, and gamers, who were expecting prices to drop across the GPU market, did see second-hand GPUs get cheaper. However, prices for new GPUs did not decrease, which resulted in many complaints circulating online.

Going back to Jen-Hsun Huang’s press conference, he responded to criticisms with the following statement:

“A 12-inch [silicon] wafer is a lot more expensive today than it was yesterday, and it’s not a little bit more expensive, it is a ton more expensive,” (…) “Moore’s Law’s dead,” (…) “And the ability for Moore’s Law to deliver twice the performance at the same cost, or at the same performance, half the cost, every year and a half, is over. It’s completely over, and so the idea that a chip is going to go down in cost over time, unfortunately, is a story of the past.”

A bit of history:

If you have never heard of Moore’s Law, let’s go back almost 80 years. When ENIAC, one of the first general-purpose digital computers, was unveiled in 1946, it occupied about 170 square meters and weighed around 30 tons. It used about 18,000 vacuum tubes, each able to represent two states (0 or 1), which by a crude measure gives it 18,000 bits — remember when floppy disks were a thing? A 1.44 MB floppy held roughly 650 times more bits than that.

Gordon Moore, co-founder and former CEO of Intel, made an observation so influential that it became known as a law:

“According to Moore’s Law, the number of transistors on a chip roughly doubles every two years. As a result, the scale gets smaller, and transistor count increases at a regular pace to provide improvements in integrated circuit functionality and performance while decreasing costs.”

[Fun Facts, exactly how small is 22 nanometers — Intel]

Roughly speaking, Moore observed that every two years the industry has managed to manufacture microchips with twice as many transistors at roughly the same cost, halving the cost per transistor, and that this rule would continue to hold for years to come. But according to Nvidia’s CEO, Moore’s Law has hit a wall because we cannot physically fit more transistors into the same square millimeter; so, in his words, Moore’s Law is dead.
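To get a feel for how fast that doubling compounds, here is a back-of-the-envelope projection. The 1971 baseline is the Intel 4004’s commonly cited count of about 2,300 transistors; the two-year doubling period is the classic statement of the law, and both are just illustrative inputs:

```python
def transistors(year: int, base_year: int = 1971, base_count: int = 2_300,
                doubling_years: float = 2.0) -> int:
    """Project transistor count under Moore's Law: double every `doubling_years`."""
    return int(base_count * 2 ** ((year - base_year) / doubling_years))

# Twenty years = ten doublings = a ~1,000x increase over the baseline.
for year in (1971, 1991, 2011, 2021):
    print(year, f"{transistors(year):,}")
```

Ten doublings multiply the count by 1,024, which is why a rule that sounds modest year to year turns thousands of transistors into tens of billions within five decades — and why even a small stretch of the doubling period (say, to 2.5 years) compounds into a large shortfall over time.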

But is this law really dead?

In my opinion, stating that Moore’s Law is “dead” is not entirely accurate, as it must be looked at in context. Indeed, the doubling period for microchip performance has stretched to about 2.5 years; however, it has not reached a dead end. We could agree with Huang’s statement to a certain extent by noting that Moore’s Law is dead only from a physical perspective, given that it is no longer physically possible to keep placing twice as many transistors in the same space.

However, Moore’s Law is still very much alive in the sense that we are achieving the same (if not better) performance through alternative solutions that increase computing power at equal cost, without cramming in more transistors. These solutions include greater hardware reliability, tighter software-hardware integration, better heat dissipation, and higher interconnect bandwidth, among others. The most exciting aspect is that all of these solutions come together in the cloud paradigm, a relatively new scenario that links millions of microchips so they can better perform shared tasks. The mind-blowing part is that, according to “The Cloud Revolution” by Mark Mills, the cloud has not only maintained the pace set out by Moore’s Law but has even exceeded it, delivering comparable advances in less than two years.

How does this new paradigm impact cloud gaming?

What we have observed in the past decade is that services we used to obtain through physical devices, such as iPods, DVDs, and hard drives, have become more cost-effective once moved to the cloud. We have previously talked about how this transition is also happening in gaming, driven by the evolution of GPUs within this new paradigm: GPU manufacturers can take advantage of what the cloud offers to design new cards with better performance. Keep in mind that Nvidia’s CEO has stated that graphics cards are going to get more expensive over time. Therefore, there will come a point where GPU prices get so high that cloud gaming becomes the only reasonable option for end consumers to play video games.

After assessing how the cloud will keep computing costs in check, and, by extension, how cloud gaming will save gamers, there is one last fact I wanted to save for the end of the drizzle. Did you know that there is a new law that takes these variables into consideration and aims to succeed Moore’s Law? This law claims that synergy between hardware, software, and artificial intelligence is the new route to improving the performance of hardware systems. It also claims that graphics processing units (GPUs) are advancing at a much faster rate than traditional central processing units (CPUs). And do you know what this law is called?… Huang’s Law! Curiously, it is named after none other than Nvidia’s CEO, a.k.a. the same guy declaring that Moore’s Law is dead. Interesting, right?

All of the observations we’ve talked about in this drizzle foresee a very exciting future for cloud gaming: the cloud paradigm is setting the perfect breeding ground for hardware performance to skyrocket, and GPUs are improving significantly faster than other processing units in the cloud. As a result, cloud GPUs are, and will remain, far cheaper than consumer GPUs, which gives cloud gaming platforms like Nware a significant head start over newcomers in the market.

Thank you for reading up to this point. I am thrilled to hear your thoughts.

See you on the next drizzle!

Daniel Olmedo — Co-Founder & CEO
Begoña Fernández-Cid — Co-Founder & CMO
