Is Quantum-Accelerated Computation the Next Big Disruption?

Looking at graphics processing to understand the future of quantum computation.

Markus Düttmann
The Startup
6 min read · Aug 21, 2020


Fig. 1: Google’s Sycamore processor (Source: Nature)

Many people are looking to quantum computing as the next revolutionary technology. According to an analysis by Nature, more than $450 million of private funding was poured into the quantum industry in 2017 and 2018 alone. Even the classical finance community is starting to smell an opportunity. Xavier Rolet, the former CEO of the London Stock Exchange and a well-respected industry veteran, told The Quantum Daily that he considers such investments a solid bet on the future and believes quantum computers will bring transformational change.

The exciting topic has also made its way to a more mainstream audience. Even the tabloids have been writing extensively, and with very catchy headlines, about a Nature article published in 2019, in which researchers at Google announced that they had achieved what is called quantum supremacy. On their quantum processor named Sycamore (see Fig. 1), they ran a calculation in 200 seconds that, they claim, would have taken the world’s most powerful (classical) supercomputer 10,000 years. It has to be added that the setup was very specific and the results have been heavily disputed by competitor IBM. But expectations for the field have certainly started to skyrocket.

As smart and quirky physicists move into the field of quantum computation, build hyped startups, and secure huge funding, this is a fascinating space to follow. Will we get the chance to see disruptive innovation live and in action?

Disruptive innovation of the GPU

Clayton Christensen’s great book The Innovator’s Dilemma has become standard literature for business students and startup founders all over the world. He describes how disruptive technologies (a now widely used term he coined in 1995) have impacted the development of modern computer architectures. The success story of the graphics processing unit (GPU) follows Christensen’s ideas exactly and may, in some sense, be a good analogy for what we are now seeing in the field of quantum computing.

I first heard of GPUs in the schoolyard. The main fight between us teenagers was between believers in 3dfx’s Voodoo and Nvidia’s GeForce graphics cards. Those cards enabled us to play what we thought were extremely realistic games like Quake (follow this link to play the browser version and you will understand our excitement). In hindsight, it is quite obvious that my schoolmates’ and my gaming interests were a niche market and not considered good business back in the day, especially compared to highly profitable workstations.

Eventually, however, the gaming market took off, growing to $152 billion in 2019. As the market grew, so did the research budgets for GPUs. They became the de facto standard for graphics everywhere, and people started to realize the parallel computing power at their fingertips. In 2007, Nvidia introduced its CUDA platform, giving developers easy access to the GPU. Being very useful for heavy numerical work, CUDA became a major hit in the physics research community and was widely adopted.
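Just to give a flavor of how low the barrier to GPU computing has become, here is a minimal sketch in Python using the CuPy library (my choice for illustration; CUDA itself exposes a C/C++ interface). Assuming CuPy is installed and a CUDA-capable GPU is present, offloading a matrix multiplication looks almost identical to plain NumPy:

```python
import numpy as np
import cupy as cp  # assumption: CuPy installed with a CUDA-capable GPU

n = 4096
a_cpu = np.random.random((n, n)).astype(np.float32)
b_cpu = np.random.random((n, n)).astype(np.float32)

# Copy the matrices into GPU memory.
a_gpu = cp.asarray(a_cpu)
b_gpu = cp.asarray(b_cpu)

# The multiplication runs in parallel across thousands of GPU cores.
c_gpu = a_gpu @ b_gpu
cp.cuda.Device(0).synchronize()  # wait for the GPU kernel to finish

# Copy the result back to the host for CPU-side processing.
c_cpu = cp.asnumpy(c_gpu)
```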

Graphics processing requires performing enormous numbers of floating-point operations, or FLOPs. You will probably have heard the term when people brag about their supercomputers, whose performance is measured in FLOPS (floating-point operations per second). It is no coincidence that in 2010 the Pentagon connected 1,760 PlayStation 3s into a supercomputer called the Condor Cluster. Even today, the GPU continues to be one of the main drivers of supercomputing performance.
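To make the bookkeeping concrete: multiplying two n-by-n matrices takes roughly 2n³ FLOPs, since each of the n² output entries needs about n multiplications and n additions. A back-of-the-envelope sketch in Python, timing NumPy on whatever hardware is at hand:

```python
import time
import numpy as np

n = 2048
a = np.random.random((n, n))
b = np.random.random((n, n))

start = time.perf_counter()
c = a @ b
elapsed = time.perf_counter() - start

flops = 2 * n**3  # ~n multiplications plus ~n additions per output entry
print(f"{flops / elapsed / 1e9:.1f} GFLOPS")  # achieved throughput
```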

Like most computing power, GPUs have been moving away from our own PCs and gaming consoles into the cloud; soon there will be no need for us to own them at all. In November 2019, for instance, Google Stadia started to tackle one of the last frontiers of heavy computation at home, aiming to make gaming consoles obsolete while delivering the same level of breathtaking graphics. After trying out the platform on my laptop at home, I have to admit that we are not quite there yet, but the future looks bright. I am especially excited to see Microsoft’s or Sony’s answer to this move. Furthermore, we are seeing new use cases ranging from deep learning and AI to GPU databases. The graphics processing unit started as a hacked circuit in arcade games and has become an integral part of our everyday infrastructure.

Disruptive innovation of the QPU

Fig. 2: IBM quantum computer using superconducting qubits

A quantum computer is built from several components, such as the control electronics and the quantum processing unit (QPU). The QPU uses two-state systems called quantum bits (qubits) to store quantum information.
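To make “two-state system” concrete, here is a tiny NumPy sketch (a classical simulation, of course, not real quantum hardware): a qubit’s state is a vector of two complex amplitudes, and a gate such as the Hadamard is a unitary matrix acting on that vector.

```python
import numpy as np

# A qubit state is a pair of complex amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1. Start in the basis state |0>.
state = np.array([1.0, 0.0], dtype=complex)

# The Hadamard gate puts the qubit into an equal superposition of |0> and |1>.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ state
probabilities = np.abs(state) ** 2
print(probabilities)  # [0.5 0.5]: measuring yields 0 or 1 with equal chance
```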

There are various candidates for such systems, like trapped ions, quantum dots, or even photons, each with its own pros and cons. In Fig. 2 you see a quantum computer that uses superconducting circuits as qubits. To reduce so-called quantum decoherence, such a device is usually cooled down to temperatures below 1 K. While the technology is advancing considerably and we will definitely see improvements, quantum computers remain fickle, and it is unlikely that we will ever have a small quantum computer at home.

It is also evident that a lot of problems can already be solved efficiently by standard programming approaches. There is absolutely no need for us to give up all the tools we have developed over the last century that have proven successful. It is fair to say that quantum computers are overkill for quite a lot of tasks. However, these machines may be very useful for computation-heavy problems such as searching unsorted databases (Grover’s algorithm) or computing Fourier transforms, as sketched below.
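To illustrate the Fourier example: the quantum Fourier transform (QFT) on n qubits acts on all 2^n amplitudes of the register at once, and a quantum circuit implements it with only O(n²) gates. A small NumPy sketch, assuming the common sign convention in which the QFT coincides with NumPy’s inverse FFT in its orthonormal normalization, verifies that the QFT matrix is just a normalized discrete Fourier transform:

```python
import numpy as np

n_qubits = 3
N = 2 ** n_qubits  # dimension of the n-qubit state space

# QFT matrix: F[k, j] = exp(2*pi*i*j*k / N) / sqrt(N).
idx = np.arange(N)
qft = np.exp(2j * np.pi * np.outer(idx, idx) / N) / np.sqrt(N)

# Apply it to a random, normalized quantum state...
state = np.random.random(N) + 1j * np.random.random(N)
state /= np.linalg.norm(state)

# ...and compare with the classical FFT (orthonormal convention).
print(np.allclose(qft @ state, np.fft.ifft(state, norm="ortho")))  # True
```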

What we will most certainly see is a cloud setup in which the different components each play to their strengths: GPUs will be used when we need to run simple calculations concurrently or render images, CPUs will handle sequential tasks, and QPUs will take care of specific calculations such as the above-mentioned Fourier transforms. I assume that even the logic of choosing the best tool will be abstracted away and happen in the background, keeping developers unaware of how and where their code is running.
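As a thought experiment, the hidden routing logic could look something like the following sketch. Every name in it (Task, dispatch, the backend stubs) is invented purely for illustration; no such library exists as written:

```python
# Hypothetical sketch of a cloud dispatch layer that routes each task to
# the processor best suited for it. All names are invented for illustration.
from dataclasses import dataclass
from typing import Any

@dataclass
class Task:
    kind: str     # e.g. "render", "sequential", "fourier_transform"
    payload: Any

# Stub backends standing in for real GPU/CPU/QPU execution.
def run_on_gpu(task: Task) -> str: return f"GPU rendered {task.kind}"
def run_on_cpu(task: Task) -> str: return f"CPU solved {task.kind}"
def run_on_qpu(task: Task) -> str: return f"QPU computed {task.kind}"

# The routing table that developers would never have to see.
BACKENDS = {
    "render": run_on_gpu,
    "sequential": run_on_cpu,
    "fourier_transform": run_on_qpu,
}

def dispatch(task: Task) -> str:
    # Fall back to the CPU for anything without a specialized backend.
    return BACKENDS.get(task.kind, run_on_cpu)(task)

print(dispatch(Task("fourier_transform", payload=None)))  # "QPU computed ..."
```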

A note on quantum supremacy

One often hears the term “quantum supremacy.” It refers to the speed advantage a quantum computer would have over any classical one on some task. Demonstrating this advantage in a real setup is one of the main goals of companies and researchers in the field and would be a major scientific breakthrough. On a theoretical level, though, it is still unclear whether such an advantage even exists.

Note that the benefits of the above-described cloud setup, which can leverage all kinds of different processors, do not depend on the existence of quantum supremacy. Such a setup lets you select the best tool for the job from your toolbox, which may be a GPU or even a QPU, depending on what solves your problem most efficiently. Even in a world where a quantum computer had no theoretical advantage over an idealized classical one, it might still turn out to be the most efficient problem-solver in real life.

Quantum computing is a very hot topic these days. At times, it may even be overhyped, with startups overselling their abilities and a press hungry for news on breakthrough technologies. Nonetheless, I strongly believe in the transformational power of quantum computing, and it is amazing to see a truly disruptive industry growing and changing the world in real time, with or without quantum supremacy.
