Quantum Supremacy — Google Sycamore Processor

Jonathan Hui
Oct 24 · 17 min read
Sycamore Processor (Source)

Google has made a major claim of quantum supremacy with its new quantum computer built on the Sycamore processor. In this article, we will look deeper into quantum supremacy, its significance, and the Sycamore processor itself.

The holy grail of quantum computing is to demonstrate speed supremacy over the classical computer, the computer we have today. The complexity of breaking the RSA cryptographic algorithm was thought to grow exponentially with the length of the key. This algorithm has safeguarded our secure Internet transmissions for decades. RSA keys were once 1024 bits long but were extended to 2048 bits to neutralize faster computers. Since the complexity grows exponentially with every bit, we should be safe for a long while. But in 1994, Shor demonstrated a quantum algorithm that cracks RSA in polynomial time. Nevertheless, this remains theoretical since we don’t yet have a quantum computer that can run it at scale. This is why the Sycamore processor generates so much attention: Google claims it achieves quantum supremacy with its new processor. Fortunately, it has only 53 qubits, while Shor’s algorithm may require millions of qubits to crack a 2048-bit key!

For those new to quantum computing, let us start with a quick introduction. For those who don’t need it, feel free to skip the optional sections.

Why is Quantum Computer supreme? (Optional)

A “bit” in a classical computer holds either 0 or 1. Many magazines attribute the quantum breakthrough to the idea that a qubit (a quantum bit) can hold more information than a binary bit. This is very misleading.

Let’s do some simple math. Say a qubit could hold a billion times more information than a binary bit; a 53-qubit machine would then be equivalent to a 6.6 GB classical computer. I purchased a 16 GB laptop 5 years ago! So this cannot be the source of quantum supremacy. What is the real reason?

Let’s imagine we have a 3-bit classical computer. A variable, say “a”, can be in one of the 8 states. We can reformulate the assignment a=6 in an odd way below.

In this equation, we have 2³ coefficients. But in our classical computer, exactly one coefficient is 1 and all the others are 0.
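This one-hot picture can be sketched numerically. The snippet below (a minimal illustration, not from the Google paper) encodes a classical 3-bit register holding a = 6 as a length-2³ vector with exactly one coefficient set to 1:

```python
import numpy as np

# A classical 3-bit register in state a = 6 is a one-hot vector of length 2^3:
# exactly one coefficient is 1, all the others are 0.
n_bits = 3
a = 6
state = np.zeros(2 ** n_bits)
state[a] = 1.0

print(state)  # [0. 0. 0. 0. 0. 0. 1. 0.]
```

A quantum state generalizes this by allowing every entry of the vector to be non-zero.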

A classical computer is a single-issue voter. It takes only one position at a time.

Quantum mechanics claims that a system can be in a superposition state. If you have heard of the Schrödinger’s cat thought experiment, it illustrates that a system can be in different states at the same time: the cat can be alive and dead simultaneously. I don’t like to explain quantum mechanics this way. But mathematically, for a quantum computer, the coefficients of a 3-qubit system do not need to have exactly one value of 1 with all the others zero.

Indeed, all 8 coefficients can take any real, or even complex, values (subject to normalization). A classical computer is like a light switch: the light is either on or off, and the state is always exactly one of the eight states. A quantum system does not need to be in a particular state until we measure it (observe its value).

Nature only picks a side on its state when we measure it.

So a superposition can be represented mathematically with multiple non-zero coefficients. This is a rule of nature. It looks bizarre from our everyday experience, but it is true: many quantum effects are regularly masked out (they cancel out) at the human scale.

In addition, don’t underestimate the power of the exponential! A 64-qubit system contains 2⁶⁴ coefficients. The biggest US lottery jackpot was 1.6 billion dollars; 2⁶⁴ dollars would be like winning that jackpot about 11 billion times. With 64 qubits, we can encode and manipulate a lot of data using these billion-billion coefficients. But quantum computers do not have traditional arithmetic or logical operators. In fact, there is no “if” branch or “while” loop structure. That is why you cannot recompile a C++ program and run it on a quantum computer.

We apply operations to manipulate the superposition before making a final measurement. So what operators do quantum computers have? Quantum computers apply quantum gates to manipulate the superposition state. Conceptually, a quantum gate acts like a matrix multiplying a vector (the superposition state) in linear algebra. This is all high-school math (at least in the beginning)!
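As a quick sketch of this matrix picture (using NumPy and the standard single-qubit Hadamard gate as the example), applying a gate is just a matrix-vector product:

```python
import numpy as np

# A quantum gate is a unitary matrix; applying it is matrix multiplication.
# The Hadamard gate turns |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)

ket0 = np.array([1.0, 0.0])   # the state |0>
state = H @ ket0

print(state)  # [0.70710678 0.70710678]
```

Both coefficients are now 1/√2, so measuring this qubit yields 0 or 1 with equal probability.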

In quantum mechanics, the Schrödinger equation describes how states evolve. It is a linear system. Nature is governed by quantum mechanics. That is why Richard Feynman asked

What kind of computer are we going to use to simulate physics?

For Feynman, quantum computers are the most natural way. With careful design of the basis of a system, many non-physics problems can also be solved through linear algebra. In a classical computer, this vector contains exactly one value of 1 and zeros otherwise, i.e. it is in only one state at a time. This limits the expressiveness of the tool. With a quantum computer, we can manipulate billions of coefficients simultaneously with a single quantum operation.

Quantum Entanglement (optional)

But this paints a pretty naive picture. The real source of quantum supremacy remains a mystery. While we develop new quantum algorithms, we sometimes find corresponding classical algorithms that match their performance. That is one reason why demonstrating quantum supremacy is very hard, and why there is so much attention on Google’s quantum supremacy claim. What we may know is that a quantum algorithm cannot beat the classical method if it does not utilize quantum entanglement. The parallel manipulation of a superposition is the ingredient, while quantum entanglement is the secret sauce. However, we don’t know the exact reason.

If two apples cost $2, each apple costs roughly $1. In classical systems, we believe that if we know every detail of the state of a system, we know every state of its subcomponents. That is not true in nature. In nature, we can prepare states that are correlated but not composed of specific individual states. For example, in a system composed of 2 qubits, we can use a 2-qubit quantum gate to create states in which both qubits will always have the same measured value (or states in which they are always opposite). This is as easy as creating |00⟩ or |11⟩. However, given two qubits prepared separately with any superposition values, we cannot simply concatenate (put) the two qubits together to reproduce the entangled state.

In a nutshell, we can prepare a composite superposition which will be measured as either |00⟩ or |11⟩. But we cannot simply concatenate two separate qubits together without applying a 2-qubit operator.
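A minimal numerical sketch of this (not from the paper; the standard Hadamard-plus-CNOT construction of a Bell state):

```python
import numpy as np

# Build the entangled Bell state (|00> + |11>)/sqrt(2) by applying a
# Hadamard on qubit 0 of |00>, followed by a CNOT (a 2-qubit gate).
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

ket00 = np.array([1.0, 0, 0, 0])
bell = CNOT @ (np.kron(H, I) @ ket00)

print(bell)  # amplitudes ~ [0.707, 0, 0, 0.707]

# No concatenation (tensor product) of two separate qubits can reproduce
# this vector: kron([a, b], [c, d]) = [ac, ad, bc, bd], and requiring
# ac = bd = 1/sqrt(2) with ad = bc = 0 has no solution.
```

Measuring this state always yields 00 or 11, each with probability 1/2, and never 01 or 10.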

From one perspective, we create a correlation pattern for the data. In a classical computer, we may have to encode such information as two separate scenarios: “00” or “11”. For a k-qubit system, such a correlation pattern can be extremely complex: different combinations of qubits can be the same or the opposite. And yet, we don’t know enough about how to rephrase an algorithm as an entanglement problem.

Theoretical Challenges (optional)

In general, quantum computing is not universally superior to classical computing. Both will co-exist. There will be plenty of algorithms with no compelling quantum advantage, where neither side can escape the curse of the exponential, and others that are simply not cost-effective.

There are a couple of theoretical challenges in quantum computing. First, the complexity of loading values into the coefficients grows exponentially. Unless we identify patterns in the data that let us load them in polynomial time, we are still under the curse of the exponential. Finding even one quantum algorithm that no classical algorithm can match is harder still. Second, we cannot directly probe the coefficient values. We find the state by measurement, but the measurement collapses the superposition. When a superposition is measured, nature picks one of the measured states with probability equal to the squared magnitude of the corresponding coefficient.

For example, if the wanted answer is |010⟩, we manipulate the system to push the coefficient for |010⟩ very high. We repeat the calculation many times and sample the solution. We should then find that most measurements land on |010⟩.
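The sampling idea can be sketched like this (an illustrative toy, with made-up amplitudes; the Born rule gives the measurement probabilities):

```python
import numpy as np

rng = np.random.default_rng(0)

# A 3-qubit superposition whose amplitude on |010> (index 2) is pushed high.
amps = np.array([0.1, 0.1, 0.95, 0.1, 0.1, 0.1, 0.1, 0.1], dtype=complex)
amps /= np.linalg.norm(amps)           # normalize so probabilities sum to 1

probs = np.abs(amps) ** 2              # Born rule: P(x) = |amplitude|^2
samples = rng.choice(8, size=10_000, p=probs.real)

# Most of the 10,000 simulated measurements land on state 2, i.e. |010>.
print(np.bincount(samples, minlength=8))
```

Repeating the circuit and histogramming outcomes is exactly how answers are read out in practice.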

There are other possibilities for finding a solution, such as finding patterns in the coefficients. But the constraints above make developing supreme quantum algorithms very hard and non-obvious. In fact, even after 30 years of research, we don’t have a large zoo of quantum algorithms. But we may only need a few of them to solve some critical problems.

Google Sycamore Processor

Now we are ready to talk about the Google Sycamore processor. It is a 53-qubit processor (one of its 54 qubits is not working). The design is based on superconducting circuits. The challenge of quantum supremacy is to solve a problem far more efficiently with a quantum computer than with classical computers. Typically, a supreme quantum algorithm would break the curse of exponential complexity down to polynomial complexity while the classical algorithm cannot. In the Google paper, the processor produces a sequence of pseudo-randomly generated bitstrings in 200 seconds, while it is estimated to take 10,000 years to simulate the same result with a classical computer. And not just a typical classical computer, but the current fastest supercomputer. The current IBM quantum computer (as of 10/2019) also has 53 qubits, and other quantum processors have more. So there are two possible reasons for the Google claim:

  • the choice of the algorithm in demonstrating the supremacy or
  • the improvement in the quantum gate fidelity.

In fact, IBM has responded with the claim that the chosen task can be solved by an ideal simulation on classical computers in only 2.5 days. The rewards and the competition are fierce, and so are the counter-arguments. Google’s John Martinis has responded with

We’re looking forward to when people actually run the idea on Summit and check it and check our data because that’s part of the scientific process — not just proposing it but actually running it and checking it.

Summit is currently the fastest supercomputer (as of 2019), located at the U.S. Oak Ridge National Laboratory.

Source

But don’t let this spoil the fun. In particular, the improvement in 2-qubit gate fidelity seems to be a critical reason for this progress. Therefore, I will focus on Google’s quantum processor here.

Recall that Shor’s algorithm may require millions of qubits to hack a 2048-bit key. Some people say it could be done with thousands of qubits. The main discrepancy is related to fault tolerance, a topic we rarely talk about in engineering school anymore.

But matter interacts with its environment.

The quantum information in qubits is very delicate. Once we start a quantum program, we cannot pause it and take a coffee break: the quantum information degrades continuously. There are two major designs for quantum computers: superconducting circuits and trapped ions. Both are susceptible to degradation, but superconducting circuits seem to be worse, although they usually have faster gates. In current superconducting circuits, the coherence of the qubits can typically be maintained only for tens to hundreds of μs; after that, the quantum information degrades into noise.

Due to the random nature of quantum mechanics, the topic of fault tolerance comes back. To improve fidelity, we can build fault-tolerance circuitry to correct errors. This is why it is estimated to take millions of qubits to break the current RSA keys. But 53 qubits are barely enough for anything, so the improvement was likely in gate fidelity instead.

A superconducting circuit builds artificial quantum systems in the form of circuitry, so no two qubits are identical. Quantum computers often require daily recalibration, and calibration is particularly important since we are not dealing with a digital system anymore. Google introduces a method to calibrate and benchmark the processor using cross-entropy benchmarking. We will discuss that later.

Qubits

The basic building blocks for many electrical circuits include capacitors and inductors. Below is a resonator using a capacitor and an inductor.

Source

The qubit in a superconducting quantum computer is actually a resonator circuit with a capacitor and an inductor. But we need two quantum states that are far apart on the quantum scale and can be manipulated easily. The Google Sycamore processor utilizes the two lowest quantum states of the resonator circuit.

Source

But we don’t want the energy differences between states to be uniform. Otherwise, the energy required to raise the state from |0⟩ to |1⟩ could mistakenly raise a state from |1⟩ to |2⟩ as well. The desired nonlinear energy levels are created by using a Josephson junction as the inductor.

A Josephson junction is made of a nonsuperconducting material between two layers of superconducting electrodes. A Cooper pair of electrons can tunnel through the nonsuperconducting barrier from one superconductor to another.

Source: IBM

Energy dissipation degrades quantum information. That is why the chip is cooled down close to absolute zero (20 mK) in a dilution refrigerator. This creates Cooper pairs in the superconductor that have no resistance.

Source

It also reduces ambient thermal energy so there is not enough energy for the qubit to jump to excited states. Don’t worry too much about Cooper pairs and Josephson junctions; scientists received Nobel prizes for them. Below is an old 9-qubit processor.

A 9-qubit processor. Source (UCSB & Google)

The red cylinder below is an opened dilution refrigerator chamber. Inside are stages that cool the quantum processor at the bottom, as well as cables to control, amplify, and read out the qubits.

Source: Reuters

The quantum computer is huge but the qubits are small. This is the packaged Sycamore processor that is shielded from the electromagnetic environment.

Source

Superconducting quantum computers suffer from a problem that is more serious than in trapped-ion designs. In classical computers, we can load any RAM location into a CPU register, and we can apply operations to any pair of registers, op(r₁, r₂). But in superconducting quantum computers, operators can be applied to neighboring qubits only. So op(q₁, q₂) works in the Sycamore processor only if qubit q₂ is one of the four neighbors of q₁ in the rectangular lattice. The blue rectangles below are the couplers that apply 2-qubit operations between two qubits.
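A hypothetical sketch of this constraint (the grid-coordinate scheme here is illustrative, not Sycamore’s actual wiring): a 2-qubit gate is only legal between nearest neighbors on the lattice.

```python
# Hypothetical legality check for a 2-qubit gate on a rectangular lattice:
# op(q1, q2) is allowed only when the qubits are nearest neighbors.
def is_neighbor(q1, q2):
    (r1, c1), (r2, c2) = q1, q2
    return abs(r1 - r2) + abs(c1 - c2) == 1   # Manhattan distance of 1

print(is_neighbor((0, 0), (0, 1)))  # True:  adjacent on the grid
print(is_neighbor((0, 0), (2, 2)))  # False: a 2-qubit gate cannot be applied
```

To operate on distant qubits, a real compiler must insert chains of swap operations, which adds depth and error.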

Source

Both the IBM and Google quantum processors use transmon qubits, which are closely related to charge qubits. The transmon prolongs T₂ dephasing (the degradation of the superposition); in one older benchmark, it improves from 1–2 μs for a charge qubit to about 100 μs. A charge qubit has basis states that represent the presence or absence of excess Cooper pairs on the island (shown with the dotted line below). The state is determined by the number of Cooper pairs that have tunneled across the junction.

Source

A single-qubit quantum gate in the Sycamore processor is executed by driving 25-ns microwave pulses resonant with the qubit frequency while the qubit–qubit coupling is turned off. The two-qubit entangling gates are done by bringing neighboring qubits on-resonance and turning on a 20-MHz coupling for 12 ns, which allows the qubits to swap excitations. In superconducting quantum computers, quantum gates are implemented with microwave pulses of different frequencies and durations.

Task

Whoever breaks the current RSA encryption first can surely claim quantum supremacy. But we are still far away from that, and the choice of task for demonstrating supremacy remains debatable. Previously, supremacy could only be argued with toy algorithms using an oracle. This is quite unsatisfactory because these oracles usually show no value in solving real problems. But even for those problems, no quantum computer had shown supremacy before. Regardless of other claims, Google’s processor is a significant milestone because it demonstrates a problem with some real value. Whether classical computers would take 10,000 years or 2.5 days for the pseudo-random generator, this kind of speed improvement has rarely, if ever, been demonstrated with a general-purpose quantum computer.

In practice, random generators are usually pseudo-random generators. It turns out that true random number generation is hard, so we usually approximate the process. But random generators are important for fields like cryptography, gambling, and the simulation of physical systems. Nazi Germany’s Enigma machine was cracked partly because messages often ended with “Heil Hitler”, a pattern that could be exploited.

The task chosen by Google’s team is a pseudo-random quantum circuit. It implements a quantum circuit of depth 20 with 430 2-qubit gates and 1,113 1-qubit gates. The circuit entangles a set of qubits by repeated application of the quantum gates. Each run of this random quantum circuit produces a bitstring like 0100110. Google repeats the run a million times to produce the random bitstrings. They are pseudo-random because quantum interference makes certain bitstrings more likely, the same principle behind the double-slit experiment.
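To make the idea concrete, here is a toy statevector simulation of random-circuit sampling (my own illustrative sketch: the gate set, depth, and 4-qubit size are nothing like Sycamore’s actual circuit):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 4                                  # toy size; Sycamore uses 53 qubits

def random_unitary(dim):
    # Random unitary via QR decomposition of a complex Gaussian matrix.
    z = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
    q, r = np.linalg.qr(z)
    d = np.diagonal(r)
    return q * (d / np.abs(d))         # fix column phases so q is uniform

# Start in |0...0> and apply a few layers of random gates to the register.
state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0
for _ in range(5):                     # "depth" of the toy circuit
    state = random_unitary(2 ** n) @ state

# Sample bitstrings; interference makes some strings more likely than others.
probs = np.abs(state) ** 2
samples = rng.choice(2 ** n, size=10, p=probs / probs.sum())
print([format(s, f"0{n}b") for s in samples])
```

The resulting distribution over bitstrings is spiky rather than uniform, and reproducing it classically requires tracking the full state vector.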

Source

This algorithm was particularly chosen because the randomness inherent in quantum mechanics is extremely hard for classical computers to model. The chaotic behavior leaves no data patterns for a classical algorithm to exploit; instead, it needs to track an exponential number of state-vector amplitudes. In short, Google picked a specific fight that it is most likely to win. However, the circuit is unlikely to achieve a performance-to-cost ratio that makes it practical for commercial purposes.

To create a classical generator with the same probability distribution over output bitstrings (i.e. the same randomness/pattern), it usually requires a direct numerical simulation of the quantum circuit. But the simulation complexity grows exponentially with the number of qubits and the depth of the circuit. Classical computers can simulate a generator circuit with about 40 qubits; beyond 50 qubits, it really pushes the limit of classical computers because of the exponential number of state-vector amplitudes that must be tracked. IBM claims that if more advanced optimizations are used, with disk space as a fallback for RAM, classical computers could theoretically solve it in 2.5 days. With this task, Google pushes the limit on the number of states a classical computer must track in order to prove quantum supremacy. This is the strategy of identifying problems where classical algorithms cannot escape the curse of the exponential. But to achieve supremacy, the gate error rate must be low, since accumulated errors are fatal in quantum computing.
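The exponential blow-up is easy to quantify: a full n-qubit state vector holds 2ⁿ complex amplitudes, at 16 bytes each for double precision. A quick back-of-the-envelope check:

```python
# Memory needed to store a full n-qubit state vector,
# at 16 bytes per complex amplitude (two 64-bit floats).
def statevector_bytes(n_qubits):
    return (2 ** n_qubits) * 16

print(statevector_bytes(40) / 2 ** 40)  # 16.0  -> ~16 TiB for 40 qubits
print(statevector_bytes(53) / 2 ** 50)  # 128.0 -> ~128 PiB for 53 qubits
```

This is why full simulation stalls in the mid-40s of qubits, and why IBM’s counter-proposal leans on disk storage.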

The gate sequence for the Google pseudo-random quantum circuit generator is shown below:

Source

In the experiment, the Jülich supercomputer (with 100,000 cores and 250 terabytes of memory) was used to fully simulate the quantum circuits up to 43 qubits. Beyond that, it runs out of memory to track all the state-vector amplitudes, so the classical computation cost is extrapolated using the Summit supercomputer (currently the most powerful). In the red zone below, a classical computer cannot perform the full simulation effectively anymore; portions of the quantum circuit are run, and the cost of simulating the whole circuit classically is extrapolated.

Source

Below are the circuit-depth and qubit-count combinations at which quantum supremacy is achieved.

Source

Cross-entropy benchmarking

Calibration is a critical part of operating a quantum computer. Cross-entropy benchmarking is a metric that relates the probability distribution of the bitstrings produced by the quantum circuit to the theoretical distribution computed by simulation on a classical computer. We should not worry too much about the definition below.
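For reference, the linear cross-entropy fidelity has the following form (as I read the Google paper; here n is the number of qubits, the xᵢ are the measured bitstrings, and P(xᵢ) is the ideal simulated probability of xᵢ):

```latex
\mathcal{F}_{\text{XEB}} = 2^{n}\,\big\langle P(x_i) \big\rangle_i - 1
```

Intuitively, if the hardware samples the same spiky distribution as the ideal circuit, the measured bitstrings land disproportionately on high-probability strings and F approaches 1; a noisy device samples uniformly and F approaches 0.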

A value of 1 indicates an error-free circuit, while 0 means the output is indistinguishable from noise. This metric can be used to calibrate and to benchmark a quantum computer. In the quantum supremacy experiment, the goal is to achieve as high an F value as possible for a very deep circuit with many qubits.

As shown in the left diagram below, the F value drops in the benchmarking as the number of qubits increases. On the right, the simulation time on classical computers is estimated as the depth of the circuit increases. With a depth of 20, Google estimates that it would take 10,000 years.

Source

On each qubit, the benchmarking applies a variable number of 1-qubit quantum gates to measure F. In addition, it executes 1-qubit gates on all qubits simultaneously. The error difference between these two experiments is small, indicating that the Google processor has low microwave crosstalk (a measure of the negative impact of one operation on other qubits).

In the calibration, each qubit is biased to a slightly different frequency. For example, the diagram below shows the different readout frequencies (used to read/measure the value of a qubit) for different qubits.

Source

Here is a direct quote from Google explaining their achievement.

The success of the quantum supremacy experiment was due to our improved two-qubit gates with enhanced parallelism that reliably achieve record performance, even when operating many gates simultaneously. We achieved this performance using a new type of control knob that is able to turn off interactions between neighboring qubits. This greatly reduces the errors in such a multi-connected qubit system. We made further performance gains by optimizing the chip design to lower crosstalk, and by developing new control calibrations that avoid qubit defects.

In short, by reducing gate errors and crosstalk in parallel gate operations, Google was able to build a random quantum circuit that classical computers cannot simulate.

As a reference, this is the error rate for different operations.

Source

Thoughts

To move forward, we need to scale the number of qubits into the millions and improve the fidelity of the qubits and the quantum gates. This will be extremely challenging. Developing quantum algorithms is hard, and writing quantum code is tough. It is a totally different paradigm, and the software ecosystem is in its infancy.

But the rewards can be amazing. Quantum systems are too hard to simulate classically; this is the vision of Feynman. The Google Sycamore processor focuses on fidelity, and the task demonstrating quantum supremacy was carefully selected by Google. We will not see significant applications from this task itself, as the performance-to-cost ratio cannot justify such a deployment. The interesting applications will be in quantum simulation of physical and molecular interactions, and those will take years, if not decades, to arrive. Solving optimization problems with quantum computing is also heavily studied, and specialized quantum computers are being built to address them. And AI mostly involves optimization problems.

So is Google’s claim significant? We knew quantum computing could be supreme for some problems, but no one had demonstrated it with a general quantum computer until the Sycamore processor’s first demonstration. The choice of problem may not seem satisfactory from some perspectives, compared with what classical computers can solve. But every parent is amazed at their baby’s first word or first step. Many companies have made important milestones in quantum computing, and this is definitely one of them.

We have a couple of articles detailing quantum computing as well as the superconducting quantum computer. Feel free to check them out.

If you want more technical details, the supplementary information of the research paper in the reference section is definitely a good start.

Reference and Credits

Quantum supremacy using a programmable superconducting processor

Quantum supremacy using a programmable superconducting processor (Supplementary information)

Characterizing Quantum Supremacy in Near-Term Devices

On “Quantum Supremacy”
