Published in DataSeries

Google demonstrates impractical improvement in quantum error correction — but it does work

Scale would need to be cranked way up to have an impact, however

Google has demonstrated a significant step forward in error correction for quantum computing, although the method described in a paper this week remains some way off practical application.

In October 2019, Google claimed quantum supremacy when its 54-qubit processor Sycamore completed a task in 200 seconds that the search giant said would take a classical computer 10,000 years to finish. The claim was hotly contested by IBM, but that is another story.

A qubit is the quantum equivalent of a conventional computing bit. Each qubit can be 0 or 1, as in classical computing, but can also be in a state where it is both 0 and 1 at the same time. That mixed state is known as a “superposition”. In theory, as you add qubits, the power of your quantum computer grows exponentially, scaling as 2^n, where n is the number of qubits.
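That 2^n scaling is easy to see in a toy calculation: fully describing the state of n qubits takes 2^n complex amplitudes, so each added qubit doubles the storage a classical machine would need to simulate it. The sketch below is purely illustrative and is not Google's code:

```python
def state_vector_size(n_qubits: int) -> int:
    """Number of complex amplitudes needed to fully describe n qubits."""
    return 2 ** n_qubits

for n in (1, 10, 20, 30):
    amps = state_vector_size(n)
    # Assuming 16 bytes per complex amplitude (a common double-precision layout).
    print(f"{n:2d} qubits -> {amps:>13,} amplitudes ({amps * 16 / 1e9:.3f} GB)")
```

At 30 qubits the state vector already needs billions of amplitudes, which is why "in theory" carries so much weight.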

Now, in practical terms, it is difficult to overstate exactly how much heavy lifting the words “in theory” are doing in that last sentence.

Qubits are notoriously unstable, and susceptible to the slightest environmental interference, but understanding how much error that instability introduces is also difficult. Conventional computers are also prone to errors, but account for them by making copies of bits and performing a comparison.
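The classical redundancy mentioned above can be as simple as a repetition code: store each bit several times and recover it by majority vote on readout. A minimal sketch (my own illustration, not any production scheme):

```python
from collections import Counter

def encode(bit: int, copies: int = 3) -> list[int]:
    """Classical repetition code: store the same bit several times."""
    return [bit] * copies

def majority_vote(copies: list[int]) -> int:
    """Recover the bit by comparison: the majority value wins."""
    return Counter(copies).most_common(1)[0][0]

stored = encode(1)            # [1, 1, 1]
stored[0] ^= 1                # noise flips one copy -> [0, 1, 1]
print(majority_vote(stored))  # -> 1: a single flipped copy is corrected
```

Copying a qubit this way is forbidden by quantum mechanics (the no-cloning theorem), which is why quantum error correction needs the entanglement-based trick described next.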

Looking inside a qubit without destroying its state is impossible, as quantum mechanics pioneer Erwin Schrödinger famously illustrated with his thought experiment about assessing the true health of a cat shut inside a box with a randomly triggered, life-threatening quantum event.

Google’s approach to the problem is to create a parallel set of qubits “entangled” with the qubits performing the calculation, exploiting another of quantum mechanics’ strange phenomena.

Although arrays of physical qubits have been used to represent a single “logical qubit” before, this is the first time they have been used to calculate errors. In the Chocolate Factory’s setup, between five and 21 physical qubits represented one logical qubit and, with some post-hoc classical computing, the team found that error rates fell exponentially with each additional physical qubit, according to a paper published in Nature this week. The team was also able to demonstrate that the error suppression remained stable over 50 rounds of correction.
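A hedged toy model of what “fell exponentially” means here: if each enlargement of the logical qubit suppresses the logical error rate by a roughly constant factor, the error shrinks geometrically. The numbers below are invented for illustration and are not the paper's measured values:

```python
def logical_error(base_error: float, suppression: float, extra_steps: int) -> float:
    """Toy model: each extra round of redundancy divides the logical error
    by a constant suppression factor, giving exponential decay."""
    return base_error / suppression ** extra_steps

# Hypothetical numbers purely for illustration.
base = 1e-2  # assumed logical error rate with the smallest code
lam = 3.0    # assumed suppression factor per enlargement of the code
for k in range(4):
    print(f"{k} extra steps -> logical error {logical_error(base, lam, k):.2e}")
```

The practical catch, as the rest of the article explains, is that this decay only kicks in once the underlying physical qubits are good enough, and it takes many physical qubits to buy each factor of suppression.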

So far, so good, but the experiment by Julian Kelly, Google research scientist, and his team was a demonstration of a method that could one day be used to create a good system for error correction in quantum computing. It is not yet an effective system for error correction itself.

One problem is scale, explained Martin Reynolds, Gartner distinguished vice president. The paper suggests a practical quantum computer might need 1,000 to 10,000 error-correction qubits for each logical qubit.

“You can see that the scale isn’t there, but the fact that they’re doing it at all demonstrates that it works,” he told The Register.

Meanwhile, researchers would need to improve the quality of qubit stability to get towards a workable machine.

“They are working on really poor quality qubits. These early qubits just aren’t good enough, they have to get at least 10 times better in terms of their noise and stability, even to do error correction of the kind that we’re going to need. But just to have this piece of the puzzle in place is a really good sign,” Reynolds said.

Kuan Yen Tan, CTO and co-founder of quantum computing firm IQM, told us: “What Google did was to show that this one method of error correction and detection is very suitable for the topology they have in their system. It’s a very important milestone to show that the proof of principle works. Now you just need to scale it up, and scaling is a very big challenge; it’s not trivial: you still need thousands if not millions of qubits to be able to do error correction and detection. That’s still a really huge technological gap that you have to overcome.”

But these are not the only challenges that remain. Google’s approach to error correction uses classical computers to spot likely errors using data from the physical qubits after its quantum processor has run the algorithms.

The next step is doing error correction on the fly. Kuan said Google’s experiment relied on a set of classical controls when detecting errors, which takes “a really, really long time.”

“Then you have to go back to the qubit and say, OK, now we have to correct the error. By that time, the error is something else already. I believe that is the bottleneck at the moment for the experiment,” he said.

Still, Google’s authors argue, in a peer-reviewed study, that their results suggest that quantum error correction can be successful in keeping errors under control. Although the approach is not yet at the threshold of error rates needed to realise the potential of quantum computing, the study indicates that the architecture of Google’s Sycamore processor may be close to achieving this threshold, the researchers said. ®

This story was written by Lindsay Clark and originally published by The Register. Kuan Yen Tan, CTO of IQM, an OpenOcean portfolio company, provided comments.
