Is quantum computing real?

Adrian Fern
Dec 21, 2018

It’s all just ones and zeros, isn’t it?


Technology’s rate of change has increased throughout my career, and I’ve often wondered, “What if I wake up tomorrow and I can’t fathom the next big thing?” and worried I’d be deprecated overnight. Luckily for me, I’ve always been able to rationalise this anxiety by reminding myself that, under the covers, it’s all just binary ones and zeros… Recently, however, something’s been putting me on my guard: quantum computing.

The concept of using quantum mechanics for computation has been around for many years, and the term ‘quantum computer’ was first coined by Richard Feynman back in 1981. A blueprint describing how a quantum computer might work was created by David Deutsch in 1985 and, nearly ten years later in 1994, Bell Labs mathematician Peter Shor created an algorithm that could use a quantum computer to break common forms of encryption. Unfortunately, there was nothing to run the algorithm on, and there was plenty else going on for us to take notice of, such as Tim Berners-Lee forming the World Wide Web Consortium (W3C).

In 2007 the Canadian startup D-Wave unveiled a quantum computing chip which it claimed could solve Sudoku puzzles. After a lot of debate over whether D-Wave’s technology worked, NASA and Google decided to work together and, in 2013, funded a lab to validate it. Since then, IBM has ‘open-sourced’ its prototype quantum processors, connecting them to the internet so that programmers can experiment and prepare to write quantum code. Last year, startup Rigetti Computing opened a quantum computer fabrication facility to build prototype hardware. Now it seems, after all these years, that quantum computing really is the next big thing!

The quantum physicist Niels Bohr once said, “If quantum mechanics hasn’t profoundly shocked you, you haven’t understood it yet”. Well, I am shocked, and I’m not sure I’ll ever fully understand it. Google, Microsoft, Intel, and IBM are all making grand claims about developments in quantum computing, and we’re now seeing increasing news coverage, attention and investment in the field. During this year’s Ntegra US Research Tour in Silicon Valley, we spent a morning at NASA’s Ames Research Center, where the Advanced Supercomputing Facility has a D-Wave 2000Q system, with 2031 qubits in its working graph, pushing the boundaries of what’s possible. But what is it, and when will it become mainstream? What will it be used for, and what should I be doing to prepare?

And what about my trusted ones and zeros? In the quantum realm, apparently, they can be both — at the same time — really? It’s a long time since I studied maths and physics at school but, keen to learn more, I’ve invested some time: loaded up my Kindle, attended talks and open lectures, done some research and started to get to grips with this strange, counter-intuitive world.

In the realm of subatomic particles, things behave very strangely and can exist in more than one state at a time. The ones and zeros I refer to in the digital world are encoded as ‘on’ or ‘off’ binary states representing binary digits (or ‘bits’ for short). Conventional computers use assemblies of bits to represent data, information and instructions for the machine to perform, which together comprise a computer program. A quantum computer uses quantum bits (or ‘qubits’), which can be said to be ‘on’ and ‘off’ at the same time. More precisely, a bit can only be in the state 0 or the state 1, whereas a qubit may be in a superposition of both. For a qubit in superposition, the probabilities of reading 0 or 1 are neither 0.0 nor 1.0, and measurements made on qubits prepared in identical states don’t always give the same result. These properties allow groups of qubits to do much more than an equivalent number of conventional bits.
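To make that concrete, here is a minimal sketch of a single qubit as a two-element vector of complex amplitudes, simulated in plain Python/NumPy rather than on real quantum hardware; the equal superposition and the squared-magnitude rule for read probabilities are the standard textbook formulation:

```python
import numpy as np

# A single qubit, simulated as a 2-element vector of complex amplitudes
# (alpha, beta) over the basis states |0> and |1>.
qubit = np.array([1, 1], dtype=complex) / np.sqrt(2)   # equal superposition

# The probability of reading 0 or 1 is the squared magnitude of the
# corresponding amplitude; the two probabilities always sum to 1.
probs = np.abs(qubit) ** 2
print(probs)   # [0.5 0.5]

# Reading is destructive: the qubit collapses to whichever state is seen,
# and identical preparations don't always give the same result.
outcome = np.random.choice([0, 1], p=probs)
print(f"read a {outcome}")
```

Run it a few times: the probabilities stay fixed at 50/50, but the individual readings vary, exactly as described above.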

The maths of a qubit in superposition describes the probability of finding it in 0 or 1 on reading (a destructive operation that causes it to drop out of superposition). A quantum computer uses a collection of qubits in superposition to explore many paths through a calculation at once: incorrect paths cancel each other out, and the correct answer is revealed when the qubits are read as ones and zeros. Representing information in qubits allows it to be processed in ways that have no equivalent in conventional computing. Quantum computers can also take advantage of other quantum-mechanical wonders such as tunnelling and entanglement, Einstein’s “spooky action at a distance”: two qubits in superposition can be entangled, so that operations on one affect the other. These effects make quantum algorithms significantly more powerful than conventional ones, and quantum computers may be able to solve some problems in a few days that would take millions of years on conventional digital machines.
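Entanglement can be sketched the same way. The Hadamard-then-CNOT recipe below is the textbook way to prepare a Bell pair; again, this is a classical simulation of the maths, not quantum hardware:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],                 # controlled-NOT gate over
                 [0, 1, 0, 0],                 # basis |00>, |01>, |10>, |11>
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.array([1, 0, 0, 0], dtype=complex)  # two qubits, both 0: |00>
state = np.kron(H, I) @ state                  # superpose the first qubit
state = CNOT @ state                           # entangle it with the second

print(np.abs(state) ** 2)   # [0.5 0.  0.  0.5]
```

Reading one qubit of the pair instantly determines what the other will read: the outcomes are always 00 or 11, never 01 or 10.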

As weird as this all sounds, I’ve been persuaded to suspend disbelief and be content that what I perceive as traditional physical laws (from my frame of reference) no longer apply. After all, according to my wife, the way I experience the real world isn’t necessarily compatible with the way others do, so why should I think the subatomic realm should conform? If you’re a parent of teenagers, there should be a big part of your brain lighting up right now. I’ve learned that irrational thoughts and behaviours are to be expected and, as long as I don’t observe what’s going on too closely, things seem to get along just fine. When it gets hard to accept, I say “Try putting Schrödinger in the box and see how he likes it!”. So, I’m sticking with the programme, and it’s fascinating…

Accepting these strange properties and putting them to work, we should be able to create systems that are significantly faster and more powerful than those of today. Quantum computing promises enormous increases in computing power that could help find new cancer treatments, develop new materials, solve complex problems we currently can’t tackle, and support advances in Artificial Intelligence. That said, the challenge of implementing quantum computing is extremely complex. Bohr also said that “Everything we call real is made of things that cannot be regarded as real”, eloquently summing up the difficulties faced by organisations trying to deliver operational quantum computers today.

Back in the digital computing world, switching and memory elements (transistors) are now being etched onto silicon at scales almost as small as individual atoms, bringing us perilously close to the physical limits of conventional computers. Try to go any smaller and we leave behind the world of classical physics and end up, not by choice, in a realm where quantum effects can lead to chaos. For example, quantum particles like electrons can randomly pass through barriers, even when they don’t have the kinetic energy classical physics says is required to break through. Newtonian mechanics says this is impossible, but quantum mechanics says there’s a non-zero probability that electrons on one side of an insulator will turn up on the other side. This phenomenon is known as quantum tunnelling, and it plays havoc in conventional systems, corrupting data and code.
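For a feel of why this matters at transistor scale, here is a back-of-the-envelope estimate using the standard opaque-barrier approximation T ≈ e^(−2κL); the 1 eV barrier height is an assumed round number for illustration, not a model of any particular chip:

```python
import numpy as np

m_e  = 9.109e-31    # electron mass (kg)
hbar = 1.055e-34    # reduced Planck constant (J*s)
eV   = 1.602e-19    # one electronvolt (J)

barrier = 1.0 * eV  # assumed barrier height above the electron's energy
kappa = np.sqrt(2 * m_e * barrier) / hbar   # decay constant, ~5.1e9 per metre

# T ~ exp(-2 * kappa * L): leakage grows exponentially as the barrier thins
for width_nm in (3.0, 2.0, 1.0, 0.5):
    T = np.exp(-2 * kappa * width_nm * 1e-9)
    print(f"{width_nm} nm insulator: tunnelling probability ~ {T:.1e}")
```

Each halving of the insulator’s thickness raises the leakage by several orders of magnitude, which is why ever-smaller transistors start to misbehave.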

Quantum tunnelling is a Moore’s Law killer, and manufacturers like Intel are using innovative new approaches, such as tri-gate or 3D transistor fabrication, to keep pace (take a look at the nonplanar transistor architecture of the Ivy Bridge, Haswell and Skylake processors). However, fiddling around at the edges like this won’t sustain the trajectory we’ve become accustomed to for many years, and embracing quantum effects for computation may be a way to maintain Moore’s Law well into the future. In very simplistic terms, adding a single qubit doubles compute power, so we would only need to add one qubit a year to maintain current growth.
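That arithmetic follows from how qubit states are described: n qubits require 2^n complex amplitudes to capture classically, so each extra qubit doubles the state space. A quick illustration:

```python
# n qubits need 2**n complex amplitudes to describe classically,
# so every additional qubit doubles the state space.
for n in (1, 2, 10, 30, 50):
    amplitudes = 2 ** n
    print(f"{n:>2} qubits -> {amplitudes:,} amplitudes "
          f"({amplitudes * 16:,} bytes as complex128)")
```

By around 50 qubits, the amplitudes alone would need petabytes of classical memory, which is why even modest qubit counts are so interesting.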

Quantum computers don’t provide an advantage in all situations, but for some problems they excel, solving them in far fewer steps than traditional computers need, and their advantage grows with the size of the problem (exponentially for some algorithms, quadratically for others). Grover’s algorithm, for example, can find an entry in a phone book of 100 million names using just 10,000 operations, roughly the square root of the number of entries, whereas a conventional search algorithm traipsing through all the entries would require an average of 50 million steps. In this case, and many others, the bigger the initial problem space, the better for a quantum algorithm.
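To make the scale of that difference concrete, here is a toy statevector simulation of Grover’s search in plain NumPy (not real hardware; the 1,024-entry ‘phone book’ and the marked index are made up for illustration):

```python
import numpy as np

n = 10                    # 10 qubits -> a 'phone book' of N = 1024 entries
N = 2 ** n
marked = 423              # the entry we are searching for (arbitrary)

state = np.full(N, 1 / np.sqrt(N))        # uniform superposition over entries

iterations = int(np.pi / 4 * np.sqrt(N))  # ~ (pi/4) * sqrt(N) = 25
for _ in range(iterations):
    state[marked] *= -1                   # oracle: flip the marked amplitude
    state = 2 * state.mean() - state      # diffusion: reflect about the mean

print(f"{iterations} Grover iterations vs ~{N // 2} classical lookups")
print(f"P(read the marked entry) = {state[marked] ** 2:.4f}")   # ~0.999
```

Twenty-five iterations against an average of 512 classical lookups; at 100 million entries, the same square-root scaling gives the 10,000-versus-50-million gap above.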

A current concern is that quantum computers will provide lightning-fast code-breaking capabilities. Since the 1990s we’ve known that quantum computers will devour the complex mathematics underpinning encryption, the method used to protect your credit card details and other sensitive information as they travel around the internet. The development of quantum cryptography, which exploits quantum-mechanical properties, may on the other hand provide new and unbreakable forms of data confidentiality and integrity. However, we cannot be sure these new capabilities will come to fruition in the right order, and it’s possible that current security methods will be rendered useless long before quantum cryptography matures.

Real quantum computing is still very much in its infancy, and even in the future, quantum computers will not replace conventional computers. A quantum computer won’t be the machine on your desk at work, and there won’t be a “qPhone” in your pocket. Prototype hardware is still embryonic and looks like something out of a 1960s science fiction movie. The apparatus requires super-cooling to temperatures approaching absolute zero, and fully evacuated vacuum chambers in which single atoms are cooled (brought to a near standstill by shining lasers at them) and levitated inside magnetic fields. The quantum effects that govern qubits are incredibly delicate, and environmental interactions from heat or noise can flip ones and zeros, or cause particles to crash out of superposition. Quantum computing implementations therefore depend heavily on a quantum processor’s ability to detect and correct errors.
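Real quantum error correction is subtle (qubits can’t simply be copied, so codes rely on syndrome measurements instead), but the underlying intuition can be sketched with the classical ancestor of the 3-qubit bit-flip code: store one bit three times and take a majority vote. The 10% error rate below is an arbitrary assumption for illustration:

```python
import random

def encode(bit):             # logical 0 -> [0,0,0], logical 1 -> [1,1,1]
    return [bit] * 3

def add_noise(bits, p=0.1):  # each physical bit flips with probability p
    return [b ^ (random.random() < p) for b in bits]

def decode(bits):            # majority vote recovers the logical bit
    return int(sum(bits) >= 2)

trials = 100_000
raw = sum(add_noise([0])[0] for _ in range(trials))
protected = sum(decode(add_noise(encode(0))) for _ in range(trials))
print(f"unprotected error rate   ~ {raw / trials:.3f}")        # ~0.100
print(f"majority-vote error rate ~ {protected / trials:.3f}")  # ~0.028
```

Redundancy turns a 10% error rate into roughly 3%; quantum codes achieve a similar suppression without ever reading the protected qubits directly.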

There is, undoubtedly, enormous potential in quantum computing and, as the enabling hardware advances, I’m sure we’ll see more and more use cases and applications. The quantum adoption S-curve is yet to take off and, knowing a little about the complexities of persuading the subatomic world to play nicely for us, I am reluctant to make predictions about when this will become mainstream. But having run experiments on the IBM Q5 Tenerife quantum computer, I’m ready to pour on rocket fuel.

If you have enjoyed this article, please take the time to applaud it.

Adrian Fern


Founder & CTO @Prizsm_UK; Founder & Technology Consultant @Fern_ICT
