Quantum Computing Basics — A simple explanation

Gayatri Mohan
Dialogue & Discourse
Apr 21, 2019


Quantum computing leverages the physical phenomena and the mathematical underpinnings of quantum mechanics to create a radically new computing paradigm, one that can solve problems that are intractable for classical computers. Recent research and progress in commercialization indicate that quantum computing will be a key component of our future technology stack, with the potential to disrupt many industries and businesses.

As we know, classical computers rely on the ‘bit’ as the primary carrier of information. Each bit can be in one of two states, 0 or 1, and classical computers apply binary logic to groups of bits (think of logic gates) to perform computations and give deterministic answers. Quantum computers are explicitly designed to work on the principles of quantum mechanics, the mathematical formulation of the laws governing quantum systems. Their basic carrier of information is the ‘qubit’ (realized with atoms, ions, photons, etc.), which can exist in a superposition of states rather than in just one of two states like a classical bit. This, together with other key concepts of quantum mechanics (superposition, entanglement and probabilistic estimation), makes quantum computing radically different from classical computing.

We will see how these differences enable quantum computers to solve complex problems that are beyond the reach of classical computers.

Classical computing, with its basis in binary logic, has been used very successfully to solve a large variety of computational problems. Even as we speak, great progress is being made in problems of a scale and complexity that seemed impossible even just a few decades ago. Yet, classical computing does have its limits. There are many complex problems that are unsolved, and indeed may be unsolvable if we rely only on binary-logic-based computing.

Computational Power

The processing power of a classical computer is limited by the number of transistors built into its silicon chips. Some of the impressive achievements of classical computing can be attributed to technological progress in squeezing more and more transistors onto smaller chips. Following Moore’s law, which observes that the number of transistors on a chip doubles roughly every two years while the cost per transistor falls, today roughly 4.3 billion transistors can sit on a chip the size of a fingernail. Each transistor is about 14 nanometers across, only about 30 times the size of a typical atom (0.1 to 0.5 nanometers). Making transistors much smaller would bring them close to atomic dimensions, at which point quantum mechanical effects appear whether we plan for them or not.

To be fair, classical computers do rely on the behaviour of electrons in silicon chips, but only at the level of low-level device implementation. Binary logic remains the computational paradigm, and quantum effects play no part in the computational process itself.

Computational Complexity

Several decades ago, researchers such as Richard Feynman had already recognized that some quantum phenomena could not be simulated efficiently by a (classical) Turing machine. In other words, no classical computer would be able to simulate these quantum phenomena in polynomial time. This realization led to the idea that such problems call for a different kind of machine: a quantum computer. For these problems, quantum computers offer computational power that classical computers do not.
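To get an intuition for why brute-force simulation is out of reach, note that a straightforward state-vector simulation of n qubits has to track 2^n complex amplitudes. The short sketch below is my own illustration (the helper function is a name I made up, not something from the article) of how quickly the memory requirement explodes:

```python
# A rough illustration of the exponential cost of simulating qubits classically:
# the state of n qubits is described by 2**n complex amplitudes.

def statevector_memory_gib(n_qubits: int, bytes_per_amplitude: int = 16) -> float:
    """Memory needed to store a full state vector of n qubits, in GiB,
    assuming one complex128 amplitude (16 bytes) per basis state."""
    return (2 ** n_qubits) * bytes_per_amplitude / (1024 ** 3)

for n in (20, 30, 40, 50):
    print(f"{n} qubits -> {statevector_memory_gib(n):,.1f} GiB")
# 20 qubits need well under 1 GiB, but 50 qubits already need tens of millions of GiB.
```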

Quantum Computing Basics

Qubits

A qubit is the elementary information carrier in a quantum computer. It is the quantum equivalent of a classical bit, but unlike a classical bit, which can only take the value 0 or 1 at any time, a qubit can exist in a superposition of the basis states |0⟩ and |1⟩. Its pure state is a linear combination of the two, written α|0⟩ + β|1⟩, where α and β are complex numbers satisfying |α|² + |β|² = 1.
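As a simple illustration (my own sketch using numpy, not code from the article), a qubit’s pure state can be represented as a two-component complex vector, with the amplitudes required to satisfy the normalization condition:

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)   # |0>
ket1 = np.array([0, 1], dtype=complex)   # |1>

# Any complex pair with |alpha|^2 + |beta|^2 = 1 is a valid qubit state.
alpha, beta = 1 / np.sqrt(2), 1j / np.sqrt(2)
psi = alpha * ket0 + beta * ket1

# Check the normalization condition |alpha|^2 + |beta|^2 = 1.
assert np.isclose(np.vdot(psi, psi).real, 1.0)
print(psi)   # approximately [0.707+0.j, 0.+0.707j]
```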

Superposition

Superposition means that a qubit does not hold a definite value of 0 or 1 the way a classical bit does. Instead, its state carries the probabilities (more precisely, probability amplitudes) of measuring 0 or 1. Measuring the qubit breaks down the superposition into a single definite state.
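A hedged sketch of how this is usually modeled (my own code with names I chose, not from the article): the probabilities of reading 0 or 1 are |α|² and |β|², and after measurement the qubit is left in the observed basis state:

```python
import numpy as np

rng = np.random.default_rng(0)

def measure(psi):
    """Sample a measurement outcome for a single-qubit state and return
    the outcome together with the collapsed state."""
    probs = np.abs(psi) ** 2                # Born rule: P(0) = |alpha|^2, P(1) = |beta|^2
    outcome = rng.choice([0, 1], p=probs)
    collapsed = np.zeros(2, dtype=complex)
    collapsed[outcome] = 1.0                # the superposition is destroyed
    return outcome, collapsed

psi = np.array([np.sqrt(0.7), np.sqrt(0.3)], dtype=complex)
print(measure(psi))   # outcome is 0 about 70% of the time, 1 about 30% of the time
```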

Entanglement

Another key phenomenon is quantum entanglement. Once a pair (or a group) of particles has interacted and become entangled, their joint state carries the imprint of that interaction: describing one particle independently of the other(s) is no longer possible, whatever the distance between the particles may be at a later time.
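As a concrete illustration (my own sketch, not from the article), consider the Bell state (|00⟩ + |11⟩)/√2. Sampling measurement outcomes from its probability distribution shows that the two qubits are perfectly correlated: the results are always 00 or 11, never 01 or 10:

```python
import numpy as np

rng = np.random.default_rng(1)

# Amplitudes over the two-qubit basis |00>, |01>, |10>, |11>.
bell = np.zeros(4, dtype=complex)
bell[0] = bell[3] = 1 / np.sqrt(2)         # (|00> + |11>) / sqrt(2)

probs = np.abs(bell) ** 2                  # 0.5 for '00', 0.5 for '11', 0 otherwise
samples = rng.choice(["00", "01", "10", "11"], size=1000, p=probs)
print({s: int((samples == s).sum()) for s in ["00", "01", "10", "11"]})
# Roughly half the shots give '00' and half give '11'; '01' and '10' never appear.
```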

Probabilistic Estimation

Classical computers are inherently deterministic: they yield the same result every time. Quantum computers do not behave this way. Because qubits are in superposition, measuring a qubit is itself probabilistic. Decoherence adds a further challenge, collapsing a superposed qubit into a classical state that cannot be predicted in advance. As a result, the same result is not recorded every time; each trial can return a different answer. The same experiment is therefore run many times, and the most probable outcome is taken as the solution.
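The sketch below (my own illustration with a made-up outcome distribution, not from the article) shows this “run many shots, keep the most frequent answer” workflow:

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(2)

def most_probable_outcome(probs, outcomes, shots=1024):
    """Sample `shots` trials from the outcome distribution and return the
    histogram of results plus the most frequently observed answer."""
    counts = Counter(rng.choice(outcomes, size=shots, p=probs))
    return counts, counts.most_common(1)[0][0]

# Hypothetical distribution over 2-bit answers; on real hardware it would
# come from repeatedly preparing, running and measuring the quantum circuit.
counts, best = most_probable_outcome([0.6, 0.25, 0.1, 0.05], ["00", "01", "10", "11"])
print(counts, "-> accepted answer:", best)
```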

Particularly with optimization problems, where a single definitive correct solution may not exist, a good solution, even if not the perfect one, is also accepted.

Where are we

One of the challenges that lie ahead in quantum computing is building systems that are fault tolerant. Decoherence is a major drawback of quantum systems, as qubits collapse into a classical state after a very short time. Quantum error correction schemes are being developed in which the information of a single logical qubit is encoded across multiple physical qubits. Until quantum supremacy matures into fault-tolerant machines, we will live in an era of NISQ (noisy intermediate-scale quantum) systems.
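As a very rough classical analogy for the redundancy idea behind such codes (my own sketch, not an implementation of an actual quantum error correction scheme), a repetition code stores one logical bit in three physical copies and decodes by majority vote, which suppresses the error rate from p to roughly 3p²:

```python
import numpy as np

rng = np.random.default_rng(3)

def noisy_copies(bit, p_flip, n_copies=3):
    """Encode one logical bit into n physical copies, each flipped independently
    with probability p_flip (a stand-in for decoherence-induced errors)."""
    copies = np.full(n_copies, bit)
    flips = rng.random(n_copies) < p_flip
    return np.bitwise_xor(copies, flips.astype(int))

def majority_decode(copies):
    return int(copies.sum() > len(copies) / 2)

trials = 10_000
errors = sum(majority_decode(noisy_copies(1, p_flip=0.1)) != 1 for _ in range(trials))
print(f"logical error rate ~ {errors / trials:.3f}  (vs 0.1 for a single unprotected bit)")
# Majority voting over 3 copies gives roughly 3*p^2 - 2*p^3, about 0.028 for p = 0.1.
```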

(to be continued…)
