Pavlos Apostolidis
Science, simply explained
4 min read · Feb 23, 2018


Quantum Computing in a Nutshell

The Basics

The first half of the 20th century established the fundamentals of quantum mechanics, which revolutionised the way scientists see the microscopic world. The technological advances that followed quantum theory enabled its use in electronic devices, including microprocessors, solar cells, medical imaging equipment and many others. Quantum technologies and devices are now an active research area aiming at future technological developments. One of the most fascinating candidates among these upcoming technologies is the quantum computer. A quantum computer is theorised to take advantage of quantum effects to perform certain computations dramatically (in some cases exponentially) faster than a classical machine; popular examples include factorising large numbers, breaking certain encryption schemes and tackling hard optimisation problems such as the travelling salesman problem.

Moore’s law roughly states that the number of transistors on new computer chips, and with it their computing capacity, doubles every 18 months or so. This implies an increase in processing speed and a shrinking of microchip features. While the law has held true until now, the semiconductor industry is approaching a fundamental limit: devices are getting so small that quantum effects, like quantum tunnelling*, start to dominate, rendering transistors unreliable and potentially useless. Quantum computing turns this problem around, using quantum effects themselves to execute specific computational tasks that have been optimised to run on such an architecture. A classical bit can only possess a well-defined state of 0 or 1, while a quantum bit (qubit) can exist in a superposition of both. Combined with carefully engineered algorithms, this leads to tremendous computational advantages. Such algorithms have been developed by Shor (integer factorisation in polynomial time, whereas the best known classical algorithms are sub-exponential), Grover (unstructured database search, reduction from O(N) to O(√N)) and others. These algorithms fundamentally outperform the best known conventional algorithms for the same tasks.
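The superposition idea can be made concrete with a few lines of numpy. The sketch below (not tied to any particular hardware) represents a qubit as a complex 2-vector and applies a Hadamard gate, the standard operation for creating an equal superposition:

```python
import numpy as np

# A qubit is a unit vector in C^2; |0> and |1> are the basis states.
ket0 = np.array([1, 0], dtype=complex)

# The Hadamard gate maps |0> to an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

psi = H @ ket0

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(psi) ** 2
print(probs)  # [0.5 0.5] — both outcomes equally likely
```

Unlike a classical bit, which is definitely 0 or definitely 1, this state only resolves to one of the two values when measured.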

The ultimate goal is to develop a universal quantum computer that could be used for a variety of tasks. The main obstacle is probably “hardware” limitations. These include the fact that a quantum processor must be well isolated from its surroundings while still remaining controllable and readable by the user. Moreover, scalability is an issue, as putting many qubits on one chip is usually non-trivial. In addition, most solid-state quantum computing implementations require ultra-low temperatures to operate, something not readily available to everyone. Finally, it is worth mentioning that quantum computers are unlikely to replace conventional computers; in other words, you won’t be buying quantum laptops or smartphones anytime soon. The main users will probably be large organisations, cloud services, universities and research centres.

*In simple words, quantum tunnelling refers to the ability of particles such as electrons to escape (tunnel) into a region they are not supposed to reach (classically, they do not have enough energy to get there).

The Technical

In theory, any two-level quantum system can serve as a qubit. A register of N qubits can exist in a superposition of 2^N basis states simultaneously, in contrast to a classical register, which holds exactly one of its possible states at any given time. The general idea of quantum computing is to exploit the fact that every quantum state carries a probability amplitude, and to choreograph an interference pattern in which computational paths leading to the correct answer interfere constructively, reinforcing each other, while paths leading to wrong answers interfere destructively and cancel out. The procedure then yields the correct answer with high probability. Finally, regardless of the implementation, any universal quantum computer must satisfy the following criteria (DiVincenzo’s criteria):
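Interference is easy to demonstrate on a single qubit. In this small sketch, applying a Hadamard gate once gives a 50/50 superposition, but applying it twice returns the qubit to |0> with certainty: the two paths into |1> carry opposite amplitudes and cancel, while the paths into |0> add up:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
ket0 = np.array([1.0, 0.0])

# One Hadamard: amplitudes [1/sqrt(2), 1/sqrt(2)], both outcomes equally likely.
once = H @ ket0

# A second Hadamard: the paths into |1> have amplitudes +1/2 and -1/2 and
# cancel (destructive interference); the paths into |0> reinforce each other.
twice = H @ once

print(np.abs(twice) ** 2)  # [1. 0.] up to floating-point rounding
```

This cancelling of “wrong” paths is exactly the mechanism quantum algorithms orchestrate at a much larger scale.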

1. The system must be scalable and have well-defined qubits.

2. The system must allow for the qubits to be initialised in a starting state (i.e. there must be a “reset” operation).

3. The qubit coherence times* must be long enough for operations to take place (i.e. the qubits must be “usable” for long enough after resetting).

4. There must exist a universal set of quantum gates (the quantum analogues of conventional logic gates such as AND and OR; common examples are the Hadamard and CNOT gates).

5. There must be a read-out capability (at the end of the computation, the qubit state, i.e. the answer, must be read and given to the user).
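Criterion 4 can be illustrated with the two gates mentioned above. The sketch below (plain numpy, no quantum library) applies a Hadamard to the first qubit of a two-qubit register and then a CNOT, producing the well-known entangled Bell state (|00> + |11>)/√2:

```python
import numpy as np

# A two-qubit state has 2^2 = 4 amplitudes, ordered |00>, |01>, |10>, |11>.
ket00 = np.array([1, 0, 0, 0], dtype=complex)

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
I = np.eye(2)

# CNOT flips the second (target) qubit whenever the first (control) is |1>.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Hadamard on the first qubit (identity on the second), then CNOT.
bell = CNOT @ np.kron(H, I) @ ket00

print(np.abs(bell) ** 2)  # [0.5 0.  0.  0.5]
```

Measuring this state yields 00 or 11 with equal probability, never 01 or 10: the two qubits are entangled, a resource no set of classical gates can produce.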

A solid-state quantum processor is usually hosted in a dilution refrigerator, where it is cooled down to thousandths of a degree above absolute zero. This is a requirement for the qubits to be usable: removing thermal energy minimises the thermal fluctuations that would otherwise destroy their quantum states. A dilution refrigerator uses a mixture of helium-3 and helium-4 to cool the processor to temperatures below 10 mK. For comparison, the surface of the sun is about 20 times hotter than room temperature, while room temperature is about 30,000 times hotter than 10 mK.
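The temperature comparison above is simple arithmetic, easy to sanity-check (the solar surface figure of roughly 5,778 K is a standard value; the others come from the text):

```python
# Rough temperature ratios behind the comparison in the text.
sun_surface = 5778   # K, approximate temperature of the solar photosphere
room = 293           # K, roughly 20 degrees Celsius
fridge = 0.010       # K, a 10 mK dilution-refrigerator base temperature

print(round(sun_surface / room))  # ~20: sun surface vs room temperature
print(round(room / fridge))       # ~29300: room temperature vs 10 mK
```

In other words, a dilution refrigerator is far colder relative to your living room than your living room is relative to the sun.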

*The coherence time of a qubit denotes the time during which the qubit has a well-defined (i.e. known) phase.

The interior of a dilution refrigerator.
A PCB with quantum nanodevices used to initialise, manipulate and read-out qubits.
