7 Designs for a Universal Quantum Computer
Finding the prime factors of a 2048-bit number would take a classical computer millions of years, but a large, fault-tolerant quantum computer could in principle do it in a matter of hours. That is because a quantum computer is built on qubits, which take advantage of quantum superposition (in combination with algorithms such as Shor's) to drastically reduce the number of steps required to complete the computation. But how do you actually make a qubit in practice, and how do you read and write information on it?
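For context, the speed-up in factoring comes from Shor's algorithm, where the quantum processor is only used to find the period r of a^x mod N; turning that period into the factors is plain classical number theory. A minimal sketch of that classical post-processing step (the toy numbers are mine, chosen for illustration):

```python
from math import gcd

def factor_from_order(N, a, r):
    """Given the order r of a modulo N (the part Shor's algorithm
    computes on quantum hardware), try to recover a non-trivial
    factor of N classically."""
    if r % 2:
        return None                  # odd order: retry with another a
    x = pow(a, r // 2, N)
    if x == N - 1:
        return None                  # trivial square root: retry
    f = gcd(x - 1, N)
    return f if 1 < f < N else None

# Toy example: N = 15, a = 7. The order of 7 mod 15 is 4
# (7^1=7, 7^2=4, 7^3=13, 7^4=1), which the quantum period-finding
# subroutine would discover exponentially faster than brute force.
print(factor_from_order(15, 7, 4))  # → 3
```

The quantum machine only replaces the period-finding step, which is where all the classical hardness lives.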
Not all qubits are born equal. In an exciting quest to reach the Holy Grail of a fault-tolerant system, a ton of research is being done in every corner of the globe. An avalanche of webinars, online conferences, and articles appears every week, along with hundreds of new papers. At some point one of the qubit types will stand out, but for now it is still unclear which one.
This is an overview of the physical structure of every design out there, with links to the companies that build them. Some of these are prototypes sitting in labs, and we only have general statements about their intricacies. To be as inclusive as possible I have described the lowest common denominator of all designs in each category, highlighting exceptions when needed. Let’s dive in.
Superconducting qubits

This is the most popular technology, and the one where many corporations are pouring people and resources. These machines are powered by a superconducting qubit-based processor. Qubits are coupled to a linear superconducting resonator for readout. The combination of the qubit, the linear readout resonator, and the associated wiring provides a quantum circuit capable of reliably encoding, manipulating, and reading out quantum information. Single- and multi-qubit logic operations are implemented through the application of microwave pulses.
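Mathematically, those pulse-driven logic operations can be modeled as unitary matrices acting on a state vector. A minimal numpy sketch (purely illustrative, not tied to any vendor's control stack) of a single-qubit gate and an entangling two-qubit gate:

```python
import numpy as np

# Single-qubit state |0> ("spin down" / ground state).
zero = np.array([1, 0], dtype=complex)

# Hadamard gate: puts the qubit into an equal superposition of |0> and |1>.
# On hardware this is a calibrated microwave pulse.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

# CNOT gate: flips the second qubit when the first is |1>.
# Combined with H it entangles the pair into a Bell state.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

two_qubit = np.kron(H @ zero, zero)    # (|00> + |10>) / sqrt(2)
bell = CNOT @ two_qubit                # (|00> + |11>) / sqrt(2)
print(np.round(np.abs(bell) ** 2, 3))  # → [0.5 0.  0.  0.5]
```

Measuring this state yields 00 or 11 with equal probability and never 01 or 10, which is the entanglement the readout resonators ultimately have to detect.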
This technology requires extreme isolation from the environment, as any external agent interacting with the system can cause decoherence of the fragile quantum states. For that reason, and also because superconductivity can only be achieved at extremely low temperatures, several layers of casings envelop the machine. These cans nest inside each other and act as thermal shields, keeping everything super-cold and vacuum-sealed inside. A series of gold plates separate cooling zones, built on a columnar architecture that at the very bottom plunges to one-hundredth of a kelvin (-459.65°F / -273.14°C). Beneath the heat exchangers sits the “mixing chamber”, inside which different forms of helium separate and evaporate, diffusing the heat. Cryogenic cooling of the processor itself is absolutely vital because the electronic circuits operate at frequencies in the microwave range, which at room temperature are flooded with stray thermal photons that overwhelm the delicate qubits with random noise.
Finally, the QPU (quantum processing unit) features a gold-plated copper piece with a silicon chip inside hosting the machine’s brain. A series of cables deliver signals to and from the chip to drive qubit operations and return the measured results.
The fun fact about this colossal piece of engineering is that the quantum processor itself occupies only a tiny fraction of the room at the bottom of these refrigerators. Yet photos of the surrounding electrical wiring and cooling technology have become the stock images used to represent quantum computing.
Photonic qubits

Linear Optics Quantum Computation (LOQC) uses photons, or single particles of light, as information carriers. These light-based quantum computers are built around a reconfigurable silicon quantum chip that can process photons in a stable, accurate and fast way. The job of the chip is to take a series of laser pulses (used as the source of light) and turn them into a programmable entangled state that can be used to encode and process information.
A very appealing feature of this technology is that it largely operates at room temperature, easily integrating into existing telecommunication infrastructure and enabling a system that can be hosted in a standard server rack and installed in regular data centers. On top of that, well-known silicon-based materials can be used to build the processor, so we can leverage the existing knowledge of an industry that has already poured billions into a technology that works and is used to build classical devices. Like any other quantum architecture nowadays, fault tolerance is still out of reach; however, these qubits are among the most error-resistant, and offer great flexibility in designing error-correcting codes. This makes optical qubits a very promising bet for the future.
There is, however, an important caveat. Saying that “photonic quantum computers operate at room temperature” needs qualification: the chip itself sits at room temperature, but refrigeration (less than 1 degree above absolute zero) is currently needed to detect the photons. The specialized photon-counting detectors are part of the hardware stack, and because they rely on superconductivity to do their job, they require very low temperatures. This introduces an electrical/optical hybrid system that provides a higher level of performance, but with the need to convert from photons to electrons and back. As this is a rapidly evolving field with several players making progress (even though they do not always want to make it public), future detectors may not require cooling at all.
Another exciting field of development that aims to overcome the scaling issue revolves around the use of a “quantum memory” to store and retrieve single photons on demand. This enables a serial ‘repeat-until-successful’ operation, rather than relying on a large number of separate components working in parallel. This buffering of operations overcomes the randomness that is inherent in photonic quantum operations without the need for redundancy in components.
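The arithmetic behind “repeat-until-successful” is simple to sketch. Assuming a heralded photonic operation with success probability p (the 25% figure below is purely illustrative), buffering in a quantum memory lets you retry serially until the overall success probability is as high as you like, instead of duplicating hardware:

```python
def attempts_needed(p, target=0.999):
    """Serial 'repeat-until-successful' with a quantum memory: number
    of tries of a probabilistic operation (success probability p)
    needed before the overall success probability reaches the target,
    using P(success within n tries) = 1 - (1 - p)**n."""
    n, success = 0, 0.0
    while success < target:
        n += 1
        success = 1 - (1 - p) ** n
    return n

# An operation that succeeds 25% of the time only needs to be retried
# about 25 times to succeed with 99.9% confidence, rather than running
# that many redundant copies of the component in parallel.
print(attempts_needed(0.25))  # → 25
```

The memory pays for itself because the cost of a retry is time, while the cost of parallel redundancy is physical components.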
Neutral atom qubits
This technique uses lasers and magnets to cool single particles to a temperature of nearly absolute zero, and to control the atoms with a high degree of precision. Because neutral atoms don’t interact easily, they are relatively immune to noise and can hold onto quantum information for a comparatively long time.
The machine is powered by a set of atoms (generally rubidium, but some designs use helium) that reside in a glass cell the size of a matchbox, on top of which the chip is placed. The atoms connect to their nearest neighbors and are held by laser beams in a 2D or 3D array. More lasers (six beams per atom) slow the atoms until they are nearly motionless. This part is pretty amazing: to cool the atoms, the laser beams apply a force 1,000 times that of gravity. Then, with yet another set of lasers, the atoms are coaxed to interact with each other, set their initial values, create and connect the gates, and perform calculations.
There are a few advantages to this design: atoms are naturally identical, and so are the qubits made from them. The whole architecture also does not require cryogenics, since only the atoms themselves are cooled to nearly absolute zero, which simplifies the scaling process.
If there is no need for massive cooling devices, noise is low, and scaling is easier, one might expect this to be the technology that everyone else will be adopting. I believe the reason we are not hearing more record-breaking numbers and practical use cases for neutral atoms is that they came later into the game. You can find articles as recent as three years ago describing ongoing experimentation with more massive molecules, an approach that seems to have since been dropped in favor of the current single atoms.
The level of control that has been achieved at the single particle level within these arrays of optical traps, while preserving the fundamental properties of quantum matter (coherence, entanglement, superposition), makes this technology a prime candidate to disrupt the industry, and yet only a few startups seem to be focusing their strategy on it.
Trapped ion qubits

Qubits are made of ionized atoms (reportedly ytterbium, a very rare metal). The only difference between a neutral atom and an ion is one electron, which gets removed with lasers as part of the trapping process. This process, called ionization, leaves the atom with a positive electrical charge and only one valence electron. Each atom of a specific element is perfectly identical to every other atom of that element in the universe, which makes it a perfect candidate for a qubit. Once the atom has been turned into an ion, a specialized chip called a linear ion trap is used to hold the ion in 3D space. This small trap features around 100 tiny electrodes precisely designed and controlled to produce electromagnetic forces that hold the ions in place, isolated from the environment to minimize noise and decoherence.
Most vendors load a number of ions into a linear chain to run calculations. This provides the flexibility to create anything from a one-qubit system to a 100+ qubit system without having to fabricate a new chip or change the underlying hardware. Ion traps produce quite stable qubits: once prepared in a particular quantum state, they can remain in that state for very long periods of time.
Another benefit of this architecture is its complete connectivity. Because the qubits are not connected by physical wires, all of them can interact in a full mesh with no intermediary steps. This translates to a massive reduction in communication overhead, and by extension, computational noise.
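That overhead saving is easy to quantify. On a chip whose qubits only talk to their nearest neighbors, a two-qubit gate between distant qubits must first be routed with SWAP operations; in a full mesh that cost disappears. A toy comparison (an illustrative model, not any vendor's compiler):

```python
def swap_overhead(i, j, full_connectivity):
    """SWAP gates needed before qubits i and j can interact directly.
    On a fully connected (trapped-ion style) machine any pair can
    interact at once; on a linear nearest-neighbor chip the qubits
    must first be shuffled together, one SWAP per intervening slot."""
    if full_connectivity:
        return 0
    return max(abs(i - j) - 1, 0)

# A two-qubit gate between qubits 0 and 9:
print(swap_overhead(0, 9, full_connectivity=False))  # → 8
print(swap_overhead(0, 9, full_connectivity=True))   # → 0
```

Since every extra SWAP is itself a noisy gate, eliminating them reduces both runtime and accumulated error.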
Before the ions can be used to perform quantum computations, they have to be prepared for the task. This has two major steps: cooling (yup, no way around that here either), which reduces computational noise and makes better qubits, followed by state preparation, which initializes each ion into a well-defined “zero” state, ready to perform algorithms.
The technology used to manipulate the qubits state and perform gates varies between companies. Some use an array of individual laser beams, each imaged onto an individual ion, plus one “global” beam. The interference between the two beams produces a beat note that is at exactly the necessary energy to kick the qubits into a different state. This requires an alignment accuracy equivalent to a fraction of the width of a human hair, which makes it extremely difficult to calibrate at scale. For this reason, other companies use a simpler electronic gate technology, applying voltages to a microchip in a similar fashion to how classical transistors operate. This is the microwave technology used in today’s mobile phones.
Once the computation has been performed, reading the answer from the ions is done either by shining a resonant laser on all of them at the same time, or by reading the microchip. This process collapses any complex quantum information (superposition) that was previously created and forces each qubit into one of two states. Collecting and measuring this light makes it possible to simultaneously read the collapsed state of every ion. This is interpreted as a binary string, where each glowing atom is a one, and each dark atom is a zero.
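In software terms, that readout step amounts to thresholding per-ion brightness into bits. A minimal sketch (the counts and threshold are made-up illustrative numbers, not from a real detector):

```python
def read_register(photon_counts, threshold=10):
    """Turn per-ion fluorescence measurements into a classical bit
    string: a glowing ('bright') ion reads as 1, a dark ion as 0.
    The threshold separates genuine fluorescence from stray light."""
    return "".join("1" if c > threshold else "0" for c in photon_counts)

# Simulated detector counts for a five-ion chain after measurement:
print(read_register([250, 3, 0, 198, 7]))  # → 10010
```

Repeating the whole circuit many times and histogramming these bit strings is how the probability distribution of the quantum state is estimated.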
To successfully hold complex quantum information, qubits can’t interact with anything. A single tiny atom colliding with an ion can knock the chain out of the trap entirely. So the trap is placed inside an ultra-high vacuum chamber, pumped down to a pressure level where there are about one hundred trillion times fewer molecules per cubic inch than in the air we regularly breathe. Then the whole assembly goes in an even larger enclosure with a variety of control systems that get hooked to a classical computer running further calibration.
The competition to scale the full qubit interconnectivity, with high gate fidelity and low noise, is fierce. Newer and more powerful versions keep being released, and there is even a published paper containing a blueprint for a trapped ion quantum computer that, if implemented, could realize a limitless and fully scalable design, detailing what the interconnectivity of components should look like.
Silicon spin qubits
The mainstream school of researchers in this technology uses the outermost electron of a phosphorus atom, together with its nucleus, as a qubit. This single phosphorus atom is embedded in a silicon crystal, placed right next to a tiny transistor.
The electron has a magnetic dipole (the spin) that has two orientations, up or down, which could be thought of as the classical one and zero. To differentiate the energy state of the electron when the spin is up or down, it is necessary to apply a strong magnetic field. This lines up the electron spin to point down, which is its lowest energy state. It takes just a little bit of energy to put it into the spin up state. If it were at room temperature, the electron would have so much thermal energy that it would be bouncing around from spin up to spin down and back. So the whole structure needs to be cooled down to only a few hundredths of a degree above absolute zero. Enter cryogenics. That way we know that the electron will definitely be spin down, as there is not enough thermal energy in the surroundings to flip it the other way.
To write information onto the qubit, the electron is put into the spin-up state by hitting it with a pulse of microwaves. That pulse needs to be dialed to a very specific frequency that depends on the magnetic field the electron is sitting in. This allows for full control of the state: it is possible to set the spin not only to zero or one, but to stop the pulse at the specific point where a superposition of both states happens. Quantum physics meets engineering.
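Since the rotation angle of the spin grows linearly with how long the pulse is applied, you can compute exactly where to stop. A small sketch of this on-resonance behavior (the 1 MHz Rabi frequency is an illustrative assumption, not a figure from any specific device):

```python
import numpy as np

def spin_after_pulse(rabi_rate_hz, pulse_s):
    """Amplitudes of a spin driven exactly on resonance, starting
    from spin-down (|0>). The rotation angle accumulated by the
    microwave pulse is theta = 2*pi * f_Rabi * t."""
    theta = 2 * np.pi * rabi_rate_hz * pulse_s
    return np.array([np.cos(theta / 2), -1j * np.sin(theta / 2)])

f_rabi = 1e6  # illustrative 1 MHz drive strength

# A full "pi pulse" (t = 1 / (2 * f_rabi)) flips the spin to |1>...
pi_pulse = spin_after_pulse(f_rabi, 0.5e-6)
# ...while stopping halfway leaves an equal superposition of |0> and |1>.
half_pulse = spin_after_pulse(f_rabi, 0.25e-6)
print(np.round(np.abs(pi_pulse) ** 2, 3))    # → [0. 1.]
print(np.round(np.abs(half_pulse) ** 2, 3))  # → [0.5 0.5]
```

Gate calibration in the lab is largely about nailing down that rotation rate and pulse timing.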
To read out the qubit information, the transistor that the atom is embedded next to is used. If the electron is pointing up, it can jump into the transistor because it has more energy than the base where it sits, leaving behind the bare nuclear charge of the phosphorus. Here is where things get even more interesting. If we can take advantage of the electron spin to create a qubit, why not do the same with the atom's nucleus? Like an electron, the nucleus has a spin, although it is 2,000 times weaker than that of the electron. Still, it is possible to write into it using electromagnetic radiation; it just needs a longer wavelength and a longer pulse in order to get the spin to flip.
Because it is so small, so weakly magnetic and so perfectly isolated from the rest of the world, this is a qubit that lives for a very long time, making it an attractive base for quantum computer technology. However, it is paramount to eliminate any trace of nuclear spin from the silicon crystal itself, which would interact with the electron spin of the qubit and ruin the whole energetic balance. Natural silicon contains about five percent of the isotope silicon-29, which does have a spin. But the beauty of silicon is that its most abundant isotope, silicon-28, has no nuclear spin; in other words, it is a completely non-magnetic atom. This is an intrinsic advantage compared to other commonly used materials such as III-V compounds.
One of the best features of these qubits is that they make quantum computing compatible with existing microelectronics, so there is significant potential for scaling/clustering and mass production. Internationally renowned researchers are putting their efforts here; there must be a good reason for it.
Nitrogen-vacancy qubits

A perfectly pure diamond crystal contains nothing but carbon atoms, arranged in an orderly lattice. About one percent of the atoms in a diamond are carbon-13 (the rest are predominantly carbon-12), which has non-zero spin, so it is possible to encode a quantum state onto it. By forcing out two adjacent carbon atoms and replacing them with a single nitrogen atom and an empty site, this well-ordered world is disrupted and a nitrogen-vacancy center can be created.
To do this you need a tiny custom-grown diamond, less than half a millimeter in diameter, which contains the so-called nitrogen-vacancy centers (NV-centers). You can think of an NV center as a tiny defect in the diamond crystal. In such a center, two adjacent carbon atoms are replaced by a single nitrogen atom and a vacancy. The vacancy can hold an extra electron, serving as a qubit, whose spin can be controlled with microwave pulses and which can be read out using a laser.
The idea of creating a qubit out of a diamond crystal evolved from research on biomolecule detection in the medical diagnostics field. I haven’t seen a huge amount of consistent, focused research happening at the same velocity as other technologies, but it certainly has its advantages and we shouldn’t discard it.
Topological qubits

The use of topology to protect quantum information is well known to the condensed-matter community, and it is one of the competing avenues to demonstrate that quantum computers can solve certain problems that classical computers cannot. Topology is the branch of mathematics describing structures whose properties are preserved even when they are bent, twisted, compacted, or stretched. When applied to quantum computing, topological properties create a level of protection that helps a qubit retain information despite what’s happening in the environment.
The main players in this space have been bending (pun intended) the physical properties of photons to reach a protected state; electrons are also further subdivided to achieve extra protection in the form of redundancy. For instance, when an electron is split, the resulting sub-particles behave like a form of data redundancy: if there is some interference, there is still enough information stored in the other half to allow the computation to continue. Another phenomenon is ground-state degeneracy, a feature by which a qubit has two ground states, making it more resistant to environmental noise. Topological systems can measure the difference between these two states, allowing them to achieve this additional protection.
Error correction is a critical part of this system, as these qubits are extremely sensitive to any kind of quasi-particle floating around the environment, so the challenge in moving from paper to reality is to work around that noise. Most of the published literature stays in conjectural theory and research, and to my knowledge there is no actual blueprint out there that describes how to build this type of computer. A machine that can execute a given algorithm at speed and with low noise is still to be seen.
Some of these companies might claim they have the best qubit, but the truth is that nowadays we cannot really say “this way won’t work” and discard it completely. We just don’t know. Scalability is probably the hardest issue in all designs because there are lots of system engineering constraints (temperature, coherence time, gate fidelity, error correction...), and I am sure there are features that humans have not invented or even imagined that could change the landscape dramatically.
Until that day comes we will keep using Scott Aaronson’s definition of Quantum Computing: “The study of what we can’t do with computers we don’t have”. Seeing the lightspeed progress (or more!) in this field, that definition might not stand valid for much longer, and that is hugely exciting.