Quantum Computing Will Propel the Next Era of Data Insights and Usefulness
For decades, the power of computers has progressed in a seemingly inevitable fashion, tackling more and more of the challenges our world has to offer. However, many of the world’s most impactful questions have evaded this progress. Materials development, biological simulations, pharmaceutical simulations, delivery network optimizations, and weather models are a few examples of areas where the messiness of real-world data has made computational progress harder. These multidimensional problems are very hard to distill into bits, the 0s and 1s that traditionally encode information in computers. While today’s supercomputers can calculate decent approximations, physicists herald the coming of a novel computing paradigm, quantum computing, that promises to speed up and improve the accuracy of computations — not by 50 or 100%, but by many orders of magnitude. With quantum computers, problems that take years to solve on today’s supercomputers could be solved in minutes, radically expanding the possibilities of modeling the world.
The key to quantum computing is making use of two key properties of quantum physics, superposition and entanglement, to enable computations of much higher complexity with a very small number of computational units. Here’s one way to visualize the difference between classical and quantum computers: imagine a grid of coins on a table, each showing either tails (0) or heads (1). Quantum computers take those coins, place them on their edges, and flick them to start spinning. While a coin spins, it is neither heads nor tails but a blend of both (superposition), and the spins of different coins can be linked so that how one lands depends on how another lands (entanglement). When the computation is finished, the coins fall back down and are read out as regular 0s and 1s.
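The spinning-coin picture can be made a bit more concrete with a toy simulation. The sketch below is purely illustrative Python, not how quantum hardware is actually programmed: it represents a single qubit as two complex amplitudes, and “letting the coin fall” samples a 0 or 1 with probabilities given by those amplitudes.

```python
import random

# A qubit's state is a pair of complex amplitudes (alpha, beta) for the
# outcomes 0 and 1; the probabilities of reading out 0 or 1 are
# |alpha|^2 and |beta|^2 respectively.
def measure(alpha: complex, beta: complex) -> int:
    p0 = abs(alpha) ** 2
    return 0 if random.random() < p0 else 1

# An equal superposition -- the "spinning coin" -- has alpha = beta = 1/sqrt(2),
# so each readout is a fair 50/50 split between 0 and 1.
amp = 1 / 2 ** 0.5
samples = [measure(amp, amp) for _ in range(10_000)]
print(sum(samples) / len(samples))  # roughly 0.5: half heads, half tails
```

Note that the classical simulation above only hints at the idea: faithfully simulating n entangled qubits requires tracking 2^n amplitudes, which is exactly why classical machines struggle and quantum hardware is appealing.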
The coin analogy makes apparent a core challenge of quantum computing: it’s really hard to get hundreds of coins spinning precisely in place all at once. Some might bump into each other and fall; others might get caught on an imperceptible scratch in the wood and careen out of place. Even once they’re all up and spinning, it’s just a matter of seconds (in the case of quantum computers, sometimes milliseconds) before they collapse, which doesn’t leave much time to calculate the answer. As you can imagine, this problem doesn’t get easier as you go from 1 coin to 2, or from 50 to 500. Quite the opposite — with more coins, you run a greater risk that some will bump into each other, so you need more sets of fingers, each trying to spin a coin precisely without ruining the whole system.
These delicate intricacies have made quantum computing a challenging space for early pioneers of the science. The coin in our analogy is more commonly called a “qubit” (a quantum bit) by practitioners, and it has turned out to be quite hard to build quantum computers that efficiently scale the number of simultaneous qubits. The approaches that made it easiest to build systems of 1–100 qubits, such as superconducting Josephson junctions and ion traps, are unlikely to work at the scale of 1,000 or 1,000,000 qubits, yet that is the scale required to solve meaningful problems. Getting there requires an architecture designed from the start around scale, one that can natively handle the exponential increase in interference that comes with more qubits.
Atom Computing, based in Berkeley, brings together a world-class team of academics, including co-founders Ben Bloom and Jonathan King, alongside industry veterans Rob Hays and Bill Jeffrey as the company’s top executives. Atom Computing is building the world’s first large-scale, programmable quantum computer using an emerging platform: neutral atoms. These quantum computers are built around individual atoms (in Atom Computing’s case, strontium) suspended in a vacuum chamber by a grid of laser light. Additional, precise pulses of light turn these atoms into qubits and drive their operations, and a specialized CMOS sensor (like the one in your digital camera) records the output.
Neutral atoms have many encouraging properties compared to other types of qubits. Think again of the coin analogy: neutral atoms easily stay spinning for a long time (a long “coherence time”) and don’t tend to bump into, or “interfere” with, each other. The key insight of Atom Computing’s co-founders was that massive advances in the cost, availability, and capabilities of underlying hardware — powerful, cheap, software-defined radios, lasers, optics, and FPGAs, driven by a variety of scientific and industrial use cases — made it possible to build neutral-atom computers that are both scalable and commercially feasible, thanks to the company’s ability to use off-the-shelf components.
Free from the need to invent custom hardware, Atom Computing has already been able to iterate rapidly on its system design. In less than two years, they’ve gone from an empty room to a first-of-its-kind computer capable of trapping 100 atoms, each one a nuclear-spin qubit, with performance levels never before reported in any scalable quantum system.
Innovation Endeavors first partnered with Atom Computing in their seed round, and we are thrilled to now lead their Series A along with our good friends at Venrock and Prelude Ventures. We are excited for Atom Computing to make quantum computing a reality, and we look forward to the discoveries that their quantum systems will enable.