The Quantum Data Center — What it is and Why We’re Building It.

The Quantum Data Center… sounds very buzzwordy, but it’s actually far more complicated than just plugging an Ethernet cord into a cryogenic fridge.

Dominik Andrzejczuk
The Quantum Data Center
5 min read · Oct 5, 2021

--

How exactly do you take classical data, translate it into quantum instructions, send those instructions to a QPU, receive quantum information back in less than a microsecond, translate it back into classical information, and display it to the user, all with virtually no margin of error? (If that sounds really hard, that’s because it is.)

The process of converting classical data into quantum data is one of the most overlooked, yet most important, parts of the quantum stack. And by overlooked, I mean something investors are not focusing on enough. Sure, qubit quality is incredibly important, as is the Quantum Processing Unit, but ignoring the control hardware that drives those qubits is like focusing only on a car’s engine while skipping over its transmission.
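To make the classical-to-quantum round trip concrete, here is a minimal sketch in plain Python. It is a toy state-vector simulation, not real control hardware: a classical bit is encoded as a qubit state, a "quantum instruction" (a unitary gate) is applied, and a measurement collapses the result back to a classical bit. All names here are illustrative.

```python
import numpy as np

def encode_bit(bit: int) -> np.ndarray:
    """Encode a classical bit as a qubit state vector: 0 -> |0>, 1 -> |1>."""
    return np.array([1.0, 0.0]) if bit == 0 else np.array([0.0, 1.0])

def apply_gate(state: np.ndarray, gate: np.ndarray) -> np.ndarray:
    """A 'quantum instruction': a unitary matrix acting on the state."""
    return gate @ state

def measure(state: np.ndarray, rng: np.random.Generator) -> int:
    """Back to classical information: sample an outcome from |amplitude|^2."""
    probs = np.abs(state) ** 2
    return int(rng.choice([0, 1], p=probs / probs.sum()))

X = np.array([[0.0, 1.0], [1.0, 0.0]])  # quantum NOT gate

rng = np.random.default_rng(0)
out = measure(apply_gate(encode_bit(0), X), rng)
print(out)  # a NOT applied to |0> always measures 1
```

On a real machine, every one of these steps passes through the control rack discussed below, which is where the hard engineering lives.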

In order to build the world’s first Quantum Data Center, we must first analyze why control systems are such an important component.

Quantum Control Systems

Photo Courtesy of IBM Quantum

Take a look at the image above: in the red box you have the quantum processor, or QPU for short, and in the yellow box an entire rack of classical modules. This rack is the QPU’s control system and, in many ways, is as important as the QPU itself. This monstrosity of a system is what converts classical data into quantum data and vice versa. Without it, good luck plugging your quantum computer into an Ethernet jack.

I wrote an article several weeks ago about how noise and qubit quality affect overall quantum computing performance. Much of that noise can actually be addressed through these control systems.

We need to develop new ASICs that are tailor-made for sending and receiving quantum information more effectively, and new control algorithms to further reduce instabilities in magnetic fields, external disturbances, laser noise, etc.
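To give a flavor of what such a control algorithm does, here is a toy proportional-integral (PI) feedback loop that holds a drifting parameter (think of a laser detuning or a magnetic field offset) near its setpoint. The drift model and gains are invented for illustration; they are not taken from any real ARTIQ/SINARA deployment.

```python
def stabilize(drift_per_step=0.05, kp=0.8, ki=0.2, steps=200):
    """Toy PI loop: the plant drifts each step; the controller pushes back."""
    error_integral = 0.0
    value = 0.0        # measured offset from the setpoint (we want it at 0)
    correction = 0.0
    history = []
    for _ in range(steps):
        value += drift_per_step + correction  # plant: constant drift + our correction
        error = -value                        # setpoint is 0
        error_integral += error
        correction = kp * error + ki * error_integral
        history.append(value)
    return history

trace = stabilize()
print(abs(trace[-1]))  # residual offset is tiny after the loop settles
```

Without the correction term, the offset would grow linearly forever (0.05 × 200 = 10 here); with it, the integral term learns to cancel the drift. Real control systems run loops like this at MHz rates, in hardware, across many channels at once.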

We also need this rack so the rest of your data center can actually talk to a quantum computer.

ARTIQ & SINARA — Open Source Control Platform

ARTIQ (Advanced Real-Time Infrastructure for Quantum physics) & SINARA are leading-edge control and data acquisition systems for quantum information experiments. Many laboratories around the world have adopted ARTIQ as their control system, with over a hundred SINARA hardware crates deployed. ARTIQ & SINARA are both open source projects, whose control systems primarily cater to the Ion Trap and Neutral Atom community. They are deployed within institutions such as Universität Innsbruck, University of Oxford, University of Maryland, NIST, MIT and many more.

Within the industry, Oxford Ionics, AQT, ColdQuanta and Atom Computing are among the companies controlling their qubits with these systems. This standardization of control hardware is similar to the standardization of motherboards in classical computers decades ago, and it is essential if large-scale quantum computing is to be realized.

Quantum Computing Should be Plug and Play

With the control hardware slowly being standardized, all that is left is the software layer that defines the interface to the device. In essence, what is required is a “Docker Container” of sorts that allows any user to plug into any quantum device on the ARTIQ SINARA platform. The quantum device can physically reside in the data center, or it can be remote. All the user needs to do is supply the IP address of the device, and voilà, you have a Quantum Data Center at your fingertips.
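As a rough sketch of what that "supply an IP address and go" interface could look like, here is a hypothetical job description that a client library might serialize and ship to a device. The job format, field names, and wire protocol are all invented for illustration; this is not ARTIQ's actual API.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class QuantumJob:
    """Hypothetical job spec a plug-and-play client might send to a device."""
    circuit: str      # program source to run on the device
    shots: int        # number of repetitions to execute
    device_ip: str    # where the (local or remote) control crate is reachable

    def to_payload(self) -> str:
        """Serialize the job for transport over the network."""
        return json.dumps(asdict(self))

job = QuantumJob(circuit="H 0; MEASURE 0", shots=1000, device_ip="10.0.0.42")
payload = job.to_payload()
print(json.loads(payload)["device_ip"])  # 10.0.0.42
```

The point of the sketch: once the control hardware is standardized, the client side can be this thin. Everything device-specific hides behind the IP address.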

Writing Quantum Applications Should be Intuitive

Users who wish to deploy algorithms on such a machine should only be concerned with how well their algorithms are written, not with which parts of the algorithm run on quantum vs. classical devices. Another massive challenge in today’s NISQ (Noisy Intermediate-Scale Quantum) era is that users of quantum computers need to optimize their algorithms so that the instructions are properly distributed to the right hardware. This makes for a poor user experience and a high barrier to entry, even in hybrid quantum-classical regimes. Writing and deploying quantum algorithms needs to be intuitive, so that users can adopt it more quickly.
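A tiny hybrid loop shows what "distributing instructions to the right hardware" means in practice. Here the quantum half is replaced by an exact simulator (the expectation value ⟨Z⟩ of Rx(θ)|0⟩ is cos θ), and the classical half runs gradient descent on it using the parameter-shift rule. In a real NISQ workflow, the user currently has to route these two halves to the right hardware by hand, which is exactly the barrier described above.

```python
import math

def quantum_expectation(theta: float) -> float:
    """Stand-in for a QPU call: <Z> of Rx(theta)|0> is exactly cos(theta)."""
    return math.cos(theta)

def classical_optimizer(theta: float, lr: float = 0.4, steps: int = 100) -> float:
    """Classical half: gradient descent driven by repeated 'QPU' evaluations."""
    for _ in range(steps):
        # parameter-shift gradient: (f(t + pi/2) - f(t - pi/2)) / 2
        grad = (quantum_expectation(theta + math.pi / 2)
                - quantum_expectation(theta - math.pi / 2)) / 2
        theta -= lr * grad
    return theta

theta = classical_optimizer(0.1)
print(round(quantum_expectation(theta), 3))  # close to -1.0, the minimum of <Z>
```

The ideal described in this section is that the user writes only the objective, and the platform decides which calls go to the QPU and which stay on classical silicon.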

In this entire stack, what is paramount is that both hardware and software are meticulously orchestrated, such that noise and errors are suppressed. This leaves a massive opportunity for the creation of intellectual property, on both the hardware and software sides. The entire system, from QPU to software interface, can then be delivered to any data center around the globe. Whether you’re AWS or a quantitative hedge fund with its own bare metal, you will be able to quickly plug and play any quantum device from the ARTIQ SINARA ecosystem, and then take advantage of intuitive quantum applications shipped with these appliances.

This is why it’s so important to start building a Quantum Data Center today. There are endless challenges that need to be solved, but luckily, most of them are engineering challenges. I am therefore happy to announce that we are building one of the world’s first Quantum Data Centers, right here in Warsaw, Poland.

Courtesy of Nvidia — DGX A100 SuperPOD

Together with Nvidia’s DGX SuperPOD, we will be building the data center of the future. The SuperPOD will allow us to simulate qubits and run the classical parts of our applications on hardware optimized for them, all in a sustainable, carbon-neutral way.

For the next 5–10 years, hybrid quantum-classical regimes will be the norm, and having state-of-the-art classical hardware is paramount to providing the best possible developer experience. We will also need to run most of our quantum algorithms on simulators today, given that the majority of quantum devices are not yet ready for production. This is why the Nvidia DGX SuperPOD is an essential component on the path to full-scale commercial quantum computing.
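Simulating qubits classically is conceptually simple, which is why GPU clusters fit the job so well: it is dense linear algebra that grows exponentially with qubit count. A minimal state-vector example, preparing and sampling a two-qubit Bell state, illustrates the workload (at two qubits this is trivial; at 30+ qubits it is exactly the kind of thing a SuperPOD accelerates):

```python
import numpy as np

H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)  # Hadamard gate
I2 = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

state = np.zeros(4)
state[0] = 1.0                          # start in |00>
state = CNOT @ np.kron(H, I2) @ state   # H on qubit 0, then CNOT -> Bell state

probs = np.abs(state) ** 2
rng = np.random.default_rng(1)
samples = rng.choice(4, size=1000, p=probs)
counts = {f"{i:02b}": int((samples == i).sum()) for i in range(4)}
print(counts)  # only '00' and '11' appear, in roughly equal proportion
```

Each additional qubit doubles the state vector, so memory and matrix-vector throughput, not clever algorithms, are the bottleneck. That is precisely the profile GPU hardware is optimized for.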

The establishment of the Quantum Data Center will address almost every aspect of the quantum stack, starting with the QPU and moving all the way up to the User Interface. By working on all aspects of the stack, we will establish an ecosystem here in Warsaw and begin to break ground for what will eventually be known as The Quantum Valley.
