If Santa Had A Quantum Computer

Patrick Pallagi
7 min read · Dec 8, 2022


Image Credit: Dream by WOMBO / dream.ai

The longest list I can imagine is Santa’s list of gifts for children. I wonder just how long that could be! That got me thinking about a broader question: how long can a list become? And the exercise I find really interesting is to tackle the sort of question that sits right on the edge of quantum physics and computer science:

I would like to know whether it is possible to hold more items in an array or a list than there are atoms in our universe. Could a quantum computer really do that?

First, we’re going to start with Wolfram Alpha!

Wolfram|Alpha is a computational knowledge engine for computing answers and providing knowledge. It is a web-based service that runs on a large centralized compute cluster. It is the creation of Stephen Wolfram and his team, and it is seen primarily as a language tool that helps translate questions, mainly in mathematics and physics, into computable answers.

When I use Wolfram Alpha to estimate the number of atoms in our Universe, it tells me it is roughly 6 × 10⁷⁹, which I often round up to 10⁸⁰.

My nephew asked me: “Isn’t the computer going to have trouble calculating numbers bigger than the number of atoms in the Universe?”

And the answer is: “Yes and no.” The Wolfram Language works with mathematical representations of numbers rather than physical tokens. To illustrate this, think about the human brain.

How many brain cells do we have?

Humans have an average of 86 billion neurons.

Does this stop us from counting to 86 billion plus one? No! We can count from zero all the way up to bigger and bigger numbers, because we can represent these numbers and voice them in order. We can also write bigger numbers down, which is another way of representing numbers. So we can conclude that our finite memory does not limit how big numbers can get in our representation. Numbers can become as big as we can imagine them to be.

What I’m interested in is the biggest possible size of an array on a quantum computer, at least in theory: in other words, how many items we can confidently hold at the same time in one system.

One great example is Dave Farrow, who memorized, on a single sighting, a random sequence of 59 separate packs of cards (3,068 cards) at CTV Studios in Toronto, Canada on 2 April 2007. The attempt took 4 hours 58 minutes and 20 seconds, not including breaks.

Another great example is Akira Haraguchi, who in 2006 recited 100,000 digits of pi from memory at a public event near Tokyo. It took him 16 hours and 30 minutes.
This feat makes him the master pi-man, even though Guinness World Records has not validated his record.

So the question is still being asked:
Could we memorize more items than the number of atoms in our universe?

It seems as though, if we think solely about mathematical representation, we know numbers well enough that by counting integers up from zero we could surely reach a number bigger than the number of atoms in this universe.

Assuming we had started counting integers at the Big Bang, we would still have had only a brief amount of time. Since the observable Universe is about 4.3 × 10¹⁷ seconds young, we would have to budget all 6 × 10⁷⁹ integers into that time frame with no breaks. How many integers would we have to count per second? About 1.39 × 10⁶² counts per second. This just goes to show that we are living in a young universe.
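Here is that back-of-the-envelope arithmetic as a small Python sketch, using the rounded estimates above:

```python
# Back-of-the-envelope: how fast would we have to count?
atoms_in_universe = 6e79       # rough Wolfram Alpha estimate used above
age_of_universe_s = 4.3e17     # age of the observable Universe in seconds

counts_per_second = atoms_in_universe / age_of_universe_s
print(f"{counts_per_second:.2e} integers per second")  # ~1.40e+62
```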

So now I think we are convinced that it would be pretty extraordinary to be able to hold more items in memory than the number of atoms in our universe.

And yet, the question becomes even more interesting when we think about quantum computing. My intuition tells me it is worth having a look at a qubit-based system that might be able to hold just the right number of items to confirm our hunch.

To hold a number we need a bit. A bit can be a zero or a one, and it is a great invention that can encode a lot of things when put into a series. A series of bits makes up the code that travels across the ocean to connect you with other people who love poetry and music just as you do. The language of computers is written in bits. A byte is a data unit comprising 8 bits, roughly enough to encode a single letter in one of the words you’re reading now. Bits define structures around us for computers to use and build on top of.
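As a tiny illustration (just a sketch, nothing more), here is one letter expressed as its 8 bits in Python:

```python
# One ASCII letter fits in a single byte (8 bits).
letter = "S"                       # "S" as in Santa
bits = format(ord(letter), "08b")  # 8-bit binary representation
print(letter, "->", bits)          # S -> 01010011
```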

Do you know about 64-bit vs 32-bit processors and operating systems?

Simply put, a 64-bit processor is more capable than a 32-bit processor because it can handle more data at once. A 64-bit processor can store more computational values, including memory addresses, which means it can access over 4 billion times the physical memory of a 32-bit processor. That’s just as big as it sounds.
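The arithmetic behind that claim is easy to verify; a small sketch:

```python
# Address space of 32-bit vs 64-bit processors
addresses_32 = 2 ** 32  # 4,294,967,296 distinct addresses
addresses_64 = 2 ** 64  # 18,446,744,073,709,551,616 distinct addresses

print(f"{addresses_32:,}")
print(f"{addresses_64:,}")
print(f"ratio: {addresses_64 // addresses_32:,}")  # over 4 billion times as many
```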

So now the next question comes instinctively:
What is the difference between a 32-qubit processor and a 64-qubit processor?

To get to the answer we will take a look at Holevo’s theorem, an important limitative theorem in quantum computing.

In essence, the Holevo bound proves that although n qubits can “carry” a larger amount of (classical) information (thanks to quantum superposition), the amount of classical information that can be retrieved, i.e. accessed, is at most n classical (non-quantum encoded) bits. It has furthermore been established, both theoretically and experimentally, that there are computations where qubits do carry more information through the computation than is possible classically. This is surprising for two reasons: (1) quantum computing is so often more powerful than classical computing that results showing it to be only as good as, or inferior to, conventional techniques are unusual, and (2) it takes 2^n complex numbers to encode the qubits that, in the end, represent a mere n bits.

So how come we are living through the quantum revolution?

The thing we should understand is that qubits are far more powerful than bits, but in the end you can only extract a small amount of information from them.
Theoretically, you could store all of Shakespeare’s works on a single qubit. But then you ask the qubit a single yes/no question about Shakespeare’s works (“are there more ‘e’s than ‘s’s in the corpus?”), and it forgets everything it knew about Shakespeare except the answer to that single question.

So a quantum chip is very good at running a certain kind of selective algorithm: one that, in effect, sifts through all the possibilities it holds and picks out the one you care about.

Here is a table comparing classical and quantum bits at different scales. And that’s part of why many people have made the claim that quantum computers are for specific questions only.

When you measure the system, the output is only as many bits as there are logical qubits in it. But when you prepare the dataset in superposition, you can create as many items as two to the power of the number of logical qubits.
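Here is a minimal sketch of that relationship in Python (just arithmetic, no quantum hardware or library involved):

```python
# n logical qubits: a measurement yields n classical bits,
# but the superposition can span 2**n basis states ("items").
for n in (1, 8, 24, 33, 64):
    print(f"{n:>3} qubits -> {n} measured bits, up to {2 ** n:,} items in superposition")
```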

And if you want to find the piece of data you are looking for, a quantum search can get you there far faster than checking every entry one by one: Grover’s algorithm needs only about the square root of the number of lookups a classical exhaustive search would take. The result is the single item you were looking for, and your search query is complete.

So finding out whether you are on a given list or not is easier with a quantum computer.

Let’s say you want to know before this Christmas whether your gifts are on Santa’s list or not.

Here’s a comparison of how much information a qubit and a classical bit hold.

So for New York City, with a population of around 8.8 million people, Santa would need 8.8 million bits, which is roughly a megabyte of storage. And if he decides to use his quantum computer, he would need only 24 qubits.

More surprisingly, for Santa’s list to include literally all eight billion people on Earth, he would need only 33 qubits.
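A quick sanity check of those counts, sketched in Python (the helper name qubits_needed and the rounded population figures are mine, just for illustration):

```python
import math

# Smallest n such that 2**n basis states can index every person on the list
def qubits_needed(list_length: int) -> int:
    return math.ceil(math.log2(list_length))

print(qubits_needed(8_800_000))      # New York City -> 24
print(qubits_needed(8_000_000_000))  # Earth         -> 33
```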

So back to our original question:
What is the difference between a 32-qubit processor and a 64-qubit processor?

Look at it this way! With 33 qubits Santa can make room on that list for the entire population of our planet.

With 64 qubits he would be able to create a list that is literally 2 billion times that long.
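That “2 billion times” figure is simply the ratio of the two state-space sizes; a one-line check:

```python
# 2**64 basis states vs 2**33 basis states
print(f"{2 ** 64 // 2 ** 33:,}")  # 2,147,483,648 -> roughly 2 billion times longer
```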

That’s how powerful this new tech is!

Remember what I said in the beginning? I would like to know whether it is possible to hold more items in an array than there are atoms in our universe. Could a quantum computer really do that?

So now we see that the answer is, hypothetically, yes.

And it would take only 266 qubits.
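The same kind of check as before, now for the atom count (again just a sketch, using the rounded Wolfram Alpha estimate from earlier and exact integer arithmetic to avoid rounding errors):

```python
# How many qubits give at least as many basis states as atoms in the Universe?
atoms = 6 * 10 ** 79               # rough estimate from Wolfram Alpha
qubits = (atoms - 1).bit_length()  # smallest n with 2**n >= atoms
print(qubits)                      # 266
```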

What a list that would be!

And with that said I wish you Merry Christmas and a Happy New Year!

Thank you for reading!
