Making light of quantum machine learning

Emojis, light music, and other applications of quantum neural networks 🔥 🔥 🔥

By Juan Miguel Arrazola, Thomas R. Bromley, Josh Izaac, and Nathan Killoran

At Xanadu, we’re working hard to make quantum computing and quantum machine learning a reality. We’ve recently developed a new light-based quantum neural network scheme which can be used to tackle challenging scientific and technical problems. But after a hard day’s work, sometimes you just want to let loose and get creative.

This blog post will be all about having fun with our new toys. Let’s start with a little “light music”: grab your headphones!

Making music with quantum light

Quantum neural networks can be used to transform simple quantum states into more complex ones. In our recent paper, we trained quantum circuits that could convert laser light into states of a fixed number of photons. By appropriately selecting the brightness and phase of the incoming light, we can generate states of one, two, or three photons. By instead selecting the brightness and phase inappropriately, we can generate completely new kinds of quantum states, which can be interpreted as musical instruments.

Quantum instruments.

How does a quantum state make a musical instrument? The frequency of a wave determines its pitch, i.e., whether it sounds like the note C or the note A. The shape of a sound wave dictates its timbre: the difference between a piano and an electric guitar playing the same note.

Each quantum state of light is characterized by a unique wavefunction whose shape determines the timbre of its associated instrument. We can ‘play’ a quantum state by generating a sound wave built by repeating the shape of the wavefunction at a desired frequency.
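Here's a minimal sketch of that idea in Python. The function name and the wavefunction samples are illustrative, not taken from our codebase: we take a 1D array sampling a state's wavefunction, and tile it at a chosen pitch to build an audio waveform whose timbre comes from the wavefunction's shape.

```python
import numpy as np

def play_wavefunction(psi, pitch_hz=220.0, duration=2.0, sample_rate=44100):
    """Tile one period of a (real) wavefunction into an audio waveform.

    `psi` is a 1D array sampling the wavefunction; its shape sets the
    timbre, while `pitch_hz` sets the note (220 Hz is the A below middle C).
    """
    period = np.asarray(psi, dtype=float).copy()
    period -= period.mean()              # remove DC offset so the speaker sits at rest
    peak = np.max(np.abs(period))
    if peak > 0:
        period /= peak                   # normalize amplitude to [-1, 1]

    # resample one period of the wavefunction to the target pitch
    samples_per_period = int(sample_rate / pitch_hz)
    x_old = np.linspace(0, 1, len(period))
    x_new = np.linspace(0, 1, samples_per_period)
    one_cycle = np.interp(x_new, x_old, period)

    n_cycles = int(duration * pitch_hz)
    return np.tile(one_cycle, n_cycles)  # repeat the shape to sustain the note

# Example: the single-photon wavefunction is proportional to x * exp(-x^2 / 2)
x = np.linspace(-4, 4, 256)
wave = play_wavefunction(x * np.exp(-x**2 / 2), pitch_hz=220.0)
```

The resulting array can be written to a WAV file or sent to a sound device; two states with different wavefunction shapes played at the same pitch will sound like different instruments.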

Our quantum neural network can therefore be used to discover new instruments: by interpolating between the wavefunctions of known quantum states, we can discover new wavefunctions, and thus new instruments.
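The interpolation step can be sketched in a few lines, again with illustrative wavefunctions rather than our actual trained circuits: blend two wavefunctions linearly and renormalize, so that every intermediate value of the mixing parameter yields a valid new shape to play.

```python
import numpy as np

def interpolate_states(psi_a, psi_b, t):
    """Blend two wavefunctions: t=0 gives psi_a, t=1 gives psi_b.

    Intermediate t values give new shapes, i.e. new 'instruments'.
    The result is renormalized so it remains a valid state vector.
    """
    psi = (1 - t) * np.asarray(psi_a, dtype=float) + t * np.asarray(psi_b, dtype=float)
    return psi / np.sqrt(np.sum(np.abs(psi) ** 2))

x = np.linspace(-4, 4, 256)
psi0 = np.exp(-x**2 / 2)          # Gaussian: vacuum-like shape
psi1 = x * np.exp(-x**2 / 2)      # single-photon-like shape
blend = interpolate_states(psi0, psi1, 0.5)   # halfway between the two
```

Sweeping `t` from 0 to 1 morphs one instrument's timbre smoothly into the other's.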

We experimented with this idea by choosing different kinds of input laser light and playing the resulting quantum instruments. After a careful selection process, Juan Miguel presents to you his creations: the Heisenbass, Diraclarinet, and Hilbertsichord.

Heisenbass (left), Diraclarinet (middle), and Hilbertsichord (right)

We started a band, Schrödinger’s Lonely Hearts Club Band. For auditions, the task was simple: play the tune of “Strawberry Fields Forever”.

Here’s Nathan auditioning on the Heisenbass
Josh playing the Diraclarinet
and Tom on the Hilbertsichord

Listen to the three of them auditioning together:

The band was formed and we jammed some songs, recording a couple of audio snippets from these sessions. Hear the band musing about their time developing Xanadu’s quantum programming language:

Blackbird

and playing Naaaaaaaaa naaaaaaa naaaaaaa naanaanaaanaaaaaaaaaaaaaa


Tetrominos and Emojis

A quantum neural network can also be trained to generate images by transforming light from two input lasers. At the output, we count how many photons appear in each of the two channels (known as modes) and record these results in a grid. For example, the top-most square in the grid below appears when we detect one photon in the first mode and three photons in the second mode.

Every time the experiment is run, we can add an event to one cell of the grid. Eventually, a pattern builds up, and we are able to turn the grid into an image. What images shall we produce?
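As a toy sketch of this accumulation process (with random numbers standing in for the real photon detections, which in practice come from the trained circuit), each run of the experiment contributes one event to a cell of the grid, and the normalized counts become pixel brightnesses:

```python
import numpy as np

# Stand-in for repeated experimental runs: each row is a sample of
# (photons counted in mode 1, photons counted in mode 2).
rng = np.random.default_rng(0)
cutoff = 8                                    # max photon number per mode
samples = rng.integers(0, cutoff, size=(10_000, 2))

# Accumulate one event per run into the corresponding grid cell.
grid = np.zeros((cutoff, cutoff), dtype=int)
for n1, n2 in samples:
    grid[n1, n2] += 1

# Normalized counts give the relative brightness of each pixel.
image = grid / grid.max()
```

With enough runs, cells that the circuit outputs more often grow brighter, and the grid converges to the target picture.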

To start off, we trained our quantum circuit to output Tetris blocks. The goal was to output a different Tetris block for each choice of input to the circuit. You can imagine this as an unnecessarily complicated mechanism for generating the blocks in a game of Tetris — perhaps the first quantum Rube Goldberg machine.

Take a look at the results:

Did you know: the blocks in Tetris are called Tetrominos?

Let’s take a closer look at how the images are generated.

On the top, we see a simple target image. On the bottom, we show how the same image is built up from the output of the quantum network.

For each frame in the animation, an output is sampled from the quantum device, and the relative brightness of the corresponding pixel is increased slightly. Over time, these detections build up a pattern which is nearly identical to the desired image.

In fact, images are represented by encoding the intensity of pixels into the amplitudes of the quantum state. This uses a quantum property called superposition. You might have heard of this phenomenon before — it is the central premise of the Schrödinger’s cat thought experiment.
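A minimal sketch of amplitude encoding, using a toy 2×2 image (not one of our trained examples): the pixel intensities are normalized into the amplitudes of a state vector, and the measurement probabilities of that superposition reproduce the image.

```python
import numpy as np

# Toy 2x2 grayscale image; intensities become state amplitudes.
pixels = np.array([[0.0, 0.5],
                   [1.0, 0.5]])

amplitudes = pixels.flatten()
amplitudes = amplitudes / np.linalg.norm(amplitudes)   # states must have norm 1

# The probability of each detection outcome is the squared amplitude,
# so sampling the state many times rebuilds the image: brighter pixels
# correspond to outcomes that are detected more often.
probs = np.abs(amplitudes) ** 2
recovered = probs.reshape(pixels.shape)
```

The brightest pixel (intensity 1.0) is four times as likely to be detected as a pixel of intensity 0.5, which is why the image emerges from the counts over many runs.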

The image below shows the result of training our quantum neural network to produce the cat emoji 🐱. What better way to give a nod to Schrödinger!


All of these examples were created using our quantum software library Strawberry Fields, and are available in our Quantum Neural Network GitHub repository. Download it and see what fun things you can create!