My Life in Quantum Computer Music

Alexis Kirke
Published in The Riff · 10 min read · Mar 31, 2021


Preparing for a performance of Cloud Chamber at California Academy of Sciences

This is the story of how I went from reading books on quarks and cosmology as a young teen to duetting with cosmic rays, and then with quantum computer AIs via teleportation. It is a journey that has taken me through the outer reaches of my own imagination: performing for some of the world’s most prominent theoretical physicists, being shown the remains of the very machines that discovered quarks in government labs, and having my violin music played by Radiohead’s violinist in front of a high-security, hyper-radioactive neutron bombardment chamber.

A long long time ago, in a galaxy far far away, I was not a very enthusiastic schoolboy. Then…when I was 15…I came across a book on particle physics left on a teacher’s desk. A dizzying array of quarks and quantum chromodynamics; just the names dazzled me! This led to the teacher lending me another book, on supersymmetry. (Which led on to the rather embarrassing incident at 16 years old where I was banned from a local university library for trying to smuggle out a quantum physics textbook.)

I remember first reading about the quantum vacuum state, in which the energy-time uncertainty principle leads to the creation and destruction of vast numbers of invisible particles and anti-particles on vanishingly small time-scales. And they were all around me.

I was thrilled to discover that a device existed that could make subatomic particles visible!

Preparatory Steps

John Matthias at the Premiere of Cloud Chamber (2011) — the chamber projection is on the rear wall.

As a precursor to my later, more truly quantum music, “Cloud Chamber” was a live performance I did in 2011 with violinist John Matthias (who played the strings on Radiohead’s “The Bends”). I created and composed “Cloud Chamber” to make the invisible quantum world visible, with the violinist and subatomic particle tracks duetting together for an audience. An instrument was developed (with the help of Nick Fry, Antonino Chiaramonte, Anna R. Troisi and Eduardo R. Miranda) which could be “played” live by atomic particles.

Electronic circuitry was developed which enabled a violinist to use their instrument live to create a physical force field that directly affected the ions generated by the particles. This enabled the violinist and the ions to influence each other musically. The whole performance was based around a special piece of equipment which made neutrons, muons etc visible in bright white tracks moving in a glass chamber. These tracks were projected onto a large screen for the audience.

The California performance of Cloud Chamber

The chamber was placed on stage, saturated with ethanol and cooled by liquid nitrogen, which made the cosmic radiation visible. John Matthias played a violin connected electrically to the glass chamber, so the music of the violin was both heard by the audience and “heard” by the ion patterns the cosmic rays generated in the glass chamber.

If John played in one way, the ion particles behaved in a certain way; if he played in a sufficiently different way, the particles behaved differently. These changes in particle movement affected the electronic sounds that the radiation generated through its own special instrument; thus there was a live musical interaction between the atomic world and the violinist during the 15-minute, partially-scored performance.

The basic set-up for a cloud chamber

The score itself was based on the quark structure of bosons and on data provided by the neutron bombardment accelerator at the Rutherford Appleton Laboratory. Performances were also given at the Rutherford Appleton Laboratory with Ben Heaney on electric violin, and at the Lepton-Photon Conference banquet at the California Academy of Sciences with Alina Polonsk on violin.

My First Entanglement

Entanglement Diagram for Photonic Quantum Gate at Bristol University (2014)

My next investigation began when I discovered the first cloud-accessible quantum computer, set up at the Bristol Photonics Lab by Pete Shadbolt (now at Imperial College). The quantum computer could be used to create a state of entanglement in a controlled way. Such an idea seemed so exciting to me at the time (it’s fairly easy to do online now, 7 years later!). One other fascinating element of the Bristol computer was that it used photons — light particles — to do its calculations.

Representation of the Bristol Photonic Quantum Computer

I used the orchestra simulator in Logic Pro to generate music based on entanglement performed in this quantum system. Four melodic lines are generated using a simulated version of the computer accessible at the Bristol Centre for Quantum Photonics. The four lines start on a string quartet, with the cello line’s pitch based on the Bell CHSH value (which measures the entanglement level of the photon outputs). For a given Bell CHSH value (lasting 5 bars), the other lines’ pitches are calculated from: (a) the correlation values that sum to give the Bell CHSH; (b) a sample of the photon coincidence counts that sum to give the correlations.
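To make the mapping concrete, here is a hypothetical sketch of how such a scheme could work (the function names and pitch ranges are my own invention, not the original Logic Pro patch): the CHSH value sets the cello pitch for a 5-bar block, and each of the four correlation values sets one of the upper lines.

```python
def scale_to_midi(x, lo, hi, low_note=36, high_note=60):
    """Linearly map x in [lo, hi] onto an integer MIDI pitch range."""
    frac = (x - lo) / (hi - lo)
    return round(low_note + frac * (high_note - low_note))

def block_pitches(correlations):
    """Pitches for one 5-bar block, from four correlation values."""
    s = sum(correlations)                    # the Bell CHSH value
    cello = scale_to_midi(s, -2.83, 2.83)    # |S| <= 2*sqrt(2) ~ 2.83
    upper = [scale_to_midi(e, -1.0, 1.0, 60, 84) for e in correlations]
    return cello, upper

# near-maximal entanglement: each correlation close to 1/sqrt(2)
cello, upper = block_pitches([0.707, 0.707, 0.707, 0.707])
```

At near-maximal entanglement the cello is pushed to the top of its range, so the harmonic register of the whole block tracks the entanglement level.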

The arrangement algorithm adds new instruments as photon entanglement increases: the classical limit is breached when the double bass comes in, and maximum entanglement between the photons is reached when the trumpet comes in at the end. You can check it out here:

The concepts of the Bell CHSH value and the correlations are mathematically rather complex to explain. A very simplified view is that it is possible to create a joint quantum system and then separate its elements. In classical physics, the result would be two particles, each obeying its own laws of behaviour. In quantum mechanics, however, the two particles remain a joint quantum system. So, if set up in the right way, they can instantaneously influence each other’s behaviour (to a degree). This is entanglement. Einstein believed the possibility of such a joint quantum system proved that quantum mechanics was flawed. Years later it was shown that such systems exist, and that Einstein was wrong!
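For readers who want the numbers: in the standard textbook treatment (this is general physics, not code from the piece), the correlation for a spin-singlet pair measured at angles x and y is E(x, y) = -cos(x - y). Classical local models bound |S| by 2; quantum mechanics reaches 2·sqrt(2). A worked check with the usual angle choices:

```python
import math

# Quantum-mechanical correlation for a spin-singlet pair measured
# at angles x and y (standard result, not code from the performance).
def E(x, y):
    return -math.cos(x - y)

# the textbook angle choices that maximise the quantum violation
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

# the Bell CHSH combination of the four correlations
S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
# |S| comes out at 2*sqrt(2) ~ 2.83, beyond the classical bound of 2
```

Any |S| above 2 is experimental proof that no local hidden-variable model can explain the measurements, which is exactly the "classical limit" breached in the piece when the double bass enters.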

Quantum and In-Control

One frustration for me was that the actual hardware version of the Bristol QC was taken offline before I could do very much, so I had to work with a simulator. I was still keen to create real hardware entanglement and superposition. There were no gate-based quantum computers available at the time that I could access. But I did manage to get hold of a controlled quantum device — a Quantum Annealer.

I made contact with Geordie Rose, the founder of D-Wave. This was a fantastic opportunity for me in 2015: a composer who could get controlled access to quantum processes in real-time! The D-Waves are more accurately described as quantum optimizers than quantum computers, and I know there has been debate about their relationship to gate-based quantum computing and quantum advantage. But two things were for sure: they were an amazing engineering achievement, and I could use their API to send commands to set up quantum states of superposition and entanglement in actual hardware!

The campus of USC (University of Southern California)

I had a two-week residency at the University of Southern California in Los Angeles to work with Prof Daniel Lidar on their D-Wave 2X. I developed a simple algorithm that allowed the D-Wave — when sent a melody — to generate musical chords. I called it qHarmony. Shortly afterwards I was sitting in my front room in Plymouth and triggered the first live generation of harmonies.

This was a wonderful moment for me: hearing harmonies generated live by controlled quantum processes. In addition, the chords were built up into “superchords”. Because it was a quantum device, the D-Wave would return multiple possible chords for harmonizing a particular melody. I then piled these chords on top of each other to make larger chords, representing the quantum nature of the D-Wave. Here it is:
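The stacking step itself is simple. Here is a hypothetical sketch (my own code, not qHarmony): each low-energy solution from the annealer decodes to a candidate chord, and merging their pitch sets gives one superchord that reflects the multiple quantum outcomes.

```python
def superchord(candidate_chords):
    """Merge several candidate chords into one sorted set of pitches."""
    pitches = set()
    for chord in candidate_chords:
        pitches.update(chord)  # stack the chords, dropping duplicates
    return sorted(pitches)

# three chords an annealer might return for one melody note
# (C major, A minor, F major as MIDI pitches around middle C)
chords = [[60, 64, 67], [57, 60, 64], [53, 57, 60]]
result = superchord(chords)
```

Shared pitches between the candidate chords appear only once, so the superchord is a compact union of all the harmonizations rather than a literal pile-up.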

This inspired a great deal of creativity in me, and led to a performance by celebrated mezzo-soprano Juliette Pochin (of Opera Idol and Proms in the Park fame) in the first live quantum music performance — at the Port Eliot Music Festival, linked up live with the USC D-Wave. The video below describes the processes involved. The score Juliette sang was inspired by the tragic Greek mythological figure Niobe, after whom niobium — the superconducting metal used in the D-Wave qubits — was named.

Video about the first live quantum music performance with Juliette Pochin

I had one more flirtation with the D-Wave 2X a few years later: I connected two people wearing EEG braincaps (brainwave monitors) to the D-Wave machine and entangled their brainwave readings to generate a sonic performance.

Gate-based Quantum Computers

In 2016, I was thrilled to join IBM’s early quantum computer access scheme. Finally, I could use actual gate-based quantum computers: the types of QCs that governments and funders around the world were pouring billions into. Finally I could play around with their hardware, as opposed to the simulator I had used for the Bristol photonic machine.

My first stumbling music system, called qMEL, was published in 2018. It could produce very simple melodies. I immediately combined it with my D-Wave system qHarmony to create qGen: qMEL would generate a melody on the ibmqx4 machine and qHarmony would add harmonies on the D-Wave 2X.

Quantum circuits for qgMuse implementing the rule (LI(t−1) ⊕ DC(t))′ · LI(t−1)′ · LI(t)′ = 1 on the ibmqx4

The qMEL system itself did not take advantage of the super-speed potential of gate-based quantum computers. So I then implemented a version of Grover’s algorithm on the ibmqx4 (inspired by a presentation by Yidong Liao). Grover’s algorithm is one of the four “killer apps” for quantum computing. My Grover-based system — qgMuse — used a couple of simple music rules to generate melodies. In this case the Grover algorithm wasn’t any faster than the classical version, but for much more complex musical problems it would be quadratically faster than a traditional computer algorithm.
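To give a flavour of the mechanics, here is a pure-Python sketch of the Grover arithmetic on a toy problem (my own code with a made-up rule, not the actual qgMuse circuits, which ran on IBM hardware). Each 3-bit string encodes a melody fragment, where bit i says whether step i moves up (1) or down (0), and the oracle marks fragments satisfying a toy rule.

```python
import math

def good_melody(bits):
    # toy rule: the melody must strictly alternate up and down steps
    return all(bits[i] != bits[i + 1] for i in range(len(bits) - 1))

n = 3
N = 2 ** n
labels = [format(i, "03b") for i in range(N)]
marked = [good_melody(lab) for lab in labels]

# start in the uniform superposition over all melody fragments
amps = [1 / math.sqrt(N)] * N

for _ in range(1):  # ~ (pi/4) * sqrt(N/M) iterations; M = 2 here, so 1
    # oracle: flip the sign of the amplitudes of rule-satisfying melodies
    amps = [-a if m else a for a, m in zip(amps, marked)]
    # diffusion operator: reflect every amplitude about the mean
    mean = sum(amps) / N
    amps = [2 * mean - a for a in amps]

# measurement probabilities now concentrate on the "good" melodies
probs = {lab: round(a * a, 3) for lab, a in zip(labels, amps)}
```

After a single iteration the two alternating fragments, 010 and 101, each carry probability 0.5 and every rule-breaking fragment drops to zero; on hardware, this shows up as which melody the measurement returns.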

In 2019 I had a bit of fun using the Grover qgMuse and my old D-Wave harmony algorithm. I made a dance music track using the algorithms:

I did another dance track after attending the first IBM Qiskit camp in 2019. IBM threw a party at the big APS (American Physical Society) conference and the DJ dropped my track! It’s called Qubit Snapshot, and it sonifies the Bloch sphere rotations in a quantum circuit running on an IBM machine:
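For the curious, here is a hypothetical sketch of one way such a sonification could work (the mapping and names below are my own invention, not the actual Qubit Snapshot patch): a single-qubit state cos(θ/2)|0⟩ + e^(iφ)sin(θ/2)|1⟩ is reduced to its Bloch angles, with the polar angle θ picking a MIDI pitch and the azimuthal angle φ picking the stereo pan.

```python
import math

def bloch_to_note(theta, phi):
    """Map Bloch angles to a (MIDI pitch, stereo pan) pair."""
    pitch = 48 + round(24 * theta / math.pi)    # MIDI 48..72
    pan = (phi % (2 * math.pi)) / (2 * math.pi) # 0.0 (left) .. 1.0 (right)
    return pitch, round(pan, 2)

# sweep theta from |0> (north pole) down to |1> (south pole)
notes = [bloch_to_note(k * math.pi / 4, 0.0) for k in range(5)]
```

A rotation gate in the circuit then becomes an audible glide: as the state moves from pole to pole the pitch climbs two octaves, and phase rotations sweep the sound across the stereo field.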

And Finally…Teleportation

Probably the research paper I’m most proud of was the one that led to a “world first” in May 2019: the first live performance with a gate-based quantum computer. The basic idea was to allow a whole bunch of the qgMuse algorithms from earlier to communicate with each other, with a human, and with a classical computer music algorithm, and for all of them to jam in parallel to create music.

I was able to do this when I was given priority access to an IBM quantum computer with 14 qubits (unlike the 5-qubit ibmqx4). One problem was that quantum information cannot be transmitted over normal wires; the only way to transmit a quantum state intact is via quantum teleportation. So I implemented a quantum teleportation scheme to allow the two qgMuse “agents” to communicate. Each agent had a slightly different musical style, based on the Grover problem it had to solve.
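To show what teleportation involves, here is a minimal pure-Python statevector simulation of the standard teleportation circuit (my own sketch with assumed names, not the system from the paper), using the deferred-measurement trick so the corrections become controlled gates instead of classically-signalled measurement results.

```python
import math

N_QUBITS = 3  # qubit 0 is the most significant bit of the state index

def apply_h(state, q):
    """Hadamard gate on qubit q."""
    out = list(state)
    mask = 1 << (N_QUBITS - 1 - q)
    inv = 1 / math.sqrt(2)
    for i in range(len(state)):
        if not i & mask:
            a0, a1 = state[i], state[i | mask]
            out[i] = inv * (a0 + a1)
            out[i | mask] = inv * (a0 - a1)
    return out

def apply_cnot(state, c, t):
    """CNOT with control qubit c and target qubit t."""
    cm = 1 << (N_QUBITS - 1 - c)
    tm = 1 << (N_QUBITS - 1 - t)
    return [state[i ^ tm] if i & cm else state[i] for i in range(len(state))]

def apply_cz(state, c, t):
    """Controlled-Z between qubits c and t."""
    cm = 1 << (N_QUBITS - 1 - c)
    tm = 1 << (N_QUBITS - 1 - t)
    return [-a if (i & cm and i & tm) else a for i, a in enumerate(state)]

# message state on qubit 0: 0.6|0> + 0.8|1>
a, b = 0.6, 0.8
state = [0.0] * 8
state[0b000], state[0b100] = a, b

state = apply_h(state, 1)        # Bell pair on qubits 1 and 2
state = apply_cnot(state, 1, 2)
state = apply_cnot(state, 0, 1)  # teleportation circuit proper
state = apply_h(state, 0)
state = apply_cnot(state, 1, 2)  # deferred-measurement corrections
state = apply_cz(state, 0, 2)

# qubit 2 now holds the message: P(q2 = 1) should equal b**2 = 0.64
p1 = sum(amp * amp for i, amp in enumerate(state) if i & 1)
```

No amplitude of the message ever travels down a classical channel; only the two measurement bits would, which is what makes teleportation the workable way to pass a quantum state between agents.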

qHMAS — my Quantum Hybrid Multi-agent Architecture

The resulting jamming experience was recorded here. I was the human piano player, and I used variations on themes from Game of Thrones in my playing (hence the performance’s title):

As an aside, this performance was published as an episode of my podcast “My Quantum Computer Wrote a Podcast”. I recorded various episodes while running algorithms live on the IBM Melbourne Q14 to demonstrate various quantum principles. My favourite is the one where I rap the math proof of the quantum teleportation algorithm over some dance music I created:

I’m sure that as the years go on, much of my work in quantum computer music will seem primitive. When I was doing it there were very limited algorithms available on quantum hardware, so I had to learn and write most of them myself.

But it has been one of the most rewarding research topics and composing inspirations of my career. I hope my past work has helped to lay a foundation for a rich future in quantum computer music.

References

Kirke, A., Miranda, E., Chiaramonte, A., Troisi, A. R., Matthias, J., Fry, N., … & Bull, M. (2013). “Cloud Chamber: A Performance with Real Time Two-Way Interaction Between Subatomic Particles and Violinist.” Leonardo, 46(1), 84–85.

Kirke, A., Shadbolt, P., Neville, A., & Miranda, E. R. (2015). “A Hybrid Computer Case Study for Unconventional Virtual Computing.” International Journal of Unconventional Computing, 11(3–4), 205–226.

Kirke, A. (2018). “Programming gate-based hardware quantum computers for music.” Музикологија/Musicology, (24), 21–37.

Kirke, A. (2019). “Applying Quantum Hardware to non-Scientific Problems: Grover’s Algorithm and Rule-based Algorithmic Music Composition.” International Journal of Unconventional Computing, 14(3–4). Old City Publishing.

Kirke, A. (2020). “Testing a hybrid hardware quantum multi-agent system architecture that utilizes the quantum speed advantage for interactive computer music.” Journal of New Music Research, 49(3), 209–230.



Alexis Kirke is a screenwriter and quantum/AI programmer. He has PhDs from an arts faculty and from a science faculty. http://www.alexiskirke.com