What is Interactive Music?

Teresa Marrin Nakra
2 min read · Jan 23, 2017


For my first contribution to Medium, I thought I should address a topic that is relevant to a course I teach at The College of New Jersey: Interactive Music Programming. In this class, we learn how to write computer code that enables us to interact meaningfully with music. Every semester, my students present their interactive music systems for the public. These performances feature new compositions, improvisations, arrangements of existing works, and delivery systems for music learning.

But before getting into how we make all of this possible, I need to first establish what interactive music is! After all, most music is highly interactive already. So what does it mean to create interactive music? According to Robert Rowe (Interactive Music Systems: Machine Listening and Composing. MIT Press, 1993, page 1): “Interactive computer music systems are those whose behavior changes in response to a musical input.” According to this definition, merely pressing play on your favorite playlist is not interactive. So what are the minimum requirements for interactivity? What counts as responding and changing based on input?

For music to respond flexibly to input, we use the technique of musical parameterization: the creation of abstract, numerical structures from procedural descriptions and algorithms. The better the algorithms one writes, the more intelligent, autonomous, and responsive the resulting system becomes. Ideally, such systems don’t just spit out streams of robotic notes; they can improvise, compose, and perform with sensitivity, fluency, and emotion, functioning at the level of description and cognition that the human needs of the situation demand.
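To make the idea concrete, here is a minimal Python sketch of parameterization: a stream of input notes is reduced to a couple of abstract numbers, and those numbers shape the musical response. The feature names (density, energy) and the mapping itself are my own illustration, not a standard scheme, and real systems would listen to far richer features:

```python
# A toy parameterization: reduce MIDI-style input velocities (0-127)
# to abstract numeric parameters, then map those parameters to a response.

def parameterize(velocities):
    """Reduce a stream of input note velocities to abstract parameters."""
    if not velocities:
        return {"density": 0, "energy": 0.0}
    return {
        "density": len(velocities),                           # how many notes arrived
        "energy": sum(velocities) / len(velocities) / 127.0,  # mean loudness, 0.0-1.0
    }

def respond(params):
    """Map the abstract parameters into a musical response:
    louder input widens the arpeggio, denser input tightens its steps."""
    base = 60                                   # middle C (MIDI note number)
    span = int(params["energy"] * 12)           # up to an octave of spread
    step = max(1, 4 - params["density"] // 4)   # denser input -> smaller steps
    return [base + i for i in range(0, span + 1, step)]

# e.g. respond(parameterize([64, 80, 100])) -> [60, 64]
```

Because the response is computed from parameters rather than fixed in advance, changing the input changes the output: that is the minimum Rowe’s definition asks for.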

Here are some examples of interactive music systems that I admire:

· Zoe Keating (cello & electronics with Max), Escape Artist

· Smule’s Ocarina app for the iPhone

· SmartMusic (by Roger Dannenberg)

· MUSE synthesizer (invented by Marvin Minsky and Ed Fredkin — a device not just for generating sound, but for composing and improvising by creating streams of note patterns, allowing the user to set the parametric relationships via sliders and buttons.)
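The MUSE idea — sliders and buttons setting parametric relationships that generate note streams — can be sketched in a few lines of Python. The four parameter names here are illustrative stand-ins for slider positions, not MUSE’s actual controls:

```python
# A toy parametric note-pattern generator in the spirit of MUSE:
# plain numbers stand in for slider positions.

def pattern(length, interval, start, cycle):
    """Generate a note-number stream from four 'slider' values:
    length   - how many notes to emit
    interval - semitone step between successive notes
    start    - first note number (MIDI-style)
    cycle    - wrap the pattern back after this many semitones
    """
    return [start + (i * interval) % cycle for i in range(length)]

# Moving one "slider" reshapes the entire stream:
# pattern(8, 3, 60, 12) -> [60, 63, 66, 69, 60, 63, 66, 69]
```

A small set of parameters like this yields a surprisingly large space of patterns, which is exactly what makes slider-driven composition engaging to explore.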

As our development platform, we use Cycling ’74 Max, MakeyMakey kits, and a range of other sensors, controllers, and devices to help us discover the possibilities for making interactive, responsive music that we all can enjoy! Here’s a video of one of our Music to the Max class performances. Stay tuned…and keep an eye out for my next post on interactive conducting systems!

--


Teresa Marrin Nakra

Associate Professor of Music & Interactive Multimedia at TCNJ (music technology, affective computing, expression, gesture)