Building a Live Custom Audio-Reactive Visualization in TouchDesigner

When I first downloaded iTunes a long time ago, I would spend hours transfixed by the visualizations bundled with the software I primarily used to organize and play my MP3s.


It was the first time I saw music I loved represented as an intriguing visual form. I loved it.

I’m not sure what software tools they used to make these visuals, but what I can tell you is that creating your own custom audio-reactive visualization is dead-simple now, using a program that has become one of my favorites to work with: Touchdesigner.

Touchdesigner is an incredible visual coding platform that allows you to create stunning real-time projects and rich user experiences.


I love it because it is:

  • Visual and node-based
  • GPU-accelerated
  • Scriptable with Python 3 for more customization
  • Runs in real time (typically at 60 frames per second)

There are many inspiring projects (1, 2, 3) I’ve seen built with this software that have blown me away.

I’ve used it for creating generative art, 3d graphics, audio-reactive visualizations, data visualizations, and much more. See some of my work here.

Creating beautiful 3D graphics is as easy as dragging and dropping.

Interestingly enough, this type of node-based UI is starting to show up in tools for the web as well; see http://nodes.io and https://cables.gl/home

What you need for this project

  • The latest build of Touchdesigner
  • A computer with a decent graphics card
  • Audio: this can be an MP3 downloaded from the web, or you can use your phone’s audio as an input (e.g. a song from Spotify or Apple Music)

A General Touchdesigner Workflow

  1. Open up Touchdesigner
  2. Right-click to add components / nodes
  3. Modify parameters in the properties box
  4. Wire them together by clicking the edges of the nodes
  5. Click on the blue circle in the box you want to visualize or save the results to an image file or movie

It’s really this simple ^

Of course, you can build more complex things using custom Python code or GLSL shaders, but I won’t cover that in depth here.
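
That said, to give a flavor of the scripting side, the same five-step workflow can also be driven from TouchDesigner’s built-in Python (for example in the Textport). Treat this as a rough sketch: the noiseCHOP and mathCHOP classes exist in TouchDesigner’s td module, but the specific parameter names here are assumptions to double-check against the wiki.

    # Rough sketch of the five-step workflow, scripted instead of clicked.
    # Parameter names (amp, gain) are assumptions to verify in your build.
    network = op('/project1')                     # the default network

    noise = network.create(noiseCHOP, 'noise1')   # 2. add nodes
    math = network.create(mathCHOP, 'math1')

    noise.par.amp = 2.0                           # 3. modify parameters
    math.par.gain = 0.5

    noise.outputConnectors[0].connect(math)       # 4. wire them together

    math.viewer = True                            # 5. view the result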

Types of Operators

Touchdesigner comes with a lot of operators to choose from to begin creating (right-click anywhere on the canvas and choose Add Operator to bring up the menu).

Operator Menu — Lots to choose from :O

A lot of the initial work is learning what each one of these components does. Here are the different types:

  • CHOPs (Channel Operators): motion, audio, and other channel data
  • TOPs (Texture Operators): images and video
  • SOPs (Surface Operators): 3D geometry
  • DATs (Data Operators): text, tables, and scripts
  • MATs (Materials): shaders applied to geometry
  • COMPs (Components): containers, UI panels, and 3D objects like cameras and lights

In general, you string operators of the same family together. The real magic, however, comes when you convert between families. TouchDesigner treats everything as a signal: an image can be converted to a signal, audio can be converted to a signal, data from an Excel sheet can be converted to a signal, and the points of a geometry can be converted to a signal.

For example, an audio signal can be turned into values that drive some type of geometry, and that is exactly what we’re going to do in this example.
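
As a concrete taste of what that conversion looks like, any CHOP channel can be referenced by name from any other operator’s parameter with a small Python expression. The names 'analyze1' and 'chan1' below are placeholders for the operators we’ll build later:

    # Typed into any parameter field (switch the field to expression mode).
    # 'analyze1' and 'chan1' are placeholder names for the CHOP built later.
    op('analyze1')['chan1']             # the current value of that channel
    op('analyze1')['chan1'] * 2 + 0.5   # scaled and offset on the fly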

Creating the Audio-Reactive Visualizer

To start, I launch TouchDesigner and delete everything on the screen (drag a selection box around everything, then press the Delete key).

I want to take an audio signal from my iPhone, feed it into my computer’s audio input, analyze it to extract the part I care about, and then have that drive a set of particles that fly across the screen.

Audio Input

TouchDesigner allows you to grab audio from a number of places: a pre-recorded track, a microphone, a Bluetooth device, a line input, and so on.

Depending on what source you have, you will drop down a different CHOP.


Since I want to stream in audio from my phone, I can add an Audio Device In CHOP (again right-click anywhere on the canvas and select CHOP > Audio Device In).

I also want the audio to play through my computer’s speakers, so I will also add an Audio Device Out CHOP and string the two together.

After you’ve dropped these down, selected the appropriate inputs/outputs from the properties dialog, and plugged your phone into the microphone input, you can start to hear the music play.
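
If you’re curious what those two CHOPs are doing under the hood, here is a conceptual stand-in written outside TouchDesigner using the third-party sounddevice library (assuming you have it installed): read buffers from an input device and copy them straight to an output device.

    # Conceptual stand-in for Audio Device In -> Audio Device Out
    # (plain Python with the third-party `sounddevice` library).
    import sounddevice as sd

    print(sd.query_devices())           # find the input your phone is plugged into

    def passthrough(indata, outdata, frames, time, status):
        outdata[:] = indata             # copy each input buffer straight to the output

    with sd.Stream(channels=2, callback=passthrough):
        sd.sleep(10_000)                # listen for ten seconds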

Analyzing the Audio

The next thing to do is to analyze the audio. The most common analysis to perform is the Fast Fourier Transform (FFT). It converts a signal into individual spectral components and thereby provides frequency information about the signal.


Frequency is the property of sound that most determines pitch, and it is measured in hertz (Hz).
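
To make the idea concrete, here is a rough NumPy sketch (not TouchDesigner code) of what the Audio Spectrum CHOP computes from one buffer of samples:

    # Rough sketch of what the Audio Spectrum CHOP computes (plain NumPy).
    import numpy as np

    sample_rate = 44100
    samples = np.random.randn(2048)               # stand-in for one audio buffer

    spectrum = np.abs(np.fft.rfft(samples))       # magnitude of each frequency bin
    freqs = np.fft.rfftfreq(samples.size, d=1.0 / sample_rate)   # bin index -> Hz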

Luckily, TouchDesigner comes with a CHOP that does exactly this: the Audio Spectrum CHOP. I can drop that in and connect it.


This graph represents the frequencies present in the audio stream. The left side of the graph represents the low-frequency tones (like the bass), while the right side represents the high-frequency tones (like vocals).

There are a number of ways you can analyze the audio, but for this tutorial I want to grab the low-frequency sounds (the bass of the audio) and have them drive a geometry.

Grabbing The Lows

To grab the lows of the frequency spectrum, you can trim it down to just the part that represents them. This can be accomplished with the Trim CHOP.

  1. Right-click and choose CHOP > Trim to add a Trim CHOP, and string it to the Audio Spectrum CHOP. Modify the Trim > End parameter to 0.333 to keep the left third of the graph.

Visually, you will see that it keeps only the left third of the audio spectrum.
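
In plain NumPy terms, continuing from the `spectrum` array in the sketch above, Trim with End set to 0.333 is roughly this slice:

    # Keep only the first third of the frequency bins -- the lows.
    lows = spectrum[: spectrum.size // 3]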

The next thing to do is get the average of these frequencies. This can be accomplished by adding an Analyze CHOP and choosing Average in its parameters.


This single value can then be used to drive a geometry. Things will move in sync with this oscillating value.

You’ll notice that the signal is quite jumpy. This will make the visualization jumpy as well, which could be very distracting. The way to smooth this is to use a filter, specifically a Gaussian filter.

  1. Add a Filter CHOP and choose Gaussian from the drop-down. Modify the Filter Width parameter to adjust the amount of smoothing.
  2. At the end of this chain, add a Math CHOP so that you can adjust (later) how much movement the geometry takes on.
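
Here is a conceptual version of that Analyze, Filter, and Math chain in plain Python (not TouchDesigner code): average the low bins, smooth the result over time with Gaussian weights, then rescale it.

    # Conceptual version of the Analyze -> Filter -> Math chain (plain Python).
    import numpy as np

    def bass_level(lows, history, width=8, mult=2.0, add=0.0):
        """lows: current low-frequency bins; history: list of recent raw averages."""
        raw = float(np.mean(lows))                                   # Analyze CHOP (Average)
        history.append(raw)
        recent = np.array(history[-width:])
        weights = np.exp(-np.linspace(-2, 2, recent.size) ** 2)      # Gaussian window
        smooth = float(np.sum(recent * weights) / np.sum(weights))   # Filter CHOP
        return smooth * mult + add                                   # Math CHOP (Mult-Add)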


Audio Signal to Geometry Motion

Now for the fun part — driving motion of a geometry using this bass signal we’ve created. For this example, I am going to use particles.

Everyone loves particles.

The way to create particles is to add a source geometry (like a sphere) and attach it to a Particle SOP. This turns all of the vertices of the geometry into spawning particles, each with a certain lifespan. To do this, I do the following:

  1. Add a Sphere SOP
  2. Connect it to a Sort SOP, and choose ‘Random’. This ensures that the particles spawn randomly along the source sphere
  3. Connect this to a Particle SOP

The Particle SOP has a number of parameters to play with. For this example, I modified the Birth parameter to 200 (meaning 200 particles will spawn per second) and added a turbulence of 3 in the X, Y, and Z directions.
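
For intuition, here is roughly what the Particle SOP is doing every frame, written as plain Python. The numbers mirror the Birth and Turbulence settings above; the two-second lifespan is just an assumed value.

    # Roughly what the Particle SOP does each frame (plain Python, for intuition).
    import random

    FPS, BIRTH, TURB, LIFE = 60, 200, 3.0, 2.0    # lifespan is an assumed value
    particles = []                                # each particle: [x, y, z, life_left]

    def step(source_points):
        for _ in range(BIRTH // FPS):             # ~200 births per second
            x, y, z = random.choice(source_points)
            particles.append([x, y, z, LIFE])
        for p in particles:
            for axis in range(3):                 # turbulence on X, Y, and Z
                p[axis] += random.uniform(-TURB, TURB) / FPS
            p[3] -= 1.0 / FPS
        particles[:] = [p for p in particles if p[3] > 0]   # retire dead particles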


Instancing the Particles

What if instead of these flying particles, we wanted them to have a different size or shape? Or a different color? Or rotation?

To do this, we can use a very powerful feature of TouchDesigner called instancing. This lets us create copies of a source geometry in a very specific way, and fast, meaning little to no lag.

The way we do this is by extracting each particle’s X, Y, and Z position and feeding these positions to a special object called a Geometry COMP.


Right-click the edge of the Particle SOP, and choose CHOP > SOP to CHOP.

This converts the particle positions into channels. You will see the positions represented as three different channels called tx, ty, and tz.

  • Next, we add a Geometry COMP. Navigate inside the COMP (zoom in with the scroll wheel), drop down a Circle SOP, and click the two bottom-right circles (the display and render flags) on the Circle SOP.

On the Geometry COMP, you’ll need to turn on instancing, drag the CHOP holding the x, y, z positions onto the field labeled ‘Instance CHOP/DAT/SOP’ (or just type in its name), and then assign the tx, ty, and tz channels to the Translate X, Y, and Z options.


You’ll see the particles rendered as large circles. You could use any type of geometry.
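
Conceptually, instancing just walks down the channels sample by sample: sample i of tx, ty, and tz becomes the transform of copy i of the circle. Something like this, in plain Python:

    # Sample i of each channel becomes the transform of instance i.
    def place_instances(tx, ty, tz):
        return [(tx[i], ty[i], tz[i]) for i in range(len(tx))]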

Making the Particles Dance

To make it a bit more aesthetically pleasing, we can make the particles vary in size, both randomly and in accordance with the bass signal we made from the audio input. This can be accomplished with a bit of noise.

  1. Add a Noise CHOP
  2. Reference the bass signal in the Noise CHOP’s Amplitude parameter. This varies the noise signal’s amplitude to match the audio signal. You do this by clicking the plus sign at the bottom of the Math CHOP, then dragging the channel onto the parameter you want it to drive.

The next thing we’ll do is merge this noise channel with the x, y, z positions into a new CHOP to feed to the geometry.

  1. Add a Merge CHOP to combine the noise channel with the x, y, z position channels
  2. On the Geometry COMP, modify the Instance CHOP/DAT/SOP field to point to this Merge CHOP and set the Scale X, Y, Z parameters to refer to the new noise channel
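
Put together, the Noise CHOP (with its amplitude driven by the bass value) plus the Merge CHOP effectively hand every instance its own scale that swells with the music. In plain Python terms, something like:

    # Each instance gets a noisy scale that grows when the bass is louder.
    # The 0.1 base size is just an illustrative value.
    import random

    def instance_scales(count, bass_value):
        return [0.1 + abs(random.gauss(0.0, 1.0)) * bass_value for _ in range(count)]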

Now the particles dance to the music! How cool.

Adding Camera / Lighting and Rendering

What we’ve created so far is a 3D representation of the particle system; what we want to do now is render it to 2D so that we can play it on a screen or save it to a video.

As with most 3d rendering software, you will add lighting and a camera to view the scene.

  1. Add a Light COMP
  2. Add a Camera COMP
  3. Add a Render TOP

After adding all these, you’ll see that Touchdesigner strings them all together automagically and renders the final scene in the Render TOP. This is the final product!


A last note is that we can use the Math CHOP we added earlier to manipulate the relative size of the dancing circles.

You do this by clicking on the Math CHOP, going to its Mult-Add page, and adjusting the slider to suit your needs.

Final Touches

We could save the final render to a file as-is, but first let’s spruce it up a bit. I want to add some more color and give it a background.

  1. Add a Ramp TOP and modify the colors
  2. Connect this to a Lookup TOP and click the blue circle to display it in the background
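
In effect, the Lookup TOP uses each pixel’s brightness in the grayscale render to pick a color from the ramp. A rough NumPy picture of that idea (not TouchDesigner code):

    # Use each pixel's brightness (0..1) to index into a 256-color ramp.
    import numpy as np

    ramp = np.linspace([0.1, 0.0, 0.3], [1.0, 0.6, 0.2], 256)   # 256 RGB stops

    def colorize(gray_image):
        indices = np.clip((gray_image * 255).astype(int), 0, 255)
        return ramp[indices]                                    # H x W x 3 image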

After this, you’ll see the dancing particles given some color. Looks great!


The final result can be displayed live, projected onto a screen, captured as screenshots, or saved out to a movie.

Final Notes

This of course just scratches the surface of what’s possible. Some other ideas to get your brain working:

  • How about using the middle and high parts of the audio spectrum to drive a different trait of the particles or scene?
  • What about using a different signal to drive the particles? Like data or motion captured by a camera?
  • What about using other shapes and geometries?
  • What about other post-processing effects?

Hopefully this inspired you to create something in Touchdesigner!

Project File: https://github.com/cpreid2/Custom-AudioViz-Medium

Additional Learning Resources

  1. Tutorials by Matthew Ragan
  2. Free TouchDesigner book by Elburz Sorkhabi
  3. Touchdesigner Wiki

Please feel free to follow me on Instagram or follow me on Twitter if you’d like to stay in touch!

My Touchdesigner Github: https://github.com/cpreid2/Touchdesigner-Art
Generative Art: https://www.instagram.com/colinreid.me
Twitter: https://twitter.com/c0c0_re1d
Observable: https://observablehq.com/@cpreid2
