When I first downloaded iTunes a long time ago, I would spend hours transfixed by the visualizations that came bundled with the software I used primarily to organize and play my MP3s.
It was the first time I saw music I loved represented as an intriguing visual form. I loved it.
I’m not sure what software tools they used to make these visuals, but what I can tell you is that creating your own custom audio-reactive visualization is dead-simple now, using a program that has become one of my favorites to work with: Touchdesigner.
Touchdesigner is an incredible visual coding platform that allows you to create stunning real-time projects and rich user experiences.
I love it because it is:
- Visual and node-based
- GPU Accelerated
- Works with Python3 for more customization
- Runs in real time (typically at 60 frames per second)
I’ve used it for creating generative art, 3d graphics, audio-reactive visualizations, data visualizations, and much more. See some of my work here.
Creating beautiful 3d graphics is as easy as dragging and dropping.
What you need for this project
- The latest build of Touchdesigner
- A computer with a decent graphics card
- Audio — This can be an MP3 downloaded from the web, or you could just use your phone’s audio as an input (i.e. a song from Spotify or Apple Music)
A General Touchdesigner Workflow
- Open up Touchdesigner
- Right-click to add components / nodes
- Modify parameters in the properties box
- Wire them together by clicking the edges of the nodes
- Click on the blue circle in the box you want to visualize or save the results to an image file or movie
It’s really this simple ^
Of course, you can build more complex things using custom Python code or GLSL shaders, but I won’t cover that here.
Types of Operators
Touchdesigner comes with a lot of operators to choose from to begin creating (right-click anywhere on the canvas and choose Add Operator to bring up the menu).
A lot of the initial work is learning what each one of these components does. Here are the different types:
- COMPs — Components — Object components (3D objects), Panel components (2D UI gadgets), and miscellaneous components. Components contain other operators.
- TOPs — Texture Operators — all 2D image operations.
- CHOPs — Channel Operators — motion, audio, animation, control signals.
- SOPs — Surface Operators — 3D points, polygons and other 3D “primitives”.
- DATs — Data Operators — ASCII text as plain text, scripts, XML, or organized in tables of cells.
- MATs — Material Operators — materials and shaders.
In general, you can string operators of the same type together. However, the real magic comes when you convert between them. Touchdesigner treats everything as a signal: an image can be converted to a signal, audio can be converted to a signal, data from an Excel sheet can be converted to a signal, and the points of a geometry can be converted to a signal.
For example, you can have an audio signal turned into values that can drive some type of geometry. In this example, that is what we’re going to do.
Creating the Audio-Reactive Visualizer
To start, I launch Touchdesigner and delete everything on the screen (drag to select everything, then press the Delete key).
I want to take an audio signal from my iPhone, feed it to my computer’s input, analyze it to grab the audio signal, then have that drive a set of particles that fly across the screen.
Touchdesigner allows you to grab audio from a number of places: a pre-recorded track, a microphone input, a Bluetooth device, an audio device input, and so on.
Depending on what source you have, you will drop down a different CHOP.
Since I want to stream in audio from my phone, I can add an Audio Device In CHOP (again right-click anywhere on the canvas and select CHOP > Audio Device In).
I also want the audio to play on my speaker system on my computer so I will also choose an Audio Device Out CHOP and string the two together.
After you’ve dropped these down, selected the appropriate inputs/outputs in the properties dialog, and plugged your phone into your computer’s audio input, you can start to hear the music play.
Analyzing the Audio
The next thing to do is to analyze the audio. The most common analysis to perform is the Fast Fourier Transform (FFT). It converts a signal into individual spectral components and thereby provides frequency information about the signal.
Frequency is the property of sound that most determines pitch; it is measured in hertz (Hz).
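Before reaching for the Audio Spectrum CHOP, it can help to see what the FFT actually gives you. Here is a plain-Python sketch using NumPy (an illustration of the analysis, not Touchdesigner code):

```python
import numpy as np

# Synthesize one second of a 440 Hz tone sampled at 44.1 kHz
sample_rate = 44100
t = np.arange(sample_rate) / sample_rate
signal = np.sin(2 * np.pi * 440 * t)

# The FFT converts the time-domain signal into spectral components
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), d=1 / sample_rate)

# The strongest bin sits at the tone's frequency
peak_hz = freqs[np.argmax(spectrum)]
print(peak_hz)  # 440.0
```

The magnitude array `spectrum` is exactly the kind of graph the Audio Spectrum CHOP displays: low frequencies in the early bins, high frequencies in the later ones.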
Luckily for us, Touchdesigner comes with a CHOP that does this for us. It’s called the Audio Spectrum CHOP. I can drop that in and connect it.
What this represents is the frequencies present in the audio stream. The left side of the graph represents the low-frequency tones (like the bass); the right side represents the high-frequency tones (like vocals).
There are a number of ways you can analyze the audio, but for this tutorial I want to grab the low frequency sounds (the bass of the audio) and have that drive a geometry.
Grabbing The Lows
To grab the lows of the frequency spectrum, you can trim away everything except the part that represents them. This can be accomplished with the Trim CHOP.
- Right-click and choose CHOP > Trim CHOP to add it and string it to the Audio Spectrum CHOP. Modify the Trim > End parameter to choose the left 1/3 of the graph (0.333).
Visually, you will see that only the left third of the audio spectrum remains.
The next thing to do is to get the average level across these low frequencies. This can be accomplished by adding an Analyze CHOP and choosing Analyze > Average in the parameters.
This single value then can be used to drive a geometry. Things will move in sync with this oscillating value.
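The Trim-then-Analyze chain boils down to two array operations. A minimal NumPy sketch of the idea (the toy spectrum here is made up for illustration):

```python
import numpy as np

# A toy magnitude spectrum: 300 frequency bins, lows on the left
spectrum = np.linspace(1.0, 0.0, 300)

# Trim CHOP with End = 0.333: keep only the left third (the lows)
lows = spectrum[: int(len(spectrum) * 0.333)]

# Analyze CHOP set to Average: collapse the lows into a single value
bass_level = float(np.mean(lows))
print(round(bass_level, 3))
```

That single `bass_level` number, recomputed every frame, is what will drive the geometry.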
You’ll notice that the signal is quite jumpy. This will make the visualization jumpy as well, which can be very distracting. The way to smooth it is to use a filter, in this case a Gaussian filter.
- Add a Filter CHOP and choose Gaussian from the drop-down. Modify the filter width parameter to adjust the smoothing of this.
- Add a Math CHOP to the end of this chain, so that the amount of movement the geometry takes on can be adjusted later.
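Gaussian smoothing is just a bell-shaped weighted moving average over recent values. A rough plain-Python sketch of what the Filter CHOP does (the function and parameter names here are mine, not Touchdesigner's):

```python
import numpy as np

def gaussian_smooth(values, width=9, sigma=2.0):
    """A bell-shaped weighted moving average, roughly what a Gaussian filter does."""
    x = np.arange(width) - width // 2
    kernel = np.exp(-(x ** 2) / (2 * sigma ** 2))
    kernel /= kernel.sum()
    return np.convolve(values, kernel, mode="same")

# A jumpy bass signal: a slow wave plus random jitter
rng = np.random.default_rng(0)
jumpy = np.sin(np.linspace(0, 4 * np.pi, 200)) + rng.normal(0, 0.4, 200)
smooth = gaussian_smooth(jumpy)

# Smoothing reduces frame-to-frame jitter
print(np.abs(np.diff(jumpy)).mean() > np.abs(np.diff(smooth)).mean())  # True
```

A wider filter gives a calmer visualization at the cost of slower response to the beat, which is exactly the trade-off the filter width parameter controls.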
Audio Signal to Geometry Motion
Now for the fun part — driving motion of a geometry using this bass signal we’ve created. For this example, I am going to use particles.
Everyone loves particles.
The way to create some particles is to add a source geometry (like a sphere) and attach it to a Particle SOP. By doing this, all of the vertices of the geometry are turned into spawning particles, each with a certain lifespan. To set this up, I do the following:
- Add a Sphere SOP
- Connect it to a Sort SOP, and choose ‘Random’. This ensures that the particles spawn randomly along the source sphere
- Connect this to a Particle SOP
The Particle SOP has a number of parameters to play with. For this example, I modified the ‘Birth’ parameter to 200 (meaning 200 particles will spawn per second) and added a turbulence of 3 in the X, Y, and Z directions.
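Conceptually, the Particle SOP runs a simple loop every frame: spawn, perturb, age, cull. A toy NumPy version of that loop (an illustration of the idea, not Touchdesigner code; the exact simulation details are my simplification):

```python
import numpy as np

rng = np.random.default_rng(1)
fps, birth_rate = 60, 200
life_frames = 2 * fps            # two-second lifespan, measured in frames

positions = np.empty((0, 3))
ages = np.zeros(0, dtype=int)

for frame in range(120):         # simulate two seconds at 60 fps
    # Birth parameter: this frame's share of 200 particles/second (integer division: 3)
    n_new = birth_rate // fps
    positions = np.vstack([positions, np.zeros((n_new, 3))])
    ages = np.concatenate([ages, np.zeros(n_new, dtype=int)])

    # Turbulence of 3 in X, Y, and Z, scaled down to a single frame
    positions = positions + rng.normal(0.0, 3.0 / fps, positions.shape)
    ages = ages + 1

    # Cull particles that have outlived their lifespan
    alive = ages < life_frames
    positions, ages = positions[alive], ages[alive]

print(len(positions))  # 357
```

The steady-state particle count is roughly birth rate times lifespan, which is why raising either parameter makes the scene denser.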
Instancing the Particles
What if instead of these flying particles, we wanted them to have a different size or shape? Or a different color? Or rotation?
To do this, we can use a very powerful feature of Touchdesigner called instancing. This lets us create copies of a source geometry in a very specific way, and fast, with little to no lag.
The way we do this is by extracting the particles’ X, Y, and Z positions and feeding these to a special object called a Geometry COMP.
Right-click the edge of the Particle SOP, and choose CHOP > SOP to CHOP.
This converts the particle positions to a channel. You will see these positions represented as 3 different channels called tx, ty, and tz.
- Next, we add a Geometry COMP. Navigate into the COMP (using the scroll wheel to zoom in), drop down a Circle SOP, and turn on the two bottom-right flags on the Circle SOP (its render and display flags).
On the Geometry COMP, you’ll need to turn on instancing, drag the CHOP containing the x, y, z positions into the field that says ‘Instance CHOP/DAT/SOP’ (or just type in its name), then set the tx, ty, and tz channels on the Translate options.
You’ll see the particles rendered as large circles. You could use any type of geometry.
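In array terms, instancing is one source shape broadcast against many per-instance transforms. A minimal NumPy sketch of the translate step (plain Python, not Touchdesigner code; the instance positions are made-up stand-ins for the tx, ty, tz channels):

```python
import numpy as np

# The source geometry: a unit circle approximated by 16 points
circle = np.array([[np.cos(a), np.sin(a), 0.0]
                   for a in np.linspace(0, 2 * np.pi, 16, endpoint=False)])

# Pretend these came from the SOP to CHOP's tx, ty, tz channels
instance_positions = np.array([[0.0, 0.0, 0.0],
                               [2.0, 1.0, 0.0],
                               [-1.5, 0.5, 3.0]])

# One copy of the circle per instance, translated into place via broadcasting
instances = circle[None, :, :] + instance_positions[:, None, :]
print(instances.shape)  # (3, 16, 3)
```

Because the source shape is stored once and only the small per-instance transform data changes each frame, the GPU can render thousands of copies cheaply, which is why instancing stays fast.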
Making the Particles Dance
To make it a bit more aesthetically pleasing, we can make the particles randomly vary in size in accordance with the bass signal we made from the audio input. This can be accomplished by adding a bit of noise.
- Add a Noise CHOP
- Reference the bass signal in the Noise CHOP’s Amplitude parameter. This varies the noise signal’s amplitude to match the audio signal. You do this by clicking on the plus sign at the bottom of the Math CHOP and dragging it to the parameter you want it to drive.
The next thing we’ll do is merge the x,y,z positions into a new CHOP to feed to the geometry.
- Add a Merge CHOP to combine the noise channel with the x, y, z position channels
- On the Geometry COMP, modify the Instance CHOP/DAT/SOP field to point to this Merge CHOP, and set the Scale X, Y, and Z parameters to refer to this new noise channel.
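The effect of referencing the bass level in the noise amplitude can be sketched in a few lines of NumPy (an illustration of the idea, not Touchdesigner code; the numbers here are made up):

```python
import numpy as np

rng = np.random.default_rng(2)
n_particles = 100

# The smoothed value coming out of the Analyze/Filter chain (assumed here)
bass_level = 0.8

# The Noise CHOP: one random value per particle, in [-1, 1)
noise = rng.uniform(-1, 1, n_particles)

# Amplitude referencing the bass: louder bass -> bigger per-particle size swings
scale = 1.0 + bass_level * noise

print(scale.min() >= 1.0 - bass_level and scale.max() <= 1.0 + bass_level)  # True
```

When the bass is quiet, `bass_level` shrinks toward zero and every particle sits near its base size; on a loud hit, the sizes spread out, which is what makes the particles appear to pulse with the music.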
Now the particles dance to the music! How cool.
Adding Camera / Lighting and Rendering
What we’ve created so far is a 3d representation of the particle system, but what we want to do is render it in 2D so that we can play it on a screen or save it to a video.
As with most 3d rendering software, you will add lighting and a camera to view the scene.
- Add a Light COMP
- Add a Camera COMP
- Add a Render TOP
After adding all these, you’ll see that Touchdesigner strings them all together automagically and renders the final scene in the Render TOP. This is the final product!
A last note is that we can use the Math CHOP we added earlier to manipulate the relative size of the dancing circles.
You do this by clicking on the Math CHOP and choosing Mult-Add and modifying the slider to suit your needs.
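Under the hood, the Mult-Add operation is just a linear remap of each sample. A tiny sketch (the helper name and the numbers are mine, purely illustrative):

```python
# The Math CHOP's Mult-Add stage computes: new_value = value * multiply + add
def mult_add(value, multiply=1.0, add=0.5):
    return value * multiply + add

bass = 0.25                                   # a raw averaged bass level
print(mult_add(bass, multiply=4.0, add=0.5))  # 1.5
```

Raising the multiply slider exaggerates how far the circle sizes swing with the bass; the add term sets the baseline size when the signal is near zero.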
We are free to save the final render to a file, but we can spruce it up a bit. I want to add some more color to it and give it a background.
- Add a Ramp TOP — Modify the colors
- Connect this to a Lookup TOP and click the blue button to visualize it in the background.
After this, you’ll see the dancing particles given some color — Looks great!
The final result can be visualized live, projected onto a screen, captured as screenshots, or saved to a movie.
This of course just scratches the surface of what’s possible. Some other ideas to get your brain working:
- How about using the middle and high parts of the audio spectrum to drive a different trait of the particles or scene?
- What about using a different signal to drive the particles? Like data or motion captured by a camera?
- What about using other shapes and geometries?
- What about other post-processing effects?
Hopefully this inspired you to create something in Touchdesigner!
Project File: https://github.com/cpreid2/Custom-AudioViz-Medium
Additional Learning Resources
My Touchdesigner Github: https://github.com/cpreid2/Touchdesigner-Art
Generative Art: https://www.instagram.com/colinreid.me