Seeing Sound Part 2: Building Bridges

Alec Petros
Mar 20, 2018 · 6 min read

--

Although I’ve spent my time in school and out of it studying sound, I’ve always had an unhealthy fascination with lights. I didn’t get much hands-on experience with theater lighting aside from routine light hangs, focuses, and strikes, but lighting design was always a mystical world of color and flash that held a certain appeal for me. I fantasized about designing lights and visuals for my own inevitable world tour, once I was famous and closing out arenas (still waiting on that).

actual footage of my next concert

To date, though, I’ve only managed to play in tiny bars with no more than a couple of gelled Source Fours pointed in seemingly random directions around the stage. Not exactly the vibrant audiovisual barrage I imagined.

But everything changed when I learned how awesome Processing could be.

After my last blog post, I started eyeing the corpse of an old project: an LED matrix made with Neopixels that I’d strapped to the ceiling in my room and used as overly elaborate mood lighting. It was an experiment I took on way before I had any idea what I was doing with code. I stumbled through the Processing side of things by chopping together bits of other people’s code, and had it running for a grand total of a week before I melted the LED strips by running them off a power supply with way too much voltage. Since then, the five-foot LED strips haven’t been good for much more than my cat’s entertainment.

Suddenly, with a newfound handle on Processing and a lesson well learned about how electricity works, it seemed like a great time for a more functional encore of the LED matrix.

Neopixels, by Adafruit

Neopixels are a brand-name version of WS2812B LED strips: extremely bright and colorful little buggers that can each be individually addressed by a controller. Unlike a lot of LED mood-lighting strips, each pixel can be told to display a different color, like a TV screen or computer monitor. To drive them, I’ve picked up a Fadecandy, a tiny microcontroller (a limited mini computer) designed specifically for driving LEDs. It handles the low-level pixel processing and layers on some extra icing, like temporal dithering and interpolation to smooth out fades (LEDs don’t like fading smoothly to black) and color correction to make colors pop. Awesome as the Fadecandy is, though, it has a one-track mind: there’s no room in it to think about receiving images, just talking to the LEDs.

I still haven’t worked out cable management

For the heavy lifting, I’ve plugged the Fadecandy into a Raspberry Pi, a wonderful little credit-card-sized computer. Once I got it up and running, I installed the Fadecandy server (fcserver), prepped the Pi for communication over a network, and told it to launch fcserver whenever it boots. Ideally, this means I’ll never have to hook it up to a monitor and keyboard again, and I can handle tweaks and edits from any other computer on my network.
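For anyone retracing these steps, one simple way to launch fcserver at boot is a crontab entry on the Pi. A sketch of mine; the paths are assumptions from a stock build of the Fadecandy repo and will likely differ on your setup:

```
# crontab -e on the Pi: run fcserver with the layout config at every boot.
# Paths below assume fcserver was built in the pi user's home directory.
@reboot /home/pi/fadecandy/server/fcserver /home/pi/fcserver.json >/dev/null 2>&1
```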

In all its glory

Next came several hours of soldering. Each of the Fadecandy’s eight channels can handle up to 64 LEDs. I chose to connect two strips of 30 to each of four channels, for a total of eight rows of 30 pixels each. Part of the Fadecandy’s magic is that you can tell it, via fcserver’s settings and a Processing library, how your matrix is arranged, so even though each channel is 60 pixels split in two, running up and back down the mounting board, it can correctly address each pixel’s location. Finally, I mounted all eight strips on a 4' plank of plywood, leaving some room on the far side to mount the Fadecandy, the Raspberry Pi, and some power buses to fuel it all.
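On the fcserver side, that arrangement lives in a JSON config. A sketch of what mine looks like, assuming all 240 pixels sit on one OPC channel: each map entry is [OPC channel, first OPC pixel, first device pixel, pixel count], and each of the Fadecandy’s physical channels starts at a multiple of 64 in the device’s address space, which is where the gaps come from.

```json
{
    "listen": [null, 7890],
    "devices": [
        {
            "type": "fadecandy",
            "map": [
                [0,   0,   0, 60],
                [0,  60,  64, 60],
                [0, 120, 128, 60],
                [0, 180, 192, 60]
            ]
        }
    ]
}
```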

The magic block of code that makes the matrix work.

Finally, it was time for the fun stuff. OPC (Open Pixel Control) is a protocol for streaming pixel data over a network, and its Processing library handles orienting the pixels and transmitting (wirelessly!) to the Pi. After working out the block of code to define where each strip should live on the screen, I had a portable initializer for my LED matrix that I could drop into any sketch I wanted. To test, I fired up my visualizer sketch from the last blog; gloriously, impossibly, the board lit up like a Bon Jovi concert. My tiny Brooklyn apartment could barely handle the amount of illumination spilling out of it. I like to imagine it looked something like the stereotypical mad scientist’s laboratory, with random flashes of light spilling out from the windows and underneath the door.
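For the curious, here’s a minimal sketch of that initializer, based on the OPC library bundled with the Fadecandy examples. The hostname and spacing values are placeholders for my setup; the final flag in ledGrid tells OPC the strips zigzag, so odd rows run backwards to match the up-and-back wiring:

```processing
// Minimal matrix initializer, using the OPC library from the
// Fadecandy examples. Hostname and spacing are assumptions.
OPC opc;

void setup() {
  size(600, 160);
  opc = new OPC(this, "raspberrypi.local", 7890);

  // 8 rows of 30 pixels, centered in the sketch window.
  // Args: index, stripLength, numStrips, x, y, ledSpacing,
  //       stripSpacing, angle, zigzag.
  opc.ledGrid(0, 30, 8, width/2, height/2, 20, 20, 0, true);
}

void draw() {
  // Whatever gets drawn here is sampled at each pixel's screen
  // location and streamed to fcserver on the Pi.
  background(0);
  fill(255, 0, 128);
  ellipse(mouseX, mouseY, 120, 120);
}
```

Drop the OPC setup into any sketch and the matrix mirrors whatever that sketch draws, which is what makes it portable.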

As cool as the visualizer looked, though, the finer details and small rings didn’t translate well to the extremely low-resolution screen I’d created. Additionally, it wasn’t very flexible. I needed to take it further, make it practical and modular, somehow. After some furious googling, I came across Syphon, a lovely piece of software that routes video between applications on your Mac, much like Loopback, the audio-routing software I’ve been using for the visualizer. After loading in the Syphon library, I could stream video from Resolume, a professional VJ (video jockey) application I picked up during my misguided attempts to do concert visuals in college.

Syphon turned out to be remarkably easy to work with. The Processing code required is pretty basic; the only stumbling block was matching framerates. Over time, Resolume’s framerate would drift out of sync with Processing’s refresh rate, leading to some extreme flickering that looked pretty ugly on the pixels. Telling both Resolume and Processing to run at 30 frames per second has largely smoothed that out, though, and now I have a lovely, beat-syncable, wireless, gigantic wall of lights that I can bring with me when I finally start my world tour.
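The Processing half of that fix is a single call in setup(). A fragment, with the Syphon client setup elided since those calls come straight from the Syphon library’s bundled examples:

```processing
void setup() {
  size(240, 64, P3D);  // Syphon needs one of the OpenGL renderers
  frameRate(30);       // pin the sketch to 30fps to match Resolume
  // SyphonClient setup goes here, per the Syphon library's examples.
}
```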

What’s next? The more I use Resolume, the more I realize it’s a limiting factor in how crazy I can get with the lights. Its tempo matching is limited at best, and for someone used to music sequencers, which are extremely flexible and well integrated, it just couldn’t get me where I wanted. Luckily, I already have a fantastic piece of software that can send detailed signals digitally: Ableton Live. Since Processing can take MIDI signals, I should be able to tell it to listen for MIDI routed out of Ableton, and then use Ableton’s incredibly fluid sequencing and looping functions to create complex visuals on the fly. The catch is that I’m going to have to make a monster of a Processing sketch to get all the functions I see in my head out onto the matrix.
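A first stab at that listener, using The MidiBus library for Processing. The port name is an assumption (the macOS IAC bus I’d route Ableton through), and the note-to-color mapping is just a placeholder for the monster sketch to come:

```processing
import themidibus.*;

MidiBus midi;
color current = 0;

void setup() {
  size(600, 160);
  // "IAC Driver Bus 1" is an assumption: use whatever virtual port
  // Ableton's MIDI track is routed to on your machine.
  midi = new MidiBus(this, "IAC Driver Bus 1", "IAC Driver Bus 1");
}

// The MidiBus calls this whenever a note-on message arrives.
void noteOn(int channel, int pitch, int velocity) {
  // Placeholder mapping: pitch picks the hue, velocity the brightness,
  // so Ableton's sequencer effectively plays the lights.
  colorMode(HSB, 127);
  current = color(pitch, 127, velocity);
  colorMode(RGB, 255);
}

void draw() {
  background(current);  // flood the matrix with the last note's color
}
```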

With any luck, my next blog post will be the glorious and flashy conclusion of this three-part journey into the wonderful world of Processing.
