Scoring The Exhibition Episode One

Jordan Bloemen
Published in The Exhibition
5 min read · Mar 13, 2015


Getting fast and messy with music, dialogue, and sweet robot sounds.

One of our actors noted after seeing the episode for the first time: "the soundtrack reminded me of that movie, the one with the Daft Punk score." "Tron?" I offered up hopefully. The comparison is not only fair, but invited. Daft Punk tapped into sounds usually reserved exclusively for dance music and cinematic scores, and whether or not the score stuck with you after the film, the initial enthusiasm for the launch trailer, with its blaring horns layered over tumbling arpeggiated synths, illustrated a point: these are two musical languages entrenched enough culturally to have use beyond their novelty. There is a place for sci-fi-inspired synths in a cinematic score, beyond the "ooh look, we're in space" cheese factor of old sci-fi scores.

I cared a great deal about that in scoring the pilot episode because, at the end of the day, synths are a lot cheaper than a symphony and a lot easier to fake in a computer. In this post I'm going to talk about that process: the basic tools I used in composing the score, some challenges, and how sound design stitched the whole thing together.

The sound of the episode was made up of three major parts: dialogue, music, and background. I lapse in and out of recognizing obscure terms and defining them for a layperson, and assuming anyone reading knows what I'm talking about. Sorry in advance for that.

As for music, Logic Pro X was an incredibly powerful, useful tool that I'll be talking about a bunch. The makers have gone to great lengths to make working in MIDI (a kind of digital musical notation that gives a digital instrument directions as to what notes to play and when) incredibly easy. Honestly, the automation structure lets you make the most minute second-to-second adjustments with ease. I stumbled into a super well-worn trick from EDM (something I hadn't played with in years): introducing every track through a slow fade, not with volume, but by raising the frequency cutoff on a low-pass filter. The effect is that tense, low rumble that slowly swells into clarity over time. It's such a satisfying way to introduce an element or sound, and it became a powerful dramatic tool: an idea is dawning on the character as the musical accompaniment dawns on the audience. If the music is an external expression of the character's mood, it's a much more natural way of introducing that to the audience than just dialing up the volume.
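For the curious, the low-pass fade idea can be sketched outside Logic too. Here's a minimal numpy illustration of my own (not anything from the actual session files, and the cutoff range is made up): a one-pole low-pass filter whose cutoff sweeps upward over the fade, so the sound starts as a muffled rumble and swells into full clarity.

```python
import numpy as np

def lowpass_fade(signal, sr, duration):
    """Fade a signal in by sweeping a one-pole low-pass filter's
    cutoff upward, instead of ramping volume. Early samples are a
    low rumble; brightness swells in as the cutoff rises."""
    out = np.zeros_like(signal)
    y = 0.0
    for i, x in enumerate(signal):
        # Sweep the cutoff from 30 Hz to 8 kHz over `duration` seconds.
        cutoff = 30 + (8000 - 30) * min(i / (sr * duration), 1.0)
        alpha = 1 - np.exp(-2 * np.pi * cutoff / sr)  # one-pole coefficient
        y += alpha * (x - y)  # classic one-pole low-pass update
        out[i] = y
    return out

sr = 44100
t = np.arange(sr * 2) / sr
saw = 2 * (110 * t % 1) - 1        # bright sawtooth stand-in for a synth pad
faded = lowpass_fade(saw, sr, duration=2.0)
```

The payoff is that the fade lives in timbre rather than loudness: the element is audibly "there" from the first moment, it just hasn't come into focus yet.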

The score was made of a giant hodgepodge of different synthesizers, artificial orchestral instruments, and drum machines, but if you told me, "make this thing again, but you only get to use two things," it would be the synthesizer Massive (specifically an arpeggiated bass synth) and a great big goddam timpani. I am firmly of the belief that there's nothing you can't make cool with an arpeggiated bass synth (thanks, soundtrack from the movie Drive), and there's nothing you can't make dramatic with a timpani (an orchestral drum).

Back to the low-pass filter: structurally, just about every track in the score followed a similar progression. Introduce the percussive, rhythmic element (typically a rising synth) via low-pass fade, match the sonic peak to the dramatic peak, and either sustain until the turning point or immediately let the crescendo taper off. Here's an example:

You'll note the dissonant triplets, which (despite the fact that The Twilight Zone rolled with four notes) give it a Twilight Zone vibe under the swelling synths. I guess the third instrument would be a xylophone. I love those things. The only track with a significant emotional turn was in a deleted scene. Otherwise it's all rise and fall.

Shimmying on over to the background sounds: creating the diegetic audio of the pilot was a strange exercise in improvisation. The sound of a robot falling down an elevator shaft was me throwing two stools at each other. Astro hitting the wall after being pushed was just a giant drum hit made sprawling and bassy via stereo spreads and weird EQing. Cutting through metal in the intro was something like sixteen different sounds layered over each other.
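The stereo-spread half of that drum-hit trick is simple to approximate: duplicate the mono hit and delay one channel by a few milliseconds (the Haas effect), so the ear hears one event smeared wide across the stereo field. A quick numpy sketch of the general idea, with made-up parameters rather than the actual settings I used:

```python
import numpy as np

def stereo_spread(mono, sr, delay_ms=15.0):
    """Widen a mono hit by delaying one channel slightly (Haas-style):
    one perceived event, spread across the stereo field."""
    d = int(sr * delay_ms / 1000)
    left = np.concatenate([mono, np.zeros(d)])   # dry channel, padded
    right = np.concatenate([np.zeros(d), mono])  # delayed copy
    return np.stack([left, right], axis=1)       # (samples, 2) stereo array

sr = 44100
n = sr // 4
t = np.arange(n) / sr
# A thumpy stand-in for a drum hit: decaying 60 Hz sine.
hit = np.exp(-t / 0.02) * np.sin(2 * np.pi * 60 * t)
wide = stereo_spread(hit, sr)
```

Delays under roughly 30 ms read as width rather than as an echo, which is why the hit feels bigger without sounding doubled.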

My favourite was the sound of the robot Biv. Biv is our little R2-D2-esque character, and the animator did a lovely job giving him a totally unique style of motion that really defined the character. The big goal for sound was to draw the eye back to Biv at moments when he was doing something cute and fun that we wanted the audience to notice.

His sound was created by piping a microphone signal through autotune dialled up beyond Kanye-esque "creative texture" levels, then taking that autotuned signal and piping it through a vocoder. Then it was just a matter of mumbling/chirping emotive sounds that mirror what Biv would be feeling: a squeal as he falls down an elevator shaft, a grunt as he hits the ground, a cough as he stands up. Accompany that with a Massive synth tone with all the amps dialled down, play with the cutoff (you get the static/chirp/squeal), and you've got Biv.
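If the vocoder step sounds mysterious: a channel vocoder imposes the voice's per-band energy envelope onto a synth carrier, which is how mumbles come out robotic. Here's a crude FFT-based numpy sketch of that idea, purely as a toy cross-synthesis illustration, not the actual plugin chain (and with band counts and frame sizes picked arbitrarily):

```python
import numpy as np

def vocode(carrier, modulator, frame=1024, hop=512, bands=16):
    """Toy channel vocoder: frame by frame, rescale each frequency
    band of the carrier so its energy matches the modulator's band
    energy. Where the voice is silent, the synth goes silent too."""
    n = min(len(carrier), len(modulator))
    out = np.zeros(n)
    win = np.hanning(frame)
    # Split the rfft bins into contiguous bands.
    edges = np.linspace(0, frame // 2 + 1, bands + 1, dtype=int)
    for start in range(0, n - frame, hop):
        c = np.fft.rfft(carrier[start:start + frame] * win)
        m = np.fft.rfft(modulator[start:start + frame] * win)
        for b in range(bands):
            lo, hi = edges[b], edges[b + 1]
            m_energy = np.abs(m[lo:hi]).mean()
            c_energy = np.abs(c[lo:hi]).mean() + 1e-9
            c[lo:hi] *= m_energy / c_energy  # impose the voice's envelope
        out[start:start + frame] += np.fft.irfft(c) * win  # overlap-add
    return out
```

Swap in a sawtooth for the carrier and a recorded mumble for the modulator and you get the broad shape of a robot voice; real vocoders add smoother band filters and envelope followers on top of this.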

So we come to dialogue, the point at which Logic nearly failed me. Logic ostensibly has all of the tools necessary to handle dialogue well, and I don't doubt some of my troubles came down to a workflow not suited for working with dialogue, but damn, Logic really didn't want to cooperate with me. We did three roughly one-hour sessions with each of the actors respectively, which meant there was a lot of stuff to keep organized. I keep thinking about it, and I still genuinely don't know how to organize that kind of content in a DAW. They're just not built for it, at least in one file. The other big issue was scrubbing: moving around an audio file with a live preview playing, so you can find the specific moment a sound starts, stops, or changes. There was a point where we had two people, both of whom would identify as being competent at learning and understanding the finicky inner workings of creative software, scouring blogs and sites to find out how to do live scrubbing. Every instruction, all strange, counterintuitive paths to success, failed. It felt like I was using some weird beta of Logic that had moved some setting around, which I'm almost certain I wasn't.

Ultimately, I'd use Logic again. Maybe I'd swap it out for something like Audition, or even Premiere, for dialogue, at least in terms of arranging the big messy recording file. For everything else, it's a powerful, surprisingly light-running tool that made life easier. A big plus: Logic's built-in instruments are surprisingly useful. Not all of them, but a surprising number of stock sounds snuck into the final product. The pianos especially. Yamaha Piano Setting #5, my dearest, I do love you so.

The Exhibition Episode One is a webseries pilot produced for Telus Storyhive. It will be released on March 16.


Creative with Sticks & Stones Communications. Editor of YEG Guide and Profile Magazine. Co-Founder of Mischief Managed Theatre Company.