Designing Soundtracks for Text

What if sound synced with your reading material?

[update: check out part two here]

If I were to embed a song file into a body of text, the media would have a play button and it would be self-contained. (illustration #1) The experiences for media versus text would be roughly parallel (illustration #2), but they’d actually be two separate things. The audio would start, reach the end, and then stop. The song doesn’t know or care whether I’ve read five words or five hundred, so there are no interactivity options. Parents sometimes call this “parallel play”: two children playing near each other but not actually communicating in any way.

What if there were a way to make the content more synchronized? Software for composing music typically looks like a spreadsheet where each instrument is a row and the columns represent time. (illustration #4) Couldn’t each instrument be broken into its own loop and told to start when the reader reaches a certain scroll location?

For example, say you read the first page of a story without any music. But at a certain point, a quiet string section could subtly fade in. Maybe a few paragraphs later, a crowd noise loop would begin because the main character has gone to a bustling city market. And so on.
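If you’re curious what the mechanics of that might look like, here’s a rough sketch in JavaScript. It just watches the scroll position and fades in a looping audio element once the reader passes a cue point. The element IDs, pixel thresholds, and fade times are all made up for illustration; a real version would probably measure scroll position against the paragraphs themselves rather than raw pixel offsets.

```js
// A minimal sketch: start loops when the reader scrolls past certain points.
// The selectors, thresholds, and fade times below are hypothetical.
const cues = [
  { selector: '#strings-loop', startAt: 1200, fadeSeconds: 4, started: false },
  { selector: '#market-crowd-loop', startAt: 3600, fadeSeconds: 2, started: false },
];

window.addEventListener('scroll', () => {
  const y = window.scrollY;
  for (const cue of cues) {
    if (!cue.started && y >= cue.startAt) {
      cue.started = true;
      fadeIn(document.querySelector(cue.selector), cue.fadeSeconds);
    }
  }
});

// Fade a looping <audio loop> element from silence up to full volume.
function fadeIn(audio, seconds) {
  audio.volume = 0;
  audio.play();
  const steps = 30;
  let step = 0;
  const timer = setInterval(() => {
    step += 1;
    audio.volume = Math.min(1, step / steps);
    if (step >= steps) clearInterval(timer);
  }, (seconds * 1000) / steps);
}
```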

Help Wanted

I like this idea a lot, and I’ve sketched out how it could work. I’ve also composed a bunch of songs I’d like to experiment with. Even better, it turns out this is technically feasible. Web browsers can keep track of your scroll location, and something called the Web Audio API lets you play loops. You can even keep everything synchronized so the loops land on a shared beat instead of piling up into a cacophony of overlapping sounds. Which is awesome.
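For the curious, this is roughly the kind of Web Audio API code I’ve been fumbling with; treat it as a sketch rather than something that works. Instead of starting a loop the instant a cue is reached, it schedules the loop to begin on the next bar boundary so every layer shares the same beat. The file URL, tempo, and bar length are assumptions I’ve invented for the example.

```js
const context = new AudioContext(); // note: browsers usually require a user gesture before audio can start
const BAR_SECONDS = (60 / 100) * 4; // hypothetical: 100 bpm, four beats per bar
let transportStart = null;          // context time when the first loop began

// Fetch and decode an audio file into a buffer the Web Audio API can loop.
async function loadLoop(url) {
  const response = await fetch(url);
  const arrayBuffer = await response.arrayBuffer();
  return context.decodeAudioData(arrayBuffer);
}

// Start a loop on the next bar boundary, fading it in over `fadeSeconds`.
function startLoopOnNextBar(buffer, fadeSeconds = 2) {
  const source = context.createBufferSource();
  source.buffer = buffer;
  source.loop = true;

  const gain = context.createGain();
  source.connect(gain).connect(context.destination);

  const now = context.currentTime;
  if (transportStart === null) transportStart = now;

  // Quantize the start time to the next bar so the layers stay on one beat.
  const elapsed = now - transportStart;
  const nextBar = transportStart + Math.ceil(elapsed / BAR_SECONDS) * BAR_SECONDS;

  gain.gain.setValueAtTime(0, nextBar);
  gain.gain.linearRampToValueAtTime(1, nextBar + fadeSeconds);
  source.start(nextBar);
}

// Hypothetical usage: a scroll cue would call this instead of playing an <audio> element.
// loadLoop('loops/strings.mp3').then((buffer) => startLoopOnNextBar(buffer, 4));
```

In theory, a scroll handler like the one in the earlier sketch would call startLoopOnNextBar() when a cue is reached, and that quantizing step is what keeps the layers from turning into the cacophony I mentioned.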

But I can’t get the code to work. (Though I did get a bunch of sounds fighting with each other. It wasn’t what I was going for, but it did make me chuckle.) And I think this moment is where a lot of designers struggle, if they even get this far.

Traditional graphic artists could take their work from concept to delivery because they had all the skills and tools they needed. And engineers from 1986 to 2016 have been in the same boat — if they could imagine it, they had the tools to try and build it. Alone.

Software designers, on the other hand, will always need to beg for resources to realize their ideas unless they learn to write code. That might be ok in a big corporation, where engineering teams and design teams are assigned to work together. But with side projects like these, a designer is like a screenwriter who wants to make a movie: I can only get to a certain point before I need other people to help me make it a reality.

So: are you good with JavaScript and the Web Audio API? I’d love to hear from you. Email me at jon@lot23.com and let’s collaborate!