How we created a 400-player space game powered by screams
That was our 4-word brief, but it was all we needed. If you haven’t been to an annual meeting before, they are known for lengthy PowerPoints filled with updates from the finance department, super inside jokes between the leadership team, and pats on the back for another year of success. In some circles they may also be referred to as “Death by PowerPoint”. We decided our hack needed to break up the monotony without feeling like forced fun. As an experience design practice, we wanted our part to be interactive as well as a collaborative effort where everyone in attendance could leave the meeting feeling as though they played an integral role. Lastly, we knew this would be our parent company’s first foray into something other than a traditional annual meeting, so we thought it imperative to avoid overwhelming our audience. No one wants to go from zero to one hundred (real quick) when they are expecting to sit back and watch slides all day. Once we had our constraints, it was time to figure out what we could make.
Through the lens of breaking up a long day, we settled on creating “moments” in between annual meeting presenters when the audience would have the opportunity to rest their brains. These moments would happen at different times in the day, possibly hours apart, so they had to have a common thread to connect them all. Luckily, we were operating within an event that had a clear beginning, middle, and end. So we wrote a story that fit right in, not only providing moments of relief but also driving the meeting flow. Because our first moment would occur at the start of the meeting, we wanted to create excitement. Since we couldn’t bring in a live marching band to kick it off (that happened last year), we took some inspiration from our mascot, the TWA Moonliner. Our first moment would be a thrilling rocket launch.
To ensure company-wide collaboration, we decided the launch would take the form of a theatre-sized game powered by the entire crowd. With access to a projector screen and a captive audience, we had a way to play and integrate it seamlessly into the presentations. After starting the show with a rocket launch, we’d follow the rocket on a journey through space, symbolizing the midpoint, and end with landing the rocket to signal that the hard part is over and it’s time for a dance party.
We wanted to avoid making any more work for our participants, so we chose an input that the audience could supply without much effort. Luckily, our creative technologist had recently been playing with sound input as a control for web-based games. We asked him, “Can we apply that to an audience of 400 people?” He said “ ¯\_(ツ)_/¯”. And it was on. With our functionality in place, we wanted to up the ante. Not only would we use sound as an input, we would vary how the sound is delivered and how it controls the game. Yelling and clapping would be our first input to make the rocket take off. Basic noise-making techniques would allow our users to get comfortable with the idea of using sound as a control and prepare them for what’s to come. In the next game we would raise the difficulty by splitting the audience in two, with each half controlling the left and right movements of the rocket through space. Each side would have to be strategic about their screaming in order to avoid obstacles. In the third and final act, we added a whole new layer of specific sounds: hand claps, “boom”, and “woo”. Executing each sound at the right time would be key, bringing our audience back in unison to end the meeting.
In the Making
Once we had our games, we let the meeting’s theme, “In the making”, influence the aesthetics. We were particularly inspired by some prose our Design Director used to describe Barkley’s ethos:
We are a work in progress.
We are a delightful collection
Of nuts and bolts,
quirks and queries,
sketches, plans and notes
scribbled in the margins.
Our future isn’t set
because the future isn’t static.
It shifts, pivots, deconstructs and reassembles
before our eyes.
And so do we.
We are a work in progress. We make stuff, break stuff, then make more stuff. So we appointed our primary making tools as characters in the games. Where the tools live would be our landscapes. Then we’d treat it as one big prototype where no character or backdrop is a static fixture. The end result was three distinctly different worlds with noteworthy transitions between them, starting with a familiar office desktop and ending with a completely handmade planet of felt and paper. The UI was kept to a bare minimum, providing just enough feedback for the audience to understand how to adjust their noise making.
How it works
Translating a laptop game into a theatre-sized experience would be a challenge for anyone, but it was especially difficult on a short timeline with limited resources. Even though our first prototype functioned in essentially the same way as our opening game, we still had to figure out how games 2 and 3 would work.
Game 2 required processing audio from each microphone individually, then using the left and right channel data to control the direction of the rocket. Traditional audio hardware emits sound data across tracks instead of channels, which was problematic because the Web Audio API does not yet support the concept of tracks. To get around this, we purchased two AudioBox USB devices, built an audio listening app for the browser, and passed the sound data along via WebSocket. Each listening app ran in a different browser (e.g. Chrome and Chrome Canary) because WebKit only allows one device to be consumed per browser. Moving data through a throttled socket was significantly slower than handling it directly in the browser, but the transmission was plenty fast for our needs.
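The listening-app idea can be sketched roughly like this: each browser instance grabs one microphone, measures its level with an `AnalyserNode`, and streams a throttled volume reading over a WebSocket. This is our own minimal reconstruction, not the production code — names like `startListener`, the socket URL, and the 50 ms send interval are assumptions.

```javascript
// Pure helper: average magnitude of the analyser's frequency bins (0–255).
function averageVolume(bins) {
  let sum = 0;
  for (let i = 0; i < bins.length; i++) sum += bins[i];
  return bins.length ? sum / bins.length : 0;
}

// Hypothetical "listening app" for one microphone (browser-only).
// One of these runs per browser instance, each consuming a different device.
async function startListener(socketUrl, sendIntervalMs = 50) {
  const socket = new WebSocket(socketUrl);
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const ctx = new AudioContext();
  const analyser = ctx.createAnalyser();
  analyser.fftSize = 256;
  ctx.createMediaStreamSource(stream).connect(analyser);

  const bins = new Uint8Array(analyser.frequencyBinCount);
  setInterval(() => {
    analyser.getByteFrequencyData(bins);
    // Throttled: one reading per interval rather than per animation frame,
    // which keeps the socket traffic manageable.
    socket.send(JSON.stringify({ volume: averageVolume(bins) }));
  }, sendIntervalMs);
}
```

The game itself would then listen on the other end of the socket, mapping one app's volume to "steer left" and the other's to "steer right".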
Game 3 relied on unified crowd participation in the form of yelling “boom”, “woo”, or clapping hands. To detect these sounds, we first had to eliminate extraneous noise. Since humans normally produce vocal tones between 300 Hz and 3,000 Hz, but the microphone data spans a much broader range, we improved the accuracy of tone detection by measuring only the desired frequency bands.
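Band-limiting works because each of an `AnalyserNode`'s frequency bins covers a slice of the spectrum roughly `sampleRate / fftSize` Hz wide, so you can convert a Hz range into a bin range and ignore everything outside it. A sketch of that idea (our own function names, not the original code):

```javascript
// Map a Hz range onto analyser bin indices.
// Bin i covers frequencies around i * sampleRate / fftSize.
function bandRange(sampleRate, fftSize, lowHz, highHz) {
  const hzPerBin = sampleRate / fftSize;
  return [Math.floor(lowHz / hzPerBin), Math.ceil(highHz / hzPerBin)];
}

// Average level across only the bins inside [lowHz, highHz],
// defaulting to the 300–3000 Hz vocal range from the write-up.
function bandLevel(bins, sampleRate, fftSize, lowHz = 300, highHz = 3000) {
  const [lo, hi] = bandRange(sampleRate, fftSize, lowHz, highHz);
  let sum = 0, count = 0;
  for (let i = lo; i <= hi && i < bins.length; i++) {
    sum += bins[i];
    count++;
  }
  return count ? sum / count : 0;
}
```

Room rumble, HVAC hum, and PA hiss mostly live outside that band, so comparing `bandLevel` against a threshold is far more reliable than thresholding the full-spectrum volume.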
Testing is always an integral part of our process, but gathering a crowd of 400 for a dry run would be nearly impossible with our timeline. We did what we could with significantly smaller audiences, but to compensate for any unforeseen issues, we created a live configuration interface which allowed various sound properties to be adjusted in real time. From a remote machine we were able to toggle volume thresholds, microphone sensitivity, beats per minute, and volume/frequency monitors during gameplay.
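The live-tuning idea boils down to keeping every tunable parameter in one object and patching it from messages sent by a remote control page. A minimal sketch, assuming a JSON-over-WebSocket control channel — the parameter names and defaults here are our own illustrations, not the actual values used:

```javascript
// All tunable game parameters in one place (hypothetical names/defaults).
const config = {
  volumeThreshold: 0.4, // level required to register as crowd noise
  micSensitivity: 1.0,  // multiplier applied to raw mic levels
  bpm: 120,             // timing grid for the clap/"boom"/"woo" game
  showMonitors: false,  // draw live volume/frequency meters on screen
};

// Apply a partial update from the remote machine. Only keys that
// already exist are accepted, so a typo on the control page can't
// silently create a dead setting mid-show.
function applyConfigPatch(target, patch) {
  for (const key of Object.keys(patch)) {
    if (key in target) target[key] = patch[key];
  }
  return target;
}

// Wiring it up in the browser would look something like:
// controlSocket.onmessage = (e) => applyConfigPatch(config, JSON.parse(e.data));
```

Because the game reads from `config` every frame, a patch takes effect immediately — which is exactly what you want when the room turns out to be louder or quieter than any rehearsal.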
Do it Live!
When it came time for the show we were on shaky ground. Since we only had time for a couple of trial runs, the only thing we could do was make sure it ran on the theatre equipment and cross our fingers. And it worked! Kind of… We ran into some technical glitches because the old-school hardware we were obligated to use couldn’t keep up with our game. Even with the problems, it still got the crowd going. The most surprising feedback was that people didn’t think it was real. No, really: 400 people were simultaneously controlling what was happening on the screen!
Who knows, maybe next time you play this it will be on a stadium big screen cheering on your favorite team…