When we think about music, we primarily think about an auditory experience. However, humans show the ability to translate this audio into other sensory experiences. We feel emotional connections, envision colors and textures, and might even be induced to move ourselves, or vocalize a response. Music is received by the ears, but has the potential to be a full-body, immersive sensation. So why should music be delivered to us only as sound? Can we mirror the human response and deliver music as a rich, multisensory, and interactive art form?
Here on Team BREAD, we were tasked with the following prompt:
How might we better identify individuals for their contribution to works of music?
In our preliminary user research two weeks ago, we quickly learned that most people don’t care about individual contributors. Casual music listeners enjoy the final product but have no deeper interest in the team that constructed the work. So we decided to turn the prompt around and ask,
How might we engage consumers with works of music more fully, such that they develop interest and awareness around contributors?
During a brainstorming session, we came up with the idea of a virtual reality (VR) experience where users literally see the components of a song, and we began to run with this idea, developing sample animations of how we might visually represent a song (see last week’s recap).
We came in this week excited to flesh out our VR experience concept, but two meetings on Monday encouraged us to slow down and refocus. We spoke with r. michael hendrix, who asked us, “Why VR?” We needed to consider the native advantages and disadvantages of this technology, and how they fit with our vision.
We also met with George Howard, who reminded us to establish clear value propositions for our product. How does this VR experience add value to a user’s life? Why should they use our product, or care at all about our vision of a multisensory musical experience? And in a different vein, the ultimate goal is still to identify contributors in songs, so we need artists to join us. But why should they? How does the collaboration add value for them? We needed to more fully flesh out our business strategy for the product.
With these two paths to follow, we split up into two sub-teams, Creation (Elias and Jenna) and Curation (Gabe and David).
Team Creation needed to answer the question “Why VR?”, and more specifically, to understand the native advantages of the VR platform. Our logical next step was to spend more time in a VR headset. We tried out two experiences: a 3D drawing tool called Tilt Brush and a music video called Surge.
In Tilt Brush, the player is able to draw anywhere in three-dimensional space:
This promo video seems a bit glorified on first watch, but after trying it out, we can confidently say it is an accurate representation of the experience. And man, is it fun. You possess a palette for drawing and can paint freely in three-dimensional space. The ability to create is dramatically amplified by the removal of traditional dimensional boundaries.
Additionally, there is a mode where the player can view other people’s creations. When you choose this option, you get to watch the entire drawing process from start to finish, and then admire the final product. This rich, multisensory experience is only possible through VR, and we were excited by the prospect of transferring this to music.
The Surge music video was eye-opening in an entirely different way, and encouraged us to think about the impact of scale and detail in a virtual 360 space.
Unlike Tilt Brush, the promo video hardly does the experience justice, so we highly recommend readers to check out Surge for themselves, because it is quite literally awe-inspiring. The viewer stands in the middle of a field surrounded by mesmerizing visuals of massive size. We realized that VR dramatically amplifies the power of scale, and if utilized correctly, this technology would enable us to create mind-blowingly novel experiences around music.
So let’s revisit Team Creation’s initial question, “Why VR?”
Remember that one of our primary challenges was that the average person simply does not care about the multitude of contributors within a given song. They only experience the artistic work as audio, and this one-dimensional medium does not effectively communicate the complexity of music, or the effort that went into its creation.
While brainstorming, we decided that the fluidity of music was something we wanted to convey in our product. In virtual reality we can be surrounded by music, and let it take us places. We think this is vital to the experience, as it allows us to be fully immersed. Additionally, we do not want users to be hindered by learning how to play (however simple the controls may be). We want the exploration to be naturally physical. With VR there is an opportunity to communicate richness and complexity in ways that were never before possible. We want our users to be able to literally walk inside a song and touch the components, maybe take a piece with them, and build connections between works by carrying them to another song. This experience is most compelling through VR, and therefore we are confident that it is the right technology to pursue at this time.
With this fresh perspective, Team Creation got to work on creating prototypes to capture specific interactions in our envisioned VR experience.
We used Unity, a game development engine, to create a simple environment modeling how a user might interact with music in a VR space. Through a first-person camera, the player can walk around between different bundles of bubbles. Each bundle is a different song, and each bubble represents a different group within the musical arrangement (a stem). When the player walks near a bundle, the song starts to play, but the auditory experience is affected by the player’s proximity to a bubble: as you walk toward one bubble, its stem increases in volume relative to the rest of the arrangement, and as you walk away, the stem’s relative volume decreases. A user can explore the different stems within a song by walking closer to or farther from different bubbles.
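The core of this proximity mixing can be sketched in a few lines. The snippet below is an illustrative Python version of the idea (our actual prototype is a Unity script); the linear falloff and the `max_distance` radius are assumptions for demonstration, not the exact curve we used.

```python
import math

def stem_volume(player_pos, bubble_pos, max_distance=10.0):
    """Attenuate a stem's volume linearly with the player's distance.

    Volume is 1.0 when the player stands at the bubble and fades to
    0.0 at max_distance (an illustrative falloff radius).
    """
    dx = player_pos[0] - bubble_pos[0]
    dz = player_pos[1] - bubble_pos[1]  # distance on the ground plane
    distance = math.sqrt(dx * dx + dz * dz)
    return max(0.0, 1.0 - distance / max_distance)

# Standing on the bubble: full volume; at the falloff radius: silent.
print(stem_volume((0.0, 0.0), (0.0, 0.0)))   # → 1.0
print(stem_volume((10.0, 0.0), (0.0, 0.0)))  # → 0.0
```

Running this per frame for every bubble, and scaling each stem’s playback volume by the result, produces the walk-toward-a-bubble mixing effect described above.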
Here is a video of the experience. We encourage you to use headphones while watching to experience the full effect.
We felt that our vision for the experience would be best communicated through, well, visuals (intuitive, ain’t it?), so we built a series of animation sequences using Cinema 4D, a 3D design software.
Each pillar represents a different song, and the user can explore the environment by walking between pillars.
When you walk inside a song, you can see the music visually represented as bubbles, just like they were in the Stem Room demo.
We also plan to animate the bubbles to dance around with the music, and the video shows what that motion might look like.
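One simple way to drive that dance is to tie each bubble’s size to the loudness of its stem. Here is a rough sketch, in Python for illustration; `base_scale` and `sensitivity` are hypothetical tuning knobs, not values from our build.

```python
import math

def bubble_scale(frame, base_scale=1.0, sensitivity=2.0):
    """Map one frame of audio samples (values in [-1.0, 1.0]) to a
    bubble scale factor: louder frames make the bubble swell."""
    rms = math.sqrt(sum(s * s for s in frame) / len(frame))
    return base_scale + sensitivity * rms
```

A silent frame leaves the bubble at its base size, while a loud passage makes it pulse outward in time with the music.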
Our technical prototypes this week are relatively simple mockups, but are excellent starting points for building a foundation and gathering feedback. Tell us in the comments what you think!
Team Curation set out to better understand how our product could add value for both consumers and clients. We decided to start by interviewing people, asking how they find new music and what about a new song draws them in, or pushes them away. We wanted to understand how we could draw people into a song within the VR environment, and once inside a song, how we could engage a user without overwhelming them. We spoke to four individuals through personal connections.
All four of our interviewees were reasonably experienced musicians, and self-identified as musicians. Ultimately we want our product to connect with a broader audience, and we need to interview a more representative sample to make firm conclusions. However, in IDEO’s design methodology, extreme perspectives are important in shaping a proper middle-ground, so it was an appropriate starting point.
Every interviewee agreed that they could pass judgement on a song within seconds. This screening process was centered primarily on sorting new songs into known style boxes. They mentioned features such as era, genre, prominence of melody, and overall energy. Ultimately, they were all interested in feeling something from listening to music. Each individual’s preference varied; one wanted to feel happy, another wanted to feel relaxed, and one said they liked music, “If it makes me do this — [participant bounces up and down vigorously while humming]”.
These interviews showed just how complex the experience of listening to music can be. Our participants talked about music engaging their mood, emotion, memories, and physicality. We determined that our VR product has the potential to add immense value to our users’ lives by directly engaging with non-auditory senses.
We used the Business Model Canvas to lay out a more in-depth plan for our product.
We felt that by imagining it as a fully-fledged business with distribution, sales, and marketing, we could direct our thinking towards important considerations around value and reach. We researched the VR market and who some of our target demographics might be. We discussed how our product would serve consumers, where we might sell the product, and the potential for collaboration with the music industry.
We arrived at two major decisions from this exercise. First, we agreed that we wanted the product to be accessible not just to music fanatics or hardcore gamers, but also to everyone in between. We want anyone who puts on a VR headset to have a great time playing in our environment, and to want to keep playing today, the next day, and so on.
This shaped our second decision, a firm value proposition. Our value prop is twofold, partly for consumers and partly for artists who might contribute work to the experience.
Our consumer value prop is as follows:
Immerse yourself in music
Music is the toy, and you are the player. See, touch, hold, and play with music in an immersive virtual reality environment.
Step inside a song and watch it come to life around you
Reach out and touch the sound, or grab hold and take a piece with you
Discover connections between artists, sounds, and musical styles
We want to highlight the immersive, interactive features of the product. We believe that if we can engage all of the senses available in VR, we can engage the broadest swath of consumers, because everyone can find a way to enjoy the experience regardless of what their relationship is with music or traditional video games. Additionally, we believe that this multi-sensory approach will make for the most interactive and enjoyable experience possible.
Our value prop for artists is similar, but directed more to the needs of an artist:
Deliver your art in an immersive experience
Can you imagine your music as a full sensory experience? Not just a sound, but also a living, breathing, moving object that can be seen and held. Your art is a complicated, layered product and we want to harness that in our virtual reality game.
Engage new fans through our innovative multidimensional platform
Tell your story through quests that highlight the creative process
The two things we can offer an artist are money and exposure. We will, of course, pay appropriate licensing fees for any music, but this will be a comparatively nominal benefit. Our main pitch is that, by presenting their art in this multidimensional form, we can engage a broader set of fans and build deeper connections to their work. Additionally, the connections will reach all of the individual contributors, not just the billed artist. This will grow and strengthen their fan base, which will have a much more lasting impact on their career than a one-time licensing payment.
Since last week, our product has advanced from a hazy storyboard and simple animations to an interactive environment, and we as a team have developed a much better plan for how to grow and develop our concept. However, the work is hardly complete.
We will be conducting more interviews on all of our technical prototypes to understand how we can continue to enhance the virtual experience in ways that are defined by actual users. We are aiming to interview a less musical sample to fill in the other extreme on the music listening scale.
We will be testing all of our new technical prototypes in an actual VR environment. Putting the headset on was an incredibly valuable experience for our team, so we will continue this practice on our own prototypes. We will also be exploring ways to further engage players in the environment; the Stem Room prototype is a great foundation for adding more interactivity, and we will be using findings from VR tests and user interviews to build out a more fun experience. And finally, we will be conducting research into game design principles to better understand how we can create an experience that is appropriately interactive, highly engaging, and has a high replay value.