Being There, AGAIN.
Have you ever been at a concert and lost yourself to the music? Live music allows concertgoers to access a state of flow almost effortlessly. Our team is currently focusing on concepts around sociability that help define that state of flow in a live experience.
How might we facilitate social behavior around communal live music experiences?
This past week, we prototyped the idea of a dynamic space that automatically curates music for those within it. This application, LÜM, would aggregate music from each audience member’s personal music taste, create a collective playlist of tracks, and play out an emotionally and artistically compelling live mix. The application would simultaneously catalogue tracks into a playlist that audience members can refer back to after leaving the space. This prototype rests on two assumptions:
- An accurate representation of your personal musical taste is stored within the services and platforms on which you consume music.
- Spaces designed to facilitate social functions tend to gather people with similar mindsets (similar musical tastes).
We laid out the necessary tools to fully realize this prototype:
- Artificial narrow intelligence to make smart music recommendations
- Machine learning of DJ techniques to create a comparable experience to a human DJ
- OMI API for cataloguing tracks
- Spotify Web API for aggregating musical taste of the audience members
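As a sketch of what “aggregating musical taste” might mean in practice (the function name, data shape, and ranking rule below are our own illustration, not LÜM’s actual implementation), one simple approach is to pool each audience member’s tracks and rank them by how many people share them:

```python
from collections import Counter

def collective_pool(per_user_tracks, top_n=10):
    """Rank tracks by how many audience members have them in their
    personal taste profile.

    `per_user_tracks` maps a user ID to a list of track IDs — a
    hypothetical shape for the per-listener taste data the Spotify
    Web API would supply.
    """
    counts = Counter()
    for tracks in per_user_tracks.values():
        # Deduplicate so each user counts at most once per track.
        counts.update(set(tracks))
    return [track for track, _ in counts.most_common(top_n)]

# Three hypothetical listeners; "t2" is shared by all of them.
audience = {"a": ["t1", "t2"], "b": ["t2", "t3"], "c": ["t2"]}
pool = collective_pool(audience)
```

A shared-count ranking like this is only one possible heuristic; a real automated DJ would also need to weigh recency, energy, and mixability.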
The value of this prototype would come from the way each participating audience member’s personality is reflected in the mix being played: the effect of Harry Potter’s Mirror of Erised. To enter a space and have your favorite music automatically played, while also hearing the favorite cuts of those around you, creates a social platform that expedites engagement from one like mind to another.
We made a simple Spotify web app to demonstrate the basic functions of the prototype. After the user signs into their Spotify account, the app’s backend pulls their five most recently played tracks and returns 20 recommended tracks seeded on those five collectively. In taking this prototype further, the app will allow multiple users to sign in and generate recommendations based on their recently played tracks, their most played tracks, carefully selected songs, etc.
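As a sketch of that backend flow (the helper names and dummy data below are ours, not the actual app code), the seed selection and recommendation request could look like this, assuming the Spotify Web API’s recently-played and recommendations endpoints:

```python
# Sketch of the backend flow, assuming the standard Spotify Web API
# endpoints GET /v1/me/player/recently-played and GET /v1/recommendations.
# OAuth token handling and the HTTP calls themselves are omitted.

RECENTLY_PLAYED_URL = "https://api.spotify.com/v1/me/player/recently-played"
RECOMMENDATIONS_URL = "https://api.spotify.com/v1/recommendations"

def seed_ids_from_history(history_items, max_seeds=5):
    """Take the track IDs of the most recently played tracks.

    `history_items` mirrors the `items` array returned by the
    recently-played endpoint, newest first; Spotify allows at most
    five seed tracks per recommendations request.
    """
    return [item["track"]["id"] for item in history_items[:max_seeds]]

def recommendation_params(seed_ids, limit=20):
    """Build the query parameters for the recommendations request."""
    return {"seed_tracks": ",".join(seed_ids), "limit": limit}

# Dummy history data in the recently-played response shape:
history = [{"track": {"id": f"track{i}"}} for i in range(7)]
params = recommendation_params(seed_ids_from_history(history))
# params now carries five comma-separated seed IDs and a limit of 20,
# ready to be sent to RECOMMENDATIONS_URL with an Authorization header.
```

Extending this to multiple users would mean collecting seed tracks per signed-in listener before building the collective request.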
We created a mockup of the graphical user interface of LÜM that displays a basic, entry-level experience of entering the space through an Apple Wallet ticket. Once the user is in the app, LÜM aggregates their music taste and pushes that data to the automated DJ when they step inside the geofence (e.g. a dancefloor).
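To illustrate the geofence check (the function, coordinates, and radius below are hypothetical, and a circular fence is only the simplest shape), membership can be tested with a haversine great-circle distance:

```python
import math

def inside_geofence(lat, lon, center_lat, center_lon, radius_m):
    """Return True if (lat, lon) lies within radius_m metres of the
    geofence centre, using the haversine great-circle distance."""
    r_earth = 6371000.0  # mean Earth radius in metres
    phi1 = math.radians(lat)
    phi2 = math.radians(center_lat)
    dphi = math.radians(center_lat - lat)
    dlmb = math.radians(center_lon - lon)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    distance = 2 * r_earth * math.asin(math.sqrt(a))
    return distance <= radius_m

# A dancefloor-sized fence of 50 m around a hypothetical centre point:
on_floor = inside_geofence(0.0003, 0.0, 0.0, 0.0, 50)   # ~33 m away
off_floor = inside_geofence(0.0010, 0.0, 0.0, 0.0, 50)  # ~111 m away
```

In a shipped app this check would more likely be delegated to the platform’s region-monitoring APIs, but the sketch shows the trigger LÜM relies on: taste data is pushed to the DJ only once the listener is physically inside the space.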
Questions and Concerns
The prototype allowed us to contemplate various questions about the artistry of live performance. In showing this prototype, we found that people would not trust an autonomous DJ to create a compelling live experience, and because we are simply substituting a live DJ with a computer, this experience doesn’t line up with most people’s definition of a live performance. However, we did find that people were interested in the social experimentation that this prototype would provoke.
We also questioned whether the mixes this prototype produces can legally exist in the digital music market. What we found was that recordings of mixes cannot be distributed after the fact, since the artists within the mix hold the exclusive right to create derivative works. This is the fundamental problem in distributing DJ mixes, as the essence of DJing is to create a new musical experience from preexisting works. However, the mixes themselves, as “performed” by the algorithm in the space, would fall under public performance rights and would not raise legal issues so long as the venue holds public performance licenses for the material used.
Another concern raised through our prototype testing had more to do with our process. Conversations with a few expert designers showed us that we needed to spend more time going out, showing people our work, and iterating based on their feedback. With our initial prototype, we had focused on illustrating the concept rather than on answering questions about our intended user.
Based on the conversations of the past week, we are planning to iterate on the initial concept to focus on enriching the interactions and relationships between audience members. We plan to explore the values of the different stakeholders involved in the interactions between concertgoers, whether artists, artist management, venue owners, etc. Additionally, we plan to have many more real-world conversations with those stakeholders to better inform our concept, and to incorporate more extreme use cases to better answer our underlying design questions.