Real-Time Music Generation in Unity3D


Imagine you're playing your favorite game, in the middle of a pivotal gameplay sequence. You're this close to beating it, but you keep running into that pesky boss who keeps blowing you to smithereens.

Now imagine that same game without the background music. Does it feel the same? Do you feel that same rush of adrenaline? Does it have the same thrill? I bet not.

Music may be an afterthought for the average gamer, but it is one of the most important ingredients of a game's success. It enhances the gameplay experience by keying in on the player's emotions and shaping their perception of the game world. Oftentimes, music is what makes a horror game legitimately scary or an action game truly heart-pounding. Yet the process has become clichéd, almost always consisting of small bits of music called "cues" that are looped and cross-faded. While this can create emotional, meaningful connections with the player in the short term, the music grows repetitive and dull the more time a player spends in a level. When game music is written this way, it is connected to the game's story but disconnected from the user experience, because it fails to adapt to the player's actions and decisions.

So how can we fix this? With real-time music generation.

Why Real-Time Music Generation?

Real-time music generation presents several advantages over the current standard of looping a handful of cues. Because it is dynamic, the music can adapt and evoke different emotions almost instantaneously in response to the player's interactions with the game world. Used effectively, real-time music can even be prescient, alerting the player to what comes next before it appears on screen, whether it is impending danger or a happy camper.

So if real-time music generation is clearly superior to old-fashioned music cues, why hasn't it become more prevalent in today's games? Well, it is difficult to synthesize sounds from scratch in C and its variants, which are often the go-to programming languages for cross-platform game development due to their speed. Creating an algorithm that can emulate musical structure, form, and melody is no small task either. And while there are many music programming languages dedicated to the creation and processing of sound signals, interfacing them with C and embedding them inside complicated game engines has been a daunting task. Until now.

RTcmix: The (Musical) Agent of Change

RTcmix (Real-Time CMIX) is a programming language for real-time digital sound synthesis, signal processing, and music creation. It was developed in the 1990s as an improved iteration of CMIX (an older music programming language) with real-time processing capabilities, hence the "real-time" in its name. uRTcmix, created a few years ago, is a package for Unity (a cross-platform game engine) that allows RTcmix to be embedded into any VR/AR/desktop/mobile game. Developers can write their algorithmic music generation scripts in RTcmix and invoke them from Unity's default C# environment using uRTcmix, which opens a wide range of possibilities for music that is more intertwined with the game's storyline and player experience. RTcmix and uRTcmix are free, making them accessible to amateur and professional game developers alike.

One of the main advantages of using RTcmix is that the default script parsing/interface language is Minc, a simplified version of C. This significantly reduces the learning curve compared to other music programming languages like Max/MSP and SuperCollider, and it allows developers who are familiar with C-based languages to quickly pick up RTcmix and implement music algorithms in their games. RTcmix can also be used from other environments like Perl and Python, but uRTcmix currently supports only Minc. Another key feature of RTcmix is the PField, a control mechanism that lets developers dynamically adjust an instrument's parameters (e.g., note frequency, filter cutoff frequency, ADSR envelope) while the music is playing. With uRTcmix, PFields can be accessed and modified from Unity based on parameters within the game. RTcmix also ships with several physical models of instruments, so developers can build their algorithms quickly without spending too much time on sound design.
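To give a feel for Minc's C-like syntax, here is a minimal sketch of a score that plays one sine tone whose amplitude is shaped by a table PField. It assumes RTcmix's standard WAVETABLE instrument; the envelope breakpoints and amplitude value are purely illustrative.

```
// Minc sketch: a 2-second sine tone with a table-driven amplitude PField.
rtsetparams(44100, 2)         // sample rate, output channels
load("WAVETABLE")             // load the standard WAVETABLE instrument

env  = maketable("line", 1000, 0,0, 1,1, 9,1, 10,0)  // attack/sustain/release shape
wave = maketable("wave", 1000, "sine")               // the waveform to play

amp = 20000
// WAVETABLE(outskip, duration, amp, freq, pan, wavetable)
WAVETABLE(0, 2.0, amp * env, 440.0, 0.5, wave)
```

Because `amp * env` is a PField expression rather than a fixed number, the amplitude is re-evaluated as the note plays; this is the same mechanism that lets game-side values in Unity steer a running score through uRTcmix.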

As of this writing, there is one other music programming language that can be embedded into Unity: ChucK, created in the early 2000s. ChucK has a unique programming syntax tailored toward composers and researchers for real-time music performance. Through its free companion package, Chunity, ChucK scripts and commands can be executed from within Unity, although I haven't tested it myself.

Case Study: Emtopia

To demonstrate how real-time music generation works in the context of games, I created a small VR application called Emtopia using Unity and uRTcmix that incorporates algorithmic music generation with different emotional themes in each scene. In Emtopia, players explore the world through the viewpoint of an anthropomorphic marshmallow, and the world has two settings: home and the outside world. When the player is home, the music evokes warm, happy feelings to symbolize that home is a relatively safe place, whereas when the player is in the outside world the music is eerie and unsettling to embody the scary, invisible unknowns of a desolate, coronavirus-lockdown world. Below is a short gameplay demonstration of Emtopia.

The music generator's parameters (scale, tempo, time signature) are adjusted to match the emotional theme of the current scene. To keep things relatively simple, the music has a bassline, harmony, and melody, and it follows a 4-bar phrase structure. The harmony and melody are randomly generated from pre-defined lists of chord progressions and rhythms, respectively. The melody is built from central ideas called "motives," which are created and reused with different compositional techniques. This gives the music both variation and unity without it feeling like an endless run-on musical sentence. I was able to create Emtopia's music in approximately 500 lines of RTcmix code. Pretty good, right?
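To illustrate the general approach (this is not Emtopia's actual code), here is a hypothetical Minc sketch in the same spirit: one pre-defined rhythmic motive repeats once per bar across a 4-bar phrase, while the pitches are drawn randomly from a scale, producing variation over a unifying rhythmic idea. The instrument, scale, tempo, and amplitude are all assumptions for the example.

```
// Minc sketch (hypothetical): a 4-bar phrase built from one rhythmic motive.
rtsetparams(44100, 2)
load("WAVETABLE")
wave = maketable("wave", 1000, "sine")
srand(0)

major  = { 0, 2, 4, 5, 7, 9, 11 }   // major-scale degrees, in semitones
rhythm = { 0.5, 0.5, 1.0, 2.0 }     // one pre-defined 4-beat rhythmic motive
tempo  = 120
beat   = 60 / tempo                 // seconds per beat

start = 0
bar = 0
while (bar < 4) {                   // one 4-bar phrase
    i = 0
    while (i < 4) {                 // one pass through the motive per bar
        deg = trand(0, 7)           // random scale degree
        pch = 8.00 + major[deg] * 0.01   // octave.pitchclass notation
        dur = rhythm[i] * beat
        WAVETABLE(start, dur, 15000, cpspch(pch), 0.5, wave)
        start = start + dur
        i = i + 1
    }
    bar = bar + 1
}
```

Keeping the rhythm fixed while randomizing pitch is one of the simplest ways to get the "variation plus unity" balance described above; swapping in different motive and progression lists per scene is what ties the output to the game's emotional themes.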

Conclusion

Music is an integral part of games, but the current cue-based process fails to engage the player or adapt to their actions. Real-time music generation can be used to great effect to enhance the user experience and deepen the player's emotional connection with the storyline. Music generation algorithms can be modified to output any style of music imaginable: programmers are limited only by their imagination for the soundscape they want to create. And Unity and uRTcmix make real-time music generation in games accessible to anyone who wishes to explore this new form of music composition.

So if you’re a developer looking to make game music without causing ear fatigue for the player, then choose real-time music generation to create the best experience possible.

Shashaank N
