FMOD integration on a mobile gaming platform

Jakub Kopriva
Lonely Vertex Development
Jul 30, 2019 · 10 min read


In this article I will give a basic overview of FMOD Studio and its integration with the Unity engine. Note that some details are specific to our use case, a mobile platform. You should visit the FMOD documentation website and other tutorials if you wish to learn how to work with FMOD Studio and its integration libraries.

TL;DR:

If you already know what FMOD and FMOD Studio are, you can skip the intro section.

I will point out the issues we were dealing with and include sample code snippets to give you insight into how we solved them. This is not a mantra you should stick to; I am just describing how we progressed and how we ended up achieving our goals and needs.

Outline

  • Section 1 — Introduction to FMOD and adaptive music
  • Section 2 — Music Manager connecting FMOD integration with our game
  • Section 3 — UX and music in mobile gaming
  • Section 4 — iOS and Android native plugin to control the native music player built into the OS
  • Section 5 — Issues and edge cases we discovered along the way

Section 1 — Introduction to FMOD

In this section I will briefly introduce FMOD and its purpose, and describe what adaptive music is using a couple of examples. If you already know all of this, please feel free to skip this part.

I will start a bit broader. Sorry about that 😃

For an easy comparison, I will use a game (a non-linear story) where the player usually affects the story by his or her actions. As a counterpart to a game, I will use a movie in the cinema that has been edited and has a defined timeline.

When Luke Skywalker steps into the cantina on Tatooine in A New Hope, there is music played by a band. Moviemakers have the timeline under control and can mix the music and SFX during post-production. A game, on the other hand, is a living creature. As a game maker you don’t know when a moment will happen or how long the player will stay in an area (unless it is a pre-scripted cut scene). A better example would be combat music in an RPG like Kingdom Come: Deliverance. Once you are spotted by an enemy the music changes and becomes more dramatic. Starting open combat is accompanied by even more dramatic music. Of course, there are games which have been built around music. For example, Geometry Dash or Beat Saber use music as a core game mechanic, so the gameplay and level design are based on the music and not vice versa.

The process where circumstances change the mood of the music is commonly known as adaptive music. It depends on your needs and how you want to affect or attract the player with the music. It can be a change of ambient music, like in World of Warcraft when the player steps into a new location. Or the music can carry important information, like the player’s health bar status or stealth level.

FMOD is a tool that helps you connect player actions with the music itself. There are multiple ways to work with adaptive music. Try to Google horizontal resequencing or vertical reorchestration and you will find plenty of tutorials. What you should remember from this part is that FMOD is a tool that gives you an easy way to create adaptive music for video games. Also, please keep in mind that FMOD consists of two separate parts.

  1. FMOD Studio — a kind of IDE where you design adaptive music and sound effects (it does not serve for composing, though).
  2. FMOD Integration library — a library that connects the built project from FMOD Studio with your code by providing a high-level API.

Section 2 — Music Manager

In this section we will take a look at how we control the music and SFX from code.

First of all, I don’t know if we are doing it right, but it works for our use case, and we are quite happy with that approach.

During the first scene (loading scene) we instantiate a DontDestroyOnLoad game object. This object is a singleton instance that can be called from anywhere in the code.

MusicManager.cs is a script with a single responsibility: handling the communication between the game and the FMOD project instance.

MusicManager.cs
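
The gist itself is not reproduced in this text version, so here is a minimal sketch of what such a manager can look like. It assumes the FMOD Studio 1.x Unity integration API; the event path "event:/Music" is a placeholder, only the MusicVolume parameter name comes from our project.

```csharp
using UnityEngine;

// Minimal sketch of a MusicManager singleton (assumes the FMOD 1.x Unity
// integration; the event path below is a placeholder).
public class MusicManager : MonoBehaviour
{
    public static MusicManager Instance { get; private set; }

    private FMOD.Studio.EventInstance musicInstance;

    private void Awake()
    {
        // Keep a single instance alive across scene loads.
        if (Instance != null)
        {
            Destroy(gameObject);
            return;
        }
        Instance = this;
        DontDestroyOnLoad(gameObject);

        musicInstance = FMODUnity.RuntimeManager.CreateInstance("event:/Music");
        musicInstance.start();
    }

    // Translates the persistent volume setting into the FMOD MusicVolume parameter.
    public void SetMusicVolume(float musicVolumeParameter)
    {
        // In FMOD 2.x this call is setParameterByName instead.
        musicInstance.setParameterValue("MusicVolume", musicVolumeParameter);
    }
}
```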

The main responsibility of the script is to translate an action that happens in the game into a command that FMOD will understand. This is straightforward. It also holds values that the player can change in the settings, e.g. SFX and music volume. The volume is a persistent setting that is transformed into an FMOD parameter.

As you can see, the musicVolumeParameter changes the FMOD MusicVolume parameter, which has automation bound to the master track volume. That linearly decreases the volume of the output; this is a very simple example. Once you dig into FMOD Studio you realize that you can control anything by setting a number on a parameter instance. You can even control multiple knobs and effects with a single parameter, so it is highly customizable and scalable based on your needs.

Section 3 — UX and music in mobile gaming

Again, I will describe how we wanted the music to behave in our game. This is not a study on how to do it properly, but our observation from the mobile gaming industry, which might be skewed by our expectations.

When two unrelated music tracks play at once, the resulting mix is usually horrible, so we decided to stop the music playing in the background (Spotify or Apple Music) when our game starts. I will also point out some weird behavior of Unity together with the FMOD integration, observed mainly during our development process. First of all, you can force other music players to stop when your game starts by enabling the Mute Other Audio Sources checkbox in the Player settings.

Mute Other Audio Sources option

However, this option will be ignored if you enable Project Settings -> Audio -> Disable Unity Audio. Do not ask me why 😃. From what I have read about this issue, it is the desired behavior. So it's not a bug, it's a feature 🤷‍♂️

After disabling Unity Audio, the Mute Other Audio Sources option is not respected.

The combination of Disable Unity Audio enabled and any value of Mute Other Audio Sources will always stop the native music player (observed on the iOS platform). This is the official setup guidance from the FMOD Troubleshooting page:

It is recommended in general to disable Unity’s inbuilt audio but some platforms will not work with it enabled. * Xbox One, iOS and Wii U require Unity inbuilt audio to be disabled when using the FMOD Integration.
https://www.fmod.com/resources/documentation-api?version=1.10&page=content/generated/engine_new_unity/troubleshooting.html
2019 June 26

In the upcoming 2.0 release, they recommend disabling Unity audio for all platforms.

Let’s complicate things a little bit and do some UX now.

We wanted to stop the native music player only when the user has enabled music in the game, i.e. the player has set the music volume to a value greater than 0 and has not muted the game music with a toggle. To do this we need to load the game, read the settings and only then decide whether to stop the music player or not. Well, the configuration described above stopped the music every time.

We dug deep into the FMOD C# code to find that FMOD stops the native music player every time the app gains focus. Search for the Unity lifecycle methods OnApplicationFocus and OnApplicationPause inside the RuntimeManager.cs script. Commenting out those code fragments will not fix the issue with the music player being stopped on the first launch. It only helps in the case where you suspend the app, hit play in the music player and resume the app; the native player will then keep playing. So we are halfway done. The last part is to find the code that stops the player during the application init phase. By observation, you will find out that the music player is stopped once the application starts, even before showing the Unity logo 😳. What the heck? At that point FMOD is not instantiated yet, so this has nothing to do with FMOD itself. The problem is that the Mute Other Audio Sources flag is not respected when you disable Unity Audio.

Final resolution

We wrote a C# post-process build script that modifies the built Xcode project files. More specifically, we need to add two lines into the file XCODE_BUILD_PATH/Classes/main.mm.

main.mm with added line #1 and line #7

The source code for the post-process build script is shown in the Gist below. A link to the final open-source repository is attached at the end of this article.

NativeMusicControllerPostProcessBuild.cs
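
Since the gist is not embedded here, the sketch below only shows the general shape of such a post-process step. The two concrete lines added to main.mm are visible only in the screenshot above, so the AVFoundation import and the Ambient audio session category used here are an assumption about the usual way to let other audio keep playing, not necessarily our exact lines, and the UnityInitTrampoline() anchor is an assumption about Unity's generated main.mm.

```csharp
using System.IO;
using UnityEditor;
using UnityEditor.Callbacks;

// Rough sketch of the post-process step (place it in an Editor folder).
// The exact lines patched into main.mm are an assumption, see the note above.
public static class NativeMusicControllerPostProcessBuild
{
    private const string ImportLine = "#import <AVFoundation/AVFoundation.h>";
    private const string CategoryLine =
        "    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryAmbient error:nil];";

    [PostProcessBuild]
    public static void OnPostProcessBuild(BuildTarget target, string pathToBuiltProject)
    {
        if (target != BuildTarget.iOS)
            return;

        string mainPath = Path.Combine(pathToBuiltProject, "Classes/main.mm");
        string contents = File.ReadAllText(mainPath);

        // main.mm is not rewritten during an "Append" build, so guard against
        // inserting the same lines twice.
        if (contents.Contains(ImportLine))
            return;

        // Assumes the generated main.mm calls UnityInitTrampoline() inside main().
        contents = ImportLine + "\n" + contents;
        contents = contents.Replace(
            "UnityInitTrampoline();",
            "UnityInitTrampoline();\n" + CategoryLine);

        File.WriteAllText(mainPath, contents);
    }
}
```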

The important part is the check for existing lines. This file is not rewritten during an “Append” build, so without the check you would end up with multiple imports.

Right now we have allowed the native music player to keep playing during gameplay. As I mentioned above, this is not quite the desired behavior: we still need to check whether the game music should be playing and, if so, stop the native player. The first part is very easy. We store the player’s music volume settings in the PlayerPrefs, so during the initialization of our game (before the game music starts to play) we check the condition and stop the native player in case the game music should be present.

NativeMusicController.cs: the methods starting with an underscore are imported from the native plugin.
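
Again, the gist is not embedded in this text version. The sketch below only illustrates the shape of the controller: the native entry point names (_StopNativeMusicPlayer, _ClearAudioFocus) and the PlayerPrefs key are hypothetical placeholders; the real ones are in the repository linked at the end.

```csharp
using System.Runtime.InteropServices;
using UnityEngine;

// Simplified sketch of NativeMusicController.cs. Entry point names and the
// PlayerPrefs key are placeholders, not the actual plugin API.
public class NativeMusicController : MonoBehaviour
{
#if UNITY_IOS && !UNITY_EDITOR
    // Methods starting with an underscore are imported from the native plugin.
    [DllImport("__Internal")] private static extern void _StopNativeMusicPlayer();
    [DllImport("__Internal")] private static extern void _ClearAudioFocus();
#else
    private static void _StopNativeMusicPlayer() { }
    private static void _ClearAudioFocus() { }
#endif

    // Stops the OS music player only when the game music should be audible.
    public void TryToStopMusic()
    {
        bool musicEnabled = PlayerPrefs.GetFloat("MusicVolume", 1f) > 0f;
        if (musicEnabled)
        {
            _StopNativeMusicPlayer();
        }
    }

    // Releases the music instance so it is available to other applications.
    public void ClearAudioFocus()
    {
        _ClearAudioFocus();
    }

    private void OnApplicationPause(bool paused)
    {
        // Re-check the condition when the app returns from the suspended state.
        if (!paused)
        {
            TryToStopMusic();
        }
    }
}
```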

As you can see, we call some kind of native plugin. Its code is present in the repository on GitHub (link at the end of the article). By calling the TryToStopMusic method we check the condition; if it is satisfied we tell the OS music player instance that the game is requesting to be the dominant audio source. By calling ClearAudioFocus we release the music instance to be available to other applications. We call TryToStopMusic during the first initialization of the game and also during the transition between the suspended and running state of the app.

Section 4 — Native plugin

This section will give you an insight into how the native plugin for iOS and Android works. We also open sourced the plugin source code under the LGPL; you can find the link at the end of the article.

Both iOS and Android provide an API that has access to the OS music player. You can set the playback mode that your application requires; you can find all the modes in the iOS and Android docs. A native plugin has three parts. One of them is, of course, the native code written in the native language. Another is a bridge between C# and the native code, which is exported as the API of the plugin. And the last part is the C# code that imports the API provided by the library/plugin and uses it. I will use an iOS example below.

MusicNativePlugin.h header file exposing the API
MusicNativePlugin.m that provides the implementation for the API
NativeMusicController.cs script that imports the API from the native script and calls the exposed methods.

As you can see, it is not that hard to call native code from Unity. But of course, native code calls come at a cost: you have to maintain the code for each platform and respond to breaking changes in the native SDKs. Sometimes, though, it is the only way to achieve the desired functionality.

Section 5 — Issues we are still fighting or have resolved

Let me recap the biggest issues we had, caused mainly by a lack of documentation on the Unity and FMOD side. To be sure how Unity behaves when you set the Disable Unity Audio flag, we had to create an empty project and test the behavior there, because we were not sure whether the issue was caused by the FMOD integration or by our lack of knowledge on how to use it properly.

One issue not mentioned above is related to Unity Cloud Build (UCB), which we use during development; it makes testing the open pull requests inside our team much easier. During a build in UCB, FMOD does not copy the newly built banks into our Streaming Assets folder. It seems like we resolved the issue by enabling the Play mode tests before the build, but this is not proven and we are still investigating it.

References

Jan Slifka wrote an excellent article about automating the build process using Unity Cloud Build: a custom solution that uses BitBucket webhooks and creates a new build once there is a new pull request.

Also, there is a great article about C# script performance by Ondrej Kofron.

There will also be more articles from my colleagues about Unity shaders, and we will show more about our game in the upcoming weeks. So please hit the follow button on Lonely Vertex on Medium, Twitter, Instagram and subscribe for news on our Facebook page.

The GitHub repository with all the supporting code can be found here. If you have any issue you can file it on GitHub, and feel free to create a pull request for any enhancement.

Final words

Please do not hesitate to leave a comment down below if you like the article, want to ask a question or have a better approach. As I said, this is not any kind of best practice or mantra on how to work with music in games; it is the approach that worked for our use case. We wanted to share it because we thought it might be helpful to other beginners in the game development industry.

Next time I will write an article about how we solved localization for our game.

We are Lonely Vertex, a small indie game studio located in Prague, Czech Republic, currently getting ready to release our first game, Sine. You can subscribe to our newsletter, read our development blog posts or follow our progress on Twitter or Facebook.

Cheers, JK.
