Lessons learned in audio feedback for game and app design

Audio feedback in UI design

When we talk about feedback in interface design, most people think of mouse-overs, animations, colour contrast and inspirational copy that makes the user feel good about using your product. But designers usually focus heavily on visual feedback and forget the other sense we can currently benefit from while using computers: hearing.

Several practical problems keep sound as secondary feedback in digital interface design. In most cases we can only be sure that the screen is turned on, since it carries most of the feedback; we don’t know whether the speakers are on, whether the volume is high enough, or whether the user is wearing headphones.

Even though there is no proof that audio feedback is any more effective than visual feedback under normal circumstances, it is known that sound (music in particular) has a deeper capacity to change our mood and the way we perceive the world around us. That is why a scary movie watched without sound suddenly isn’t as scary anymore.

Reaching our users’ emotions is an efficient way of getting them to pay attention to us, which is why emotional branding is so effective. Even a quick snippet of music lasting only a few seconds can create a mood and leave an impression of happiness or sadness on the user. For example, the error sound on a computer is alarming and unpleasant to hear, but a score-up sound effect in a game is the opposite.


While working on my latest project, an action-puzzle game for iOS, I had the opportunity to create the sound effects for the gameplay and for the menu interface. In the process, I learned a few things about sound design and the user experience of sound:

  • Sharp sounds should be avoided. Sounds that start with a sudden burst of energy (technically speaking, a very short attack) bother users a great deal. Try to smooth sharp sounds with a little fade-in or by lengthening their attack; 10 ms makes a big difference (see the fade-in sketch after this list).
  • Sounds shouldn’t be repetitive. The number one reason people turn off the audio feedback on their device’s keyboard is that they hear the same sound over and over without change. If an effect needs to be present all the time, provide different versions with changes in volume, pitch and frequency (filtering) so that it feels more natural and random. In the real world, no sound is exactly the same when produced twice. To give you an idea, each shot you fire in games like CoD and Battlefield is treated slightly differently and loaded randomly from a sample database so that automatic machine-guns don’t sound like a looping, lifeless drone (the variation sketch after this list shows one simple approach).
  • Sound effects must be designed to interact with each other and with the music. When creating sound effects or a music score, don’t forget that they are going to be played alongside other effects, music and external sounds, even though you don’t have much control over the latter. Make sure the music doesn’t use the same timbres as the sound effects so that they don’t overlap and cause confusion; balance the volume of the different layers so they won’t distort when played together; and implement a ducking system if necessary (a bare-bones ducking sketch also follows the list).
  • Don’t re-use OS sounds for your software’s own features. If you want it to stand out in the OS environment, create exclusive sounds for your notifications and your main features; but, if possible, use the standard, OS-wide sound effects for errors and important messages that need attention, which leads us into the next item…
  • Don’t underestimate the user’s power of recognition. If the OS already has a sound for an error or warning, the user is already familiar with it, and even if it doesn’t fit your app or game’s style, you should consider using it instead of designing a custom solution. This saves users precious seconds they would otherwise spend learning a new sound and attributing it to an important action. They might even ignore your alert if they don’t recognize it as important.
  • Use each OS’s particular sounds on their respective platforms.
    Many users have two devices, a personal one and a professional one, running different OSs, and they don’t want to hear an Android alert sound while playing an iOS game, or vice-versa. Not only might they get confused, but to an iOS user who has never used Android (or vice-versa), the Android (or iPhone, or Windows Phone, etc.) sound effect will come across as a custom, out-of-place sound.
  • Let users choose what they want to hear. This is pretty much standard nowadays, but be sure to include controls for music level, SFX level and speech level, if available. Allow users to adjust each layer of audio and to mute the audio altogether (a minimal settings sketch is included after the list).
  • Produce on a high-quality system, test on low-quality systems. Producing the soundtrack on your $500 headphones might be extremely exciting, but most people will be using your app without headphones, relying on their device’s mono, bass-less speaker, or listening to your game on cheap earbuds while riding the subway. Mix and master for these scenarios and don’t rely on things such as panning or surround effects in your gameplay.
  • Don’t confine sounds to a narrow frequency range. People hear things differently: I might hear more bass than you, and your grandma might not hear treble at all. My headphones could reproduce a lot more midrange than yours, and so on. Try to make most sound effects span a wide range of frequencies so that they are perceived by everyone at some level. Even if you start with a timbre that occupies a small slice of the spectrum, layer it up until it spreads across the whole range.
  • Not everything needs a sound effect. While everything in the real world has a distinct timbre, in the virtual world we are making up sounds for most things, and some of those things shouldn’t have a sound at all; it would just get in the way of a clean experience. In our game, for example, we decided not to have a click effect or an error sound, because the tap action already has visual feedback on all buttons and links, and because error messages are presented by the system in a pop-up style the user knows all too well.
  • Unless music is part of the gameplay experience, keep it low-profile. In games such as 140 the music is a character that changes the gameplay and sets the rhythm of your moves. It has to be up front in the mix and is necessary to make the game a complete experience. In our game, the music is very important for setting a mood, but it is not necessary for the gameplay. Therefore, instead of making the game music punchy with strong bass and complex melodies, I’ve written tunes that fill your ears gently, with simple melodies and little build-up. The beats are simple and there is very little tension between phrases. All the tunes follow a theme and share the same key, but they vary enough that each track has its own personality. They are made to loop indefinitely without tiring you out, and that is a hard task.
  • Create melodies in layers. When writing background music for gameplay, make many variations of the same melody and alternate them between the various instruments on your track. This keeps the theme alive throughout the song without it getting too repetitive, and it holds the listener’s interest. Give the bass a chance to play the lead guitar’s part; you’ll be surprised by the results.
  • Take your time. Not everything will be perfect right away. The music will sound weird at first, and it might take a while for you to grasp the project’s spirit, or you might find that your sound effects don’t reflect the style of the user interface and more research is needed. Take your time, make notes, listen to user feedback, try crazy things, but don’t get satisfied too fast. We can always do better if we try.
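
To make the fade-in tip concrete, here is a minimal sketch of softening a harsh attack by ramping the first few milliseconds of a sample buffer. It assumes you already have the effect decoded into an array of Float samples; the 10 ms duration and 44.1 kHz sample rate are just illustrative defaults, not values from our project.

```swift
import Foundation

/// Softens a harsh attack by applying a short linear fade-in to raw samples.
/// The 10 ms default and 44.1 kHz sample rate are illustrative values.
func applyFadeIn(to samples: inout [Float],
                 durationMs: Double = 10,
                 sampleRate: Double = 44_100) {
    let fadeSampleCount = min(samples.count, Int(sampleRate * durationMs / 1_000))
    guard fadeSampleCount > 0 else { return }
    for i in 0..<fadeSampleCount {
        // Ramp the gain linearly from 0 up to (just under) 1 across the fade window.
        samples[i] *= Float(i) / Float(fadeSampleCount)
    }
}

// Usage: applyFadeIn(to: &decodedSamples)
```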
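For the repetition tip, one simple approach is to keep a handful of pre-recorded variants of the effect and randomize volume and playback rate on every trigger. The sketch below uses AVFoundation’s AVAudioPlayer; the URLs and random ranges are made up, and a true pitch shift would need something like AVAudioUnitTimePitch on an AVAudioEngine graph, but even this is enough to break the lifeless-loop feeling.

```swift
import AVFoundation

/// Plays one of several pre-loaded variants of an effect with slight random
/// changes in volume and playback rate, so repeated triggers never sound
/// identical. The URLs and ranges are placeholders.
final class VariedSoundEffect {
    private let players: [AVAudioPlayer]

    init(variantURLs: [URL]) throws {
        players = try variantURLs.map { url in
            let player = try AVAudioPlayer(contentsOf: url)
            player.enableRate = true   // allow per-play rate changes
            player.prepareToPlay()
            return player
        }
    }

    func play() {
        guard let player = players.randomElement() else { return }
        player.volume = Float.random(in: 0.85...1.0)   // subtle volume variation
        player.rate = Float.random(in: 0.95...1.05)    // subtle rate variation
        player.currentTime = 0
        player.play()
    }
}
```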
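And for the ducking mentioned above, a bare-bones version just dips the music volume while an important effect plays and brings it back afterwards. The levels, fade times and player setup below are placeholders, not the values we shipped.

```swift
import AVFoundation

/// Dips the music volume while an important sound effect plays, then
/// restores it. Levels and fade times are placeholder values.
final class MusicDucker {
    private let music: AVAudioPlayer
    private let normalVolume: Float = 0.8
    private let duckedVolume: Float = 0.3

    init(music: AVAudioPlayer) {
        self.music = music
        music.volume = normalVolume
    }

    /// Call when the effect starts; `duration` is roughly the effect’s length.
    func duck(for duration: TimeInterval) {
        music.setVolume(duckedVolume, fadeDuration: 0.1)   // quick dip
        DispatchQueue.main.asyncAfter(deadline: .now() + duration) { [weak self] in
            guard let self = self else { return }
            self.music.setVolume(self.normalVolume, fadeDuration: 0.4)  // gentle recovery
        }
    }
}
```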
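Finally, the per-layer volume controls can be as simple as a few persisted values that your audio code consults before setting player volumes. This sketch uses UserDefaults with made-up keys and default levels.

```swift
import Foundation

/// Per-layer audio settings persisted with UserDefaults.
/// The keys and default levels are placeholders.
struct AudioSettings {
    private static let store = UserDefaults.standard

    static var musicLevel: Float {
        get { store.object(forKey: "audio.musicLevel") as? Float ?? 1.0 }
        set { store.set(newValue, forKey: "audio.musicLevel") }
    }
    static var sfxLevel: Float {
        get { store.object(forKey: "audio.sfxLevel") as? Float ?? 1.0 }
        set { store.set(newValue, forKey: "audio.sfxLevel") }
    }
    static var speechLevel: Float {
        get { store.object(forKey: "audio.speechLevel") as? Float ?? 1.0 }
        set { store.set(newValue, forKey: "audio.speechLevel") }
    }
    static var isMuted: Bool {
        get { store.bool(forKey: "audio.isMuted") }
        set { store.set(newValue, forKey: "audio.isMuted") }
    }

    /// The volume a player should actually use for a given layer level.
    static func effectiveVolume(forLayerLevel level: Float) -> Float {
        isMuted ? 0 : level
    }
}
```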

I hope these tips are of help to other sound designers planning their projects and to developers who want to think about the complexity of building an audio engine for games. Keep on playing!