Using FFmpeg for faster audio decoding

Building and packaging FFmpeg for Android

Don Turner
5 min read · Mar 22, 2019

I work with developers who are doing interesting things with audio, like building apps for real-time synthesis, drumming, looping, DJing, karaoke and musical games.

These real-time audio apps have some common requirements:

  • They are latency sensitive. They need tight control over when sounds are played and usually need to play sounds with the minimum possible delay between the app and the user’s ears.
  • They need to manipulate raw PCM data. Examples include mixing of two or more sounds (e.g. when mixing songs in a DJ app) or applying audio effects to a sound (such as auto-tuning a user’s voice). This type of processing can only be performed on uncompressed PCM data.

Android has some great APIs for just playing compressed audio files (MediaPlayer, SoundPool) but when it comes to audio decoding there’s a slight problem: the decoding APIs aren’t that fast. This can cause long loading times which degrade the user experience.

Enter FFmpeg. Here’s a comparison of FFmpeg’s MP3 and AAC decoding speeds against the NDK’s MediaCodec API.

Test setup

I have a music file with the following properties:

  • Duration: 5 minutes
  • Channel count: 2 — stereo
  • Sample rate: 48kHz
  • Sample format: 16-bit integers

Decoding device: Google Pixel XL running Android 9.0.0 (Pie)

FFmpeg is over 10 times faster.

That’s a serious performance improvement. The remainder of this article will explain how to build FFmpeg and use it in your app.

Getting started

I’m using a Mac as my host machine. These instructions should be similar for Linux and Windows, but some tweaking of paths and configuration options may be required (e.g. replace instances of `darwin` with your OS).

  • Use Android Studio to download the Android NDK by going to Tools->Android->SDK Manager->SDK Tools->NDK. It will be installed in `$HOME/Library/Android/sdk/ndk-bundle`. You can also download the NDK separately if preferred.
  • Clone the latest FFmpeg source code by running `git clone git://source.ffmpeg.org/ffmpeg.git`. At the time of writing this was version 4.1 — earlier versions won’t compile for Android without some tweaks.
  • Open a Terminal window and set an environment variable named `ANDROID_NDK` which points to the root folder of the NDK you just downloaded.
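For example, on macOS with the Studio-managed NDK, that could look like this (the install location below is the default mentioned above; adjust it if your NDK lives elsewhere):

```shell
# Assumes the default Android Studio NDK location on macOS.
export ANDROID_NDK="$HOME/Library/Android/sdk/ndk-bundle"
echo "$ANDROID_NDK"
```

Adding this line to your shell profile saves re-exporting it in every new terminal session.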

Compiling FFmpeg for Android

Now we need to configure and build FFmpeg for each Android ABI (armeabi-v7a, arm64-v8a, x86 and x86_64). Here’s the build script I used. Each configuration option has a comment to describe what it’s doing.
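As a rough sketch of what one iteration of that script does, here is a hypothetical configure invocation for a single ABI (arm64-v8a). The toolchain paths, compiler name and API level 21 are assumptions that depend on your NDK version; the option set mirrors the configuration printed in the logcat output shown later in the article. For safety the sketch only prints the command — a real script would run it, followed by `make`:

```shell
#!/bin/bash
# Hypothetical sketch: configure flags for one Android ABI (arm64-v8a).
# Toolchain paths and the API level (21) are assumptions for a recent NDK.
ABI=arm64-v8a
TOOLCHAIN=$ANDROID_NDK/toolchains/llvm/prebuilt/darwin-x86_64

CONFIGURE_FLAGS="\
  --prefix=build/$ABI \
  --target-os=android \
  --arch=aarch64 \
  --enable-cross-compile \
  --cc=$TOOLCHAIN/bin/aarch64-linux-android21-clang \
  --strip=$TOOLCHAIN/bin/aarch64-linux-android-strip \
  --enable-small \
  --disable-programs \
  --disable-doc \
  --enable-shared \
  --disable-static \
  --disable-everything \
  --enable-decoder=mp3 \
  --disable-asm"

# Print the command; a real build script would instead run:
#   ./configure $CONFIGURE_FLAGS && make clean && make -j4 && make install
echo ./configure $CONFIGURE_FLAGS
```

`--disable-everything` followed by `--enable-decoder=mp3` is what keeps the resulting libraries small: only the decoders you explicitly enable are compiled in.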

All being well you should have a set of libraries for each Android architecture under the `build` folder.

Using FFmpeg in an app

To use our newly built libraries in our app there are a few things we need to do.

Include the FFmpeg headers

If you haven’t already, add native support to your Android project. Android Studio will use CMake to compile native code. The CMake build is configured using `CMakeLists.txt` which can be found under the cpp folder in the Android view:

I added the following lines under `cmake_minimum_required` to include the FFmpeg headers:

set(FFMPEG_DIR /Users/donturner/Code/ffmpeg/build/${ANDROID_ABI})
include_directories(${FFMPEG_DIR}/include)

`${ANDROID_ABI}` will be substituted for the target ABI at compile time.

Link against the FFmpeg shared libraries

Also within `CMakeLists.txt` add the following to define the `avformat`, `avutil`, `avcodec` and `swresample` libraries.

add_library(avformat SHARED IMPORTED)
set_target_properties(avformat PROPERTIES IMPORTED_LOCATION ${FFMPEG_DIR}/lib/libavformat.so)
add_library(avutil SHARED IMPORTED)
set_target_properties(avutil PROPERTIES IMPORTED_LOCATION ${FFMPEG_DIR}/lib/libavutil.so)
add_library(avcodec SHARED IMPORTED)
set_target_properties(avcodec PROPERTIES IMPORTED_LOCATION ${FFMPEG_DIR}/lib/libavcodec.so)
add_library(swresample SHARED IMPORTED)
set_target_properties(swresample PROPERTIES IMPORTED_LOCATION ${FFMPEG_DIR}/lib/libswresample.so)

Update the following line to link native-lib to these libraries.

target_link_libraries(native-lib avformat avutil avcodec swresample log)

We should now be able to use the FFmpeg APIs in our app. You can test this by including one of the FFmpeg headers in a source file (e.g. native-lib.cpp):

extern "C" {
#include <libavformat/avformat.h>
}

And the following code (note that `__android_log_print` also requires `#include <android/log.h>`):

__android_log_print(ANDROID_LOG_DEBUG, "FFmpeg", "%s", avcodec_configuration());

This will output the configuration which was used to build libavcodec. The app should build successfully, but it won’t run just yet…

Packaging the FFmpeg libraries with our application

Our app will currently crash on startup because the libraries we linked against aren’t in our APK. To solve this we need to update our app/build.gradle to specify a native libraries folder.

Open app/build.gradle and add the following to the android block:

android {
    sourceSets {
        main {
            jniLibs.srcDirs = ['libs']
        }
    }
}

This specifies that the app/libs folder will contain shared libraries which should be packaged in our APK. We just need to copy our newly built FFmpeg libraries into this folder.

Here’s a script which does that. It should be executed from the FFmpeg source folder.
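One possible shape for such a script is sketched below. The `build/<ABI>/lib` layout follows the build step above; so that the sketch is self-contained and runnable anywhere, it first creates placeholder `.so` files in a temporary directory rather than assuming a real FFmpeg build is present:

```shell
#!/bin/bash
# Hypothetical sketch of a copy-to-project.sh script.
# Assumes the build step installed libraries to build/<ABI>/lib.
set -e
ABIS="armeabi-v7a arm64-v8a x86 x86_64"

# Simulate the FFmpeg build output layout with placeholder files so this
# sketch can run without a real build. In practice you would run the copy
# loop below from the FFmpeg source folder instead.
WORK=$(mktemp -d)
cd "$WORK"
for abi in $ABIS; do
  mkdir -p "build/$abi/lib"
  touch "build/$abi/lib/libavcodec.so" "build/$abi/lib/libavformat.so"
done

# The actual copy logic: mirror each ABI's shared libraries into <libs>/<ABI>.
LIBS_DIR=$WORK/app/libs
for abi in $ABIS; do
  mkdir -p "$LIBS_DIR/$abi"
  cp "build/$abi/lib/"*.so "$LIBS_DIR/$abi/"
done

ls "$LIBS_DIR"
```

The per-ABI subfolders matter: the Android packaging step looks for `libs/<ABI>/*.so` and ships each device only the folder matching its architecture.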

Example usage:

$ ./copy-to-project.sh /Code/MyAudioApp/app/libs

Now build and run your app. You should see something similar to the following output in logcat:

2019-02-19 10:50:44.719 8130-8130/com.example.ffmpegtest D/FFmpeg: --prefix=build/x86 --target-os=android --arch=x86 --enable-cross-compile --cc=/Users/donturner/Library/Android/sdk/ndk-bundle/toolchains/llvm/prebuilt/darwin-x86_64/bin/i686-linux-android16-clang --strip=/Users/donturner/Library/Android/sdk/ndk-bundle/toolchains/llvm/prebuilt/darwin-x86_64/bin/i686-linux-android-strip --enable-small --disable-programs --disable-doc --enable-shared --disable-static --disable-everything --enable-decoder=mp3 --disable-asm

If you see this it means that you’re successfully calling an FFmpeg method from your code.

Optimising application size

When we’re ready to publish our app we can use Android App Bundles to ensure that a user receives only the binaries for their Android device (this is known as Dynamic Delivery). All we need to do is publish a Bundle by going to Build->Generate Signed Bundle.

When configured for MP3 decoding, the FFmpeg libraries for a single architecture add around 400KB to the install size. You can verify this by using Build->Analyze APK and choosing build/outputs/bundle/appname.aab.

App bundle analysis showing FFmpeg library file sizes

Summary

You should now have everything you need to start using FFmpeg in your app.

For a full example of how to decode audio using FFmpeg and play it using the Oboe library, check out the RhythmGame sample.

You can also use FFmpeg with the popular Android media player ExoPlayer using this extension.

Finally, when distributing your app ensure that you follow FFmpeg’s licensing checklist and comply with any other applicable licenses.
