On updating to ExoPlayer v2 and how we test our integration

Hello again, my name is Ross Beazley and I’m a developer in BBC Media Services. The BBC Media Services team are responsible for the ingest, transcoding, packaging and IP delivery of all of the media content in the BBC. On the IP side, the BBC’s Standard Media Player (SMP) is used for playback of that media in our mobile and web applications — in essence, SMP is the client SDK of Media Services.


At the start of the project in 2015, we put ExoPlayer at the core of our SMP implementation. We had fully embraced ExoPlayer; in DDD (Domain-Driven Design) parlance, we were in a conformist relationship. ExoPlayer classes existed within the heart of our codebase and also appeared in our UI tier. Our tests were expressed in terms of ExoPlayer classes — e.g. an action in our test might be ExoPlayer signalling READY, or an assertion might be around the values dispatched to ExoPlayer’s seek method.

ExoPlayer now lives in its own subdomain, called a Decoder, and is adapted to conform to our model rather than our model conforming to ExoPlayer’s.

The design of SMP takes inspiration from Hexagonal architecture; “Decoder” is modelled as a port into which we plug different technology-specific adapters — there is one for each type of media we play.

Leveraging this software design has allowed us to perform a phased roll-out of ExoPlayer v2, slicing off each media type as we go; we started with the audio on-demand use case.

This article focuses on the techniques we use when developing a technology-specific adapter in a hexagonal-architecture style system.

Adapters come in two parts

In general we find our adapters are made up of two parts, plus a factory (or builder) to bring them into existence. The two parts are a plain old code object (POCO) and a very tech-specific object. In the specific example of an AudioOnDemandDashDecoder we have:

1. An “API Narrowed” ExoPlayer 2 object.

2. Code that adapts the Decoder API calls and types into ExoPlayer API calls and types — an anticorruption layer between the two subdomains.

3. A factory that builds the object with the correct configuration for the media type.

With ExoPlayer v1 we had willingly disregarded the “Don’t mock what you don’t own” advice (some discussion here). Instead we spent time understanding how ExoPlayer behaved and encoded that in our own mock object. When integrating new releases of ExoPlayer, this hand-made mock object potentially needed updating.

This time, however, we have a better understanding of media and how to encapsulate that understanding in tests — this has allowed us to integration-test our API-narrowed ExoPlayer object. We can then use a mock version of this API-narrowed object in our fast tests. The resultant behaviour of this mock, driven out through collaboration tests on the adapting side, defines the contract that the slow integration tests of the real ExoPlayer instance verify.

We are left with a set of very fast tests (milliseconds rather than seconds) to verify our adaptation code. This slow/fast divide encourages us to keep the conditionals, complexity and other fun stuff on the adaptation side (for more info on contract and collaboration tests refer to this JBrains article). But how do you reliably integration test ExoPlayer?

Integration testing ExoPlayer

WireMock is one of the main technologies behind our integration tests. An instance of WireMock is embedded directly into an on-device connected test. It is used to serve DASH streams to ExoPlayer and simulate different network conditions for our ABR (adaptive bit rate) related tests. The assertions that our tests use include pixel colour sampling and audio sampling to verify things are playing.

The purpose of these tests is not to extensively test that ExoPlayer works — the ExoPlayer team have spent a good amount of effort doing this. Instead, they verify we have integrated ExoPlayer correctly for the features we want. This is why I prefer the term “test of our integration” rather than “integration tests”.

What follows is an explanation of the example tests in this GitHub repository and how we use them to verify a feature has been integrated correctly. At the time of writing there are five tests, A through to E. These verify that we can prepare a stream for playback, produce some audio, render some video and report on bitrate as we ABR up. This is not an exhaustive list but, for the purposes of this article, it should illustrate some of the more interesting techniques for testing your integration of media playback.

Test A: WireMock setup

Before we do anything, you will want to get WireMock working. You can see a basic WireMock test in A_JustWireMock.java: it spins up WireMock (using the JUnit Rule), sends it a request and verifies the request is received.
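In sketch form, a minimal version of that test looks something like the following (class and stub names here are illustrative — see the repository for the real test; it assumes WireMock’s JUnit 4 rule and Java’s built-in HttpURLConnection):

```java
import static com.github.tomakehurst.wiremock.core.WireMockConfiguration.options;
import static com.github.tomakehurst.wiremock.client.WireMock.*;

import com.github.tomakehurst.wiremock.junit.WireMockRule;
import java.net.HttpURLConnection;
import java.net.URL;
import org.junit.Rule;
import org.junit.Test;

public class JustWireMockTest {

    // Spins WireMock up on a free port before each test and tears it down after
    @Rule
    public WireMockRule wireMock = new WireMockRule(options().dynamicPort());

    @Test
    public void receivesARequest() throws Exception {
        wireMock.stubFor(get(urlEqualTo("/ping"))
                .willReturn(aResponse().withStatus(200)));

        URL url = new URL("http://localhost:" + wireMock.port() + "/ping");
        HttpURLConnection connection = (HttpURLConnection) url.openConnection();
        connection.getResponseCode();

        wireMock.verify(getRequestedFor(urlEqualTo("/ping")));
    }
}
```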

You will also have to declare some WireMock dependencies in your module’s Gradle file:
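The exact coordinates and exclusions depend on the WireMock release you pick (the version below is illustrative), but on Android the shape is roughly this — excluding artifacts that clash with classes the platform already bundles:

```groovy
androidTestImplementation("com.github.tomakehurst:wiremock:2.18.0") {
    // Android ships its own (conflicting) copies of these
    exclude group: 'org.apache.httpcomponents', module: 'httpclient'
    exclude group: 'org.ow2.asm', module: 'asm'
}
androidTestImplementation 'org.apache.httpcomponents:httpclient-android:4.3.5.1'
```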

With this in place you can be sure your WireMock setup is correct and can move on to the fun stuff.

Test B: ExoplayerPreparesStream.java

The first part of our journey is preparing a stream for playback. The test asserts ExoPlayer has prepared our stream by asking WireMock to verify it received the request for the manifest (there are a few more things you can check, but let’s just focus on this). In many ways this test will become redundant as we move on to starting playback; it’s up to you whether you keep it or delete it.

This, however, requires us to introduce another WireMock technique: serving files. WireMock can serve files from a static folder, but on an Android phone this probably means the SD card. We have a WireMock Transformer teamed up with an implementation of a FileSource to load files from the test assets folder.

We initialise the WireMockRule with this configuration and then create our stub mappings to always attempt to serve files matching the URL path.
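Putting those pieces together, the rule construction and catch-all stub might look roughly like this (FileSourceAndroidAssetFolder is the class from the example repository; the transformer name and constructor arguments here are illustrative):

```java
@Rule
public WireMockRule wireMock = new WireMockRule(
        wireMockConfig()
                .dynamicPort()
                // Loads "__files" content from the test APK's assets, under "streams"
                .fileSource(new FileSourceAndroidAssetFolder(
                        InstrumentationRegistry.getContext(), "streams"))
                .extensions(new FileFromAssetsTransformer()));

@Before
public void stubAllFileRequests() {
    // Any GET is answered by the transformer, which serves the
    // matching file from the test assets folder
    wireMock.stubFor(get(urlMatching("/.*"))
            .willReturn(aResponse()
                    .withStatus(200)
                    .withTransformers("file-from-assets")));
}
```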

A little side note about the FileSourceAndroidAssetFolder: WireMock expects files to be in a __files folder, and the two underscores are stripped from the path before accessing the assets resources, as shown here. The streams parameter maps to a folder (to keep things tidy) within the assets resources containing the WireMock __files directory.

The test creates a new player for a given URL, waits for it to signal it’s ready and then verifies the request for the MPD was made.
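That flow can be sketched like this. The exact factory and listener signatures shifted between ExoPlayer 2.x releases, so treat this as an outline under those assumptions rather than the repository’s exact code (the URL and user agent are also illustrative):

```java
@Test
public void preparesTheStream() throws Exception {
    String mpdUrl = "http://localhost:" + wireMock.port() + "/streams/audio.mpd";

    // The real test drives the player from a thread with a Looper (e.g. main)
    SimpleExoPlayer player = ExoPlayerFactory.newSimpleInstance(
            InstrumentationRegistry.getTargetContext(), new DefaultTrackSelector());

    final CountDownLatch ready = new CountDownLatch(1);
    player.addListener(new Player.DefaultEventListener() {
        @Override
        public void onPlayerStateChanged(boolean playWhenReady, int playbackState) {
            if (playbackState == Player.STATE_READY) {
                ready.countDown();
            }
        }
    });

    DataSource.Factory dataSourceFactory =
            new DefaultHttpDataSourceFactory("integration-test");
    MediaSource mediaSource = new DashMediaSource.Factory(
            new DefaultDashChunkSource.Factory(dataSourceFactory), dataSourceFactory)
            .createMediaSource(Uri.parse(mpdUrl));
    player.prepare(mediaSource);

    // Wait for ExoPlayer to signal READY, then check the manifest was fetched
    assertTrue(ready.await(10, TimeUnit.SECONDS));
    wireMock.verify(getRequestedFor(urlEqualTo("/streams/audio.mpd")));
}
```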

This may be enough to give you confidence that things are wired up correctly. You can continue to specify more assertions, but try not to over specify.

Next up we look at starting to play this stream.

Test C: ExoplayerStartsPlaybackOfAudio.java

To be kind to the ears, an audio-only stream with a 50Hz tone is used in this test, and things start getting interesting as we employ an android.media.audiofx.Visualizer in our assertion.

The test follows on from the previous test, hence the setup is the same. The test creates a new Visualizer, enables capture of the output mix (the constructor parameter), plays for a bit and then grabs the FFT (Fast Fourier Transform) from the Visualizer.

For the purposes of this test we only check that sound is made, not what frequency bucket maxes out.
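In sketch form, the assertion side looks something like this (Visualizer is the real android.media.audiofx class; the sleep duration and the crude energy check are illustrative):

```java
// Audio session 0 hooks the Visualizer into the global output mix
Visualizer visualizer = new Visualizer(0);
visualizer.setCaptureSize(Visualizer.getCaptureSizeRange()[1]);
visualizer.setEnabled(true);

Thread.sleep(2000); // let playback run for a bit

byte[] fft = new byte[visualizer.getCaptureSize()];
visualizer.getFft(fft);
visualizer.setEnabled(false);
visualizer.release();

// Crude "is any sound being made" check: sum the magnitude of every byte.
// Silence gives a sum near zero; the 50Hz tone pushes it well above that.
int energy = 0;
for (byte b : fft) {
    energy += Math.abs(b);
}
assertTrue("expected some audio energy in the output mix", energy > 0);
```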

Actually, this implementation is wrong and should maybe just add up the real parts, but you get the point. Listen to this test fail by breaking your player.play() implementation — or rather, see it go red then go green as you implement player.play().

The test hooks into the output mix rather than the specific audio session. Accessing the output mix requires the RECORD_AUDIO and MODIFY_AUDIO_SETTINGS permissions; we use the GrantPermissionRule from Android’s test support libs.

You can expose the local audio session ID if you like, to avoid this permissions problem — give it a (test-first) go!

Test D: ExoplayerStartsPlaybackOfVideo.java

Now to add video playback. For this test we are going to play a short video clip of less than 8 seconds. It first displays a full frame of red and then changes to green. The setup for the test is slightly different: we need to spin up an Activity into which we inject a TextureView. We can then sample the pixel colour in the TextureView and assert on its green-ness. Aside from that it’s basically the same — declare WireMock stubs for a locally hosted stream and begin playback.

We use an ActivityTestRule to start our Activity (it’s declared in the test AndroidManifest).

We then set its content view to the newly created TextureView. When its SurfaceTexture is created, we attach a Surface made from it to the player.

The test lets the video play for some time, then samples the colour of the pixel at coordinate [100, 100] and asserts its colour is GREEN.
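Those steps can be sketched as follows. The startPlaybackAndWait() helper is hypothetical (standing in for whatever drives playback past the red frame), and the loose colour-channel assertion is an illustrative alternative to demanding exactly Color.GREEN:

```java
private TextureView textureView;

@Test
public void rendersTheGreenFrame() throws Throwable {
    activityRule.runOnUiThread(() -> {
        textureView = new TextureView(activityRule.getActivity());
        textureView.setSurfaceTextureListener(new TextureView.SurfaceTextureListener() {
            @Override
            public void onSurfaceTextureAvailable(SurfaceTexture texture, int w, int h) {
                // Wrap the SurfaceTexture in a Surface and hand it to ExoPlayer
                player.setVideoSurface(new Surface(texture));
            }
            @Override public void onSurfaceTextureSizeChanged(SurfaceTexture t, int w, int h) {}
            @Override public boolean onSurfaceTextureDestroyed(SurfaceTexture t) { return true; }
            @Override public void onSurfaceTextureUpdated(SurfaceTexture t) {}
        });
        activityRule.getActivity().setContentView(textureView);
    });

    startPlaybackAndWait(); // hypothetical: play past the red frame

    // Sample one pixel and allow for colour-space wobble
    int pixel = textureView.getBitmap().getPixel(100, 100);
    assertTrue(Color.green(pixel) > 200 && Color.red(pixel) < 50);
}
```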

Test E: ABRsThroughRepresentations.java

The test covers two features in one go. We are going to check we have enabled ABR for our streams and also that the player reports the bitrate it’s playing back. Actually, we are going to use the reported bitrate to confirm ABR is working. If that makes you fall off your seat then we could do more, for example check what segments have been requested from WireMock, or have a stream with a different colour for each representation and sample those colours.

The test starts playback and uses a journalling listener to capture the reported bitrates. The test waits until playback has stepped up three rungs on the ABR ladder, or a maximum time based on the segment length.
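One way to journal those bitrates in ExoPlayer 2 is via the video debug listener, which reports the Format of each representation the decoder is fed. The callback set varies between 2.x releases, so this is a sketch under that assumption (the rung-counting method is illustrative):

```java
// Journals every video bitrate ExoPlayer reports as it switches representation;
// attached with player.setVideoDebugListener(journal)
public class BitrateJournal implements VideoRendererEventListener {

    private final List<Integer> bitrates = new CopyOnWriteArrayList<>();

    @Override
    public void onVideoInputFormatChanged(Format format) {
        bitrates.add(format.bitrate);
    }

    // Remaining VideoRendererEventListener callbacks elided (empty implementations)

    /** True once we have seen more than the given number of distinct bitrates. */
    public boolean hasClimbedRungs(int rungs) {
        return new TreeSet<>(bitrates).size() > rungs;
    }
}
```

The test then polls hasClimbedRungs(3) until it returns true, or gives up after the segment-length-based timeout.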

The test uses a feature of WireMock called ChunkedDribbleDelay. With this we can deliver each segment in 1KB blocks over a known timeframe; in essence this allows us to restrict the network connection to a specific bitrate. This bitrate increases over time — i.e. later segments can be delivered faster, so ExoPlayer ABRs up.

The implementation of this is not ideal at the moment, and is mostly buried in the setup of the stubs.

A transformer very similar to the existing static file version is used to apply the bandwidth restriction. The application of the bandwidth restriction all hinges on the use of withChunkedDribbleDelay. The method takes two parameters: the number of chunks, and the time taken to deliver those over the network. When the file is loaded from disk, the number of 1KB chunks is determined and the transfer time needed to meet the target bps (bits per second) is calculated.
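The arithmetic behind those two parameters is simple enough to show directly. This sketch (plain Java; the segment size and target bitrate in main are illustrative) computes the values we would hand to withChunkedDribbleDelay:

```java
public class DribbleDelayCalculator {

    static final int CHUNK_SIZE_BYTES = 1024;

    /** Number of 1KB chunks needed to deliver a file of the given size. */
    static int numberOfChunks(int fileSizeBytes) {
        // Round up so a partial final chunk still gets delivered
        return (fileSizeBytes + CHUNK_SIZE_BYTES - 1) / CHUNK_SIZE_BYTES;
    }

    /** Total delivery time, in milliseconds, simulating a network of the given bitrate. */
    static int totalDeliveryMillis(int fileSizeBytes, int bitsPerSecond) {
        long bits = (long) fileSizeBytes * 8;
        return (int) (bits * 1000 / bitsPerSecond);
    }

    public static void main(String[] args) {
        // A hypothetical 96KB segment throttled to 256kbps
        int fileSize = 96 * 1024;
        System.out.println(numberOfChunks(fileSize));               // 96 chunks
        System.out.println(totalDeliveryMillis(fileSize, 256_000)); // 3072 ms
    }
}
```

The stub then calls withChunkedDribbleDelay(numberOfChunks(size), totalDeliveryMillis(size, bps)) so WireMock dribbles the segment out at roughly the target bitrate.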

The test assumes the later segments can be delivered over a faster network, it uses some very crude string comparison of the name to work this out. I’m sure you could do a much better job.


So to wrap up: when dealing with technology-specific integrations, try not to fall into the trap of testing the feature itself — test your integration of the feature. You will need to know how the underlying technology works in order to set up the test context. I hope you have been inspired by something you have read here and can apply this to other problems you face during your development career.

The BBC is always hiring.