Shazam on iOS: UI testing + Shazam Pro User Tip

Nikita Kardakov
4 min read · May 16, 2018

--

Here at Shazam we have been trying to use UI testing from the very beginning. Do you remember writing those test scripts in JavaScript? But until very recently we were unable to really integrate it into our process. In the JavaScript era (seriously, can anybody remember what that technology was called?) everybody had to write code to dismiss popups, and even the most straightforward tests became unusable after introducing even the smallest change.

It certainly got better when XCUITest was introduced, but we were still missing one thing – inter-app communication and multi-app testing. When we had to use CFNotificationCenter for these purposes it didn’t feel right, and we abandoned the idea for a while.

But since Xcode 9 we can now launch, terminate and communicate between different apps simply by referencing their bundle identifiers.

// Launch the first app with a custom environment
let app1 = XCUIApplication(bundleIdentifier: "com.Shazam.app1")
app1.launchEnvironment = ["key": "value"]
app1.launch()

// Switch apps: terminate the first one and launch the second
let app2 = XCUIApplication(bundleIdentifier: "com.Shazam.app2")
app1.terminate()
app2.launch()

So now we can run multiple apps. Just don’t forget to add all the apps to the “Target Dependencies” of your UITests target, and Xcode will install them on your device or simulator automatically.
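One detail worth knowing: the values you pass via launchEnvironment arrive inside the launched app as ordinary environment variables, so the app can read them with ProcessInfo. A minimal sketch of the receiving side, assuming the app checks its environment at launch (here using the “url” key we pass to our playback app below):

// Inside the launched app, e.g. early in application(_:didFinishLaunchingWithOptions:)
// launchEnvironment values show up as plain environment variables.
if let urlString = ProcessInfo.processInfo.environment["url"] {
    // Configure the app for the test run, e.g. load the audio from this URL.
    print("Test audio URL: \(urlString)")
}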

At Shazam we care about Shazaming, and with these simple tools we can now build UI tests for our core functionality.

Listening to Kent Beck at a workshop many years ago, I remember him mentioning audio as one of the hardest things to write automated tests for. That is certainly true (we have more sophisticated tests for our internal audio tools), but with these new tools we can create a simple setup to test what we need.

We ask PlaybackApp to play some audio (we use external URLs to download and play the audio files).

// Starting the playback app and passing an audio file URL
let playbackApplication = XCUIApplication(bundleIdentifier: "com.Shazam.TestPlaybackApp")
playbackApplication.launchEnvironment = ["url": playbackTest.urlString]
playbackApplication.launch()
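On the other side, a playback helper like this doesn’t need to be much more than a thin wrapper around the system player. A minimal sketch of what such an app could do with the URL it receives (the real TestPlaybackApp has a bit more plumbing, so treat this as an illustration):

import AVFoundation

// Hypothetical playback helper: read the URL from the launch environment and play it.
final class TestPlayback {
    private var player: AVPlayer?

    func startFromLaunchEnvironment() {
        guard let urlString = ProcessInfo.processInfo.environment["url"],
              let url = URL(string: urlString) else { return }
        // AVPlayer can stream the remote file directly, which is enough for a test setup.
        player = AVPlayer(url: url)
        player?.play()
    }
}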

And now we launch the Shazam app and start Shazaming!

// Starting the Shazam app
let shazam = XCUIApplication(bundleIdentifier: "com.Shazam.***")
shazam.launch()
let shazamButton = ...
shazamButton.tap()
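The button lookup is deliberately elided above; in practice you would query it via an accessibility identifier. Something along these lines, where the identifier is made up purely for illustration:

// "shazam-button" is a hypothetical accessibility identifier, for illustration only
let shazamButton = shazam.buttons["shazam-button"]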

Now we wait for the right title to show up (we actually do it a little differently, but this would work as well).

let artistLabel = shazam.staticTexts["Yusef Lateef"]
let titleLabel = shazam.staticTexts["Love Theme From Spartacus"]
// oh yes, we have good taste here
let exists = NSPredicate(format: "exists == 1")

expectation(for: exists, evaluatedWith: artistLabel, handler: nil)
expectation(for: exists, evaluatedWith: titleLabel, handler: nil)
waitForExpectations(timeout: 10, handler: nil)
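Since Xcode 9 there is also a shorter way to express the same wait, if you don’t need the full expectation machinery:

// Equivalent shorthand: wait for the labels directly
XCTAssertTrue(artistLabel.waitForExistence(timeout: 10))
XCTAssertTrue(titleLabel.waitForExistence(timeout: 10))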

We’re good at what we do and the actual test runs pretty fast!

Even from this simple example you can imagine many things you can do with it. To name a few: we mix noise into some of the tracks, we run many tests like this in a row using Xcode bots on different simulators and devices, and we report the results through our reporting system. Based on that we know the most recent “Shazaming time”, and we can anticipate problems with our most important functionality before hundreds of millions of users do. We have other kinds of UI tests as well, checking the most important flows on different devices and in different languages, but that’s out of scope for today.
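To give a flavour of how a “Shazaming time” number can fall out of a test like this, here is a rough sketch that simply timestamps the tap and the moment the result appears (our real reporting pipeline is more involved than this):

// Rough sketch: time from tapping the button to the result label appearing
let start = Date()
shazamButton.tap()
XCTAssertTrue(titleLabel.waitForExistence(timeout: 10))
let shazamingTime = Date().timeIntervalSince(start)
print("Shazaming time: \(shazamingTime)s")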

Bonus Track: Shazam Pro User Tip

As you might have read in other posts on “Inside Shazam”, we really like to measure things. One of the key metrics is the “Shazaming time” I mentioned above, along with the “Recognition rate” (the percentage of users who end up with the name of the track playing around them). We monitor these metrics all the time, because they are what makes our users happy.

And we also measure these things in relation to other conditions. A while ago, as part of other experiments, we decided to check how users hold their phones before and after Shazaming.

As you know, the iPhone supports different orientations: portrait, portrait upside down, landscape right and so on. If you have ever used device orientation in your code, you know it’s not very accurate, but there’s nothing wrong with checking it anyway.
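For reference, this is the kind of orientation check we mean; a minimal sketch on the app side (the actual measurement goes through our analytics, so this is only illustrative):

import UIKit

// Minimal sketch: read the current device orientation when a Shazam starts.
// Assumes device orientation notifications have been enabled elsewhere.
func currentOrientationName() -> String {
    switch UIDevice.current.orientation {
    case .portrait: return "portrait"
    case .portraitUpsideDown: return "portrait upside down"
    case .landscapeLeft: return "landscape left"
    case .landscapeRight: return "landscape right"
    default: return "flat / unknown"
    }
}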

So we measured it for a while, and we’re happy to report that holding your phone in portrait mode the whole time gives you a statistically significant boost in getting your track right! So please stop turning your phone, hold it still and wait a few seconds. Like this.

Hold your phone in portrait mode!
