Unit Testing with UIGestureRecognizer

With Xcode 7, Apple introduced UI Testing to make it easy to add black-box user interface tests for our apps. While these allow us to automate much of our UI testing, the black-box nature of this environment means that we cannot test the underlying code in our applications while running these kinds of tests. If we want to do that, we need to rely on good old-fashioned unit tests.

Testing Gestures

In the olden days of iOS 3.2, Apple bequeathed UIGestureRecognizer and its numerous subclasses to us, which were an absolute godsend for anyone who’d had to try to achieve what gesture recognisers do using just the UIResponder methods.

Whilst brilliant in use, gesture recognisers can be a pain to test. We can easily access a view’s gesture recognisers through the public gestureRecognizers property, so finding the recognisers we want to test is straightforward. However, forcing a recogniser to fire its associated action methods is not so easy, as the targets and actions specified in the designated initialiser and in -addTarget:action: are not publicly available. To test them, we need a way to make a gesture recogniser behave as if it has recognised a gesture when no such gesture has actually been applied.

Attempt One: Modify the Recogniser State

At first you may think that changing the gesture recogniser’s state to simulate the gesture would work:
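The code sample is missing from this copy of the post. A minimal sketch of the idea might look like the following, importing UIGestureRecognizerSubclass.h so that the state property becomes writable; the hasBeenSwiped flag on SwipeView is a hypothetical stand-in for whatever side effect the action method produces:

```objc
// UIGestureRecognizerSubclass.h redeclares state as readwrite.
#import <UIKit/UIGestureRecognizerSubclass.h>

- (void)testSwipeFiresAction
{
    SwipeView *swipeView = [[SwipeView alloc] initWithFrame:CGRectZero];
    UIGestureRecognizer *recogniser = swipeView.gestureRecognizers.firstObject;

    // Pretend the gesture has been recognised.
    recogniser.state = UIGestureRecognizerStateEnded;

    // Fails: the action message is not delivered until the next runloop.
    XCTAssertTrue(swipeView.hasBeenSwiped);
}
```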

This does actually work — the gesture recogniser will fire its action messages. The problem is that it waits until the next runloop before firing. As such, any asserts made in the test will fail. We could of course make our test asynchronous here and be done with it. However, anyone who’s worked on a project with a serious number of unit tests knows that asynchronous tests can cause all kinds of weird and wonderful test failures in completely unrelated tests, especially on continuous integration systems that may be running on underpowered hardware. This makes developers angry and also makes the baby Jesus cry, so we should avoid asynchronous tests wherever possible.
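For completeness, the asynchronous workaround would look something like this: bounce off the main runloop with an XCTestExpectation before asserting, so the action message has been delivered by the time the assertion runs (hasBeenSwiped is again a hypothetical flag set by the action method):

```objc
XCTestExpectation *expectation = [self expectationWithDescription:@"gesture fired"];

// Let the current runloop iteration finish so the action message is delivered.
dispatch_async(dispatch_get_main_queue(), ^{
    XCTAssertTrue(swipeView.hasBeenSwiped); // hypothetical flag set by the action
    [expectation fulfill];
});

[self waitForExpectationsWithTimeout:1.0 handler:nil];
```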

Attempt Two: Runtime Hackery

In order to force gesture recognisers to fire immediately, I consulted the wonderfully useful iOS Runtime Headers. There is a private class in UIKit called UIGestureRecognizerTarget which is used by gesture recognisers to store their targets. Once we have accessed these targets, we can enumerate them and manually call the action methods using the Objective-C runtime:
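The original code sample is also missing here. A sketch of the technique described, reading the private _targets ivar and the _target/_action ivars of UIGestureRecognizerTarget (these are private API, so the ivar names are assumptions drawn from the runtime headers and could change in any iOS release):

```objc
#import <objc/runtime.h>
#import <UIKit/UIKit.h>

// Force a gesture recogniser to fire its action messages immediately by
// digging its target/action pairs out of the private _targets ivar.
// For use in unit tests only — never ship this in an app.
static void ForceGestureRecognizerToFire(UIGestureRecognizer *recogniser)
{
    Ivar targetsIvar = class_getInstanceVariable([UIGestureRecognizer class], "_targets");
    // An array of private UIGestureRecognizerTarget objects.
    NSArray *targets = object_getIvar(recogniser, targetsIvar);

    Class targetClass = NSClassFromString(@"UIGestureRecognizerTarget");
    Ivar targetIvar = class_getInstanceVariable(targetClass, "_target");
    Ivar actionIvar = class_getInstanceVariable(targetClass, "_action");

    for (id targetContainer in targets) {
        id target = object_getIvar(targetContainer, targetIvar);

        // SEL is not an object, so read it via the ivar's byte offset.
        SEL action = *(SEL *)((char *)(__bridge void *)targetContainer
                              + ivar_getOffset(actionIvar));

#pragma clang diagnostic push
#pragma clang diagnostic ignored "-Warc-performSelector-leaks"
        [target performSelector:action withObject:recogniser];
#pragma clang diagnostic pop
    }
}
```

Because the action messages are sent synchronously, assertions made immediately afterwards in the test will see their effects.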

This approach works and allows us to write synchronous tests.

Conclusion

I have created a sample project, which you can download here, that demonstrates both of these approaches. There is a class called SwipeView whose gesture recogniser is tested in SwipeViewTests.m. One of the tests fails because it uses the first approach; the other passes because it uses the second.

It goes without saying that the runtime hacks shown above should never be shipped in an application: you should never want to trigger gesture recognisers in this way when your app is running normally. The code should live in a file that is only included in unit testing schemes, as your app will likely be rejected during review if it is present. Also, because this kind of hack relies on private API, it could well break in a future version of iOS, so its use should be limited to cases where you really do have to test a gesture. In such scenarios, however, this technique can prove invaluable, and hopefully sharing it here will save a few other programmers several hours of head scratching.
