Accessibility and UI Testing in iOS

Jonathan Chen
24 min read · Aug 16, 2018

--

In 2015, Apple released XCUITest, its UI Testing framework, as a part of Xcode 7. Wanting to be at the forefront of new releases, we were very early adopters, despite a paucity of documentation and a surplus of others struggling with the same sorts of issues, to little or no avail.

Thankfully by now the documentation has been updated and there have been enough answers (whether on StackOverflow or via Apple docs) that one does not always feel like they are grasping at straws just to make the test scroll up half a screen length. However, this did not come without a certain amount of toil and experimentation. I’d like to share the fruits of those labors in this post.

Contents (search with Command+F/Ctrl+F):

  • But First… Accessibility
    — How To Make Your Apps Accessible
  • XCUITest
    — XCUIElement
    — XCUIElementQuery
    — Interacting with the App in a UI Test
    — Testing in a UI Test
    — Applitools
  • Making Your UI Tests CI-friendly
  • Debugging
    — Finding your test’s snapshot
    — The Accessibility Inspector
    — The Console (lldb)
  • Common Pitfalls/Other Tricks
    — Sleep
    — Categories
    — Don’t know which scroll view to use? Use self.app!
  • Conclusion

But First… Accessibility

Accessibility in iOS refers to the suite of features that a device provides to users who have an impairment of some kind that makes it more difficult to use their device in some way. These features include Assistive Touch, which allows you to customize gestures to perform actions the way you’d like; display accommodations, which change the color scheme for those with certain vision impairments; or Dynamic Type, which enlarges text to a size that is easier to read.

The feature we are most concerned with when it comes to UI Testing is VoiceOver. VoiceOver is made for those who are vision-impaired so that they can navigate their device by way of a built-in screen reader, which reads aloud the UI elements on the screen as they swipe through the view hierarchy using several preprogrammed gestures. To activate VoiceOver, navigate to General in the Settings app, then choose Accessibility > VoiceOver > VoiceOver (as of iOS 11.4). You can also navigate to General > Accessibility > Accessibility Shortcut at the bottom of the screen to enable triple-clicking the home or lock button to activate VoiceOver.

Here the screen appears to be locked, but the user is actually interacting with an app via accessibility gestures.

We are concerned with VoiceOver in particular because the set of UIAccessibility properties and methods we must override are directly related to the ways in which we query for elements in our view hierarchies when we write UI tests. If we do not understand the purpose of these properties, we may still write serviceable UI tests, but may well compromise the accessibility of our app. Furthermore, UIAccessibility best practices inherently lend themselves to more UI-testable code, which is obviously something we all want.

Take a few moments and see how navigating feels using VoiceOver. Keep in mind that if you are reading this, using the phone in this way will probably feel unintuitive: the point is to navigate without aiming your taps by sight. You will see that there is a cursor, indicated by the black box surrounding the UI element that VoiceOver has placed its focus on. To get you started, here are a few of the ways vision-impaired users would navigate their device with VoiceOver activated:

  • Double tap: the standard way of selecting an element or an action with VoiceOver activated.
  • One finger left/right swipe: moves the accessibility cursor to the previous/next element in the UI hierarchy, such as to the next cell in a table view, or the next app on your home screen. VoiceOver deals with nested elements in a very particular way, which we will cover later on, but for now just think of the UI hierarchy as a single-layered array-like collection of UI elements that are iterated through linearly.
  • One finger up/down swipe: when the VoiceOver cursor initially places its focus on an element, it will typically read that element’s Accessibility Label (will be discussed later). A moment later, it might speak the words “actions available”. If so, use the one finger up/down swipe to navigate through this list of options, and after the one you desire is selected, double-tap to select it. If there are no actions available, single finger up/down swipe will actually begin to spell out the accessibility label, letter by letter, even denoting which letters are capitalized. You can read letters forwards or backwards this way.
  • Two finger swipe up/down: reads all the way through the entire UI hierarchy. If you swipe up with two fingers, the VoiceOver focus is placed back at the beginning of the hierarchy, and everything is read sequentially from the start. If you swipe down with two fingers, VoiceOver reads to the end of the hierarchy from where it is currently focused.
  • Three finger swipe: whichever direction you swipe, this tends to act as a page turn. On the home screen of your device, right/left three-finger swipes will navigate to your next page of apps, or to your widgets. An up/down three-finger swipe will lead you to Spotlight, or back to your home screen.
  • iPhone X users do actually need to know where the home indicator is, and then swipe up to return to the home screen, or swipe up further to activate the app switcher.

Note that these actions and more do not necessarily have default behaviors built into them out of the box. UIPageViewControllers and UIButtons will behave as you expect if you three-finger swipe or double-tap, respectively, but double-tapping a UIView will do nothing. These actions are still customizable by adopting UIAccessibility protocols. Furthermore, building additional actions into a UIAccessibility element is possible through code as well.
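
To illustrate that last point, here is a minimal sketch (not from the article’s codebase) of attaching a custom VoiceOver action to a plain view; the class and method names are hypothetical:

import UIKit

class SaveableCardView: UIView {
    override init(frame: CGRect) {
        super.init(frame: frame)
        isAccessibilityElement = true
        accessibilityLabel = "Article card"
        // "Save article" appears under "actions available" when VoiceOver
        // focuses this view; a one-finger up/down swipe cycles to it.
        accessibilityCustomActions = [
            UIAccessibilityCustomAction(name: "Save article",
                                        target: self,
                                        selector: #selector(saveArticle))
        ]
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    @objc private func saveArticle() -> Bool {
        // Perform the save here; returning true tells VoiceOver it succeeded.
        return true
    }
}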

Note the rectangle around the FaceTime app — that is the Accessibility cursor.

The reason these interaction styles are important to know is that they are the medium through which Xcode UI tests interact with your app as well. The Xcode UI Test runner will not be able to interact with any elements that are not visible to the accessibility hierarchy, so you must design your UI in an accessibility-friendly manner if you want your UI tests to be successful. Ideally, this means that apps designed with Accessibility best practices in mind will lend themselves most easily to writing UI tests, and vice-versa. The most UI-testable app is an accessibility-friendly app.

How to Make Your Apps Accessible

A good starting point is Apple’s documentation about UIAccessibility and UIAccessibilityContainer. There you can find all the properties and methods at your disposal to manipulate accessibility properties for your UI elements, but here I will cover the essentials of what you will need to deal with:

UIAccessibility: var accessibilityLabel: String? { get set }

accessibilityLabel may well be the property you will deal with the most in UI testing and accessibility. It is the human-readable string that is read by VoiceOver when the cursor places its focus on it. Consider a UIButton with “Next” given as its accessibilityLabel; the default behavior of VoiceOver would then speak “Next, button”. Again, there are ways to configure the type and additional actions for any UI element, which would be reflected in the way VoiceOver responds to selecting said element.

accessibilityLabel is also of paramount importance because it is one of the primary ways that UI tests should query for elements. Tests are able to query the entire accessibility hierarchy for a button with the accessibility label “Next”, and as long as there are not multiple elements with the same label to be dealt with, the element can be interacted with and tested for existence or “hittable”-ity.

A nice thing about accessibilityLabel is that UIKit elements with a text property, like a UILabel or a UIButton with text, typically have their accessibilityLabel assigned simply as the same text that was assigned to them. You shouldn’t typically have to assign accessibility labels to these elements unless you want to further elaborate on the element’s behavior.
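
For image-only controls, though, an explicit label is worth setting. A minimal sketch, with hypothetical names:

import UIKit

let shareButton = UIButton(type: .system)
shareButton.setImage(UIImage(named: "share-icon"), for: .normal)
// Without this, VoiceOver has nothing meaningful to read for an icon-only
// button; with it, VoiceOver speaks "Share article, button".
shareButton.accessibilityLabel = "Share article"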

UIAccessibilityIdentification: var accessibilityIdentifier: String? { get set }

The accessibilityIdentifier property is not interpreted by VoiceOver; rather, it is a developer-facing string that can be used in cases when you wish not to manipulate the accessibilityLabel. This won’t affect your app’s accessibility; it is strictly a way to make UI testing easier. It is often useful in situations when you have dynamically changing content in some kind of table or collection view, and need to query an element by its index or position in its superview. A good way to do this is in your UITableViewDataSource cellForRowAtIndexPath:

cell.accessibilityIdentifier = "cell \(indexPath.row)"

This way, you can be sure that if your table view has at least a certain number of elements, you will be querying an element that exists. In general, a good time to use an accessibilityIdentifier is when you have elements that serve the same function, but may simply have different user-facing names.
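
Putting that together, here is a minimal sketch of the cellForRowAtIndexPath pattern above; the ArticleCell reuse identifier and the articles array are hypothetical:

func tableView(_ tableView: UITableView,
               cellForRowAt indexPath: IndexPath) -> UITableViewCell {
    let cell = tableView.dequeueReusableCell(withIdentifier: "ArticleCell",
                                             for: indexPath)
    cell.textLabel?.text = articles[indexPath.row].headline
    // A stable, index-based identifier for UI tests; VoiceOver never reads it.
    cell.accessibilityIdentifier = "cell \(indexPath.row)"
    return cell
}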

UIAccessibility: var isAccessibilityElement: Bool { get set }

If isAccessibilityElement is not set to true, it will not show up in the accessibility hierarchy. However, once again a lot of UIKit elements have isAccessibilityElement set to true by default. According to Apple docs:

The default value for this property is false unless the receiver is a standard UIKit control, in which case the value is true.

The list of UIControl classes includes: UIButton, UIDatePicker, UIPageControl, UISegmentedControl, UISlider, UIStepper, and UISwitch. You can assume that these elements will have isAccessibilityElement set to true by default.

We frequently deal with this property when dealing with elements with embedded subviews. For example, if your custom UITableViewCell has a UIButton and a UILabel as subviews, you should set isAccessibilityElement to true on your cell, and then override the cell’s accessibilityActivate() method to perform the action that the button would have, set the cell’s accessibilityLabel to the text of its UILabel subview, and if you want your user to think of it as a button, you can set its accessibilityTraits to .button. This will make it so that the only element among these that is added to the accessibility hierarchy is the cell, and none of its subviews. In fact, if an element has isAccessibilityElement set to true, none of its subviews will be visible to the accessibility hierarchy. This is beneficial because 1) VoiceOver users will have fewer elements to swipe through when going through the app (remember, they can only navigate by swiping or having VoiceOver read everything), and 2) there will be fewer accessibility elements in the hierarchy, making it quicker and easier for UI tests to find the right element, and making it less likely for there to be collisions of elements with the same identifier/label.
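
Here is a minimal sketch of that kind of cell; the class name, its subviews, and the button action are hypothetical rather than taken from the USAToday codebase:

import UIKit

class SaveButtonCell: UITableViewCell {
    let titleLabel = UILabel()
    let saveButton = UIButton(type: .system)

    func configure(with title: String) {
        titleLabel.text = title
        // Expose the cell as a single accessibility element; its subviews
        // disappear from the accessibility hierarchy.
        isAccessibilityElement = true
        accessibilityLabel = title
        accessibilityTraits = .button
    }

    // Double-tapping the cell with VoiceOver fires the button's action.
    override func accessibilityActivate() -> Bool {
        saveButton.sendActions(for: .touchUpInside)
        return true
    }
}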

UIAccessibilityContainer: var accessibilityElements: [Any]? { get set }

In order for this property to have any effect, the view that is overriding it must have isAccessibilityElement set to false.

In deeply layered UI with view controllers that have child view controllers (such as with a UIPageViewController or UISplitViewController), setting their subviews’ isAccessibilityElement properties will add the elements to the accessibility hierarchy, but VoiceOver will likely behave erratically when navigating them. This is because each view controller receives the accessibility actions within its own context: if the accessibility cursor is focused on a page view controller’s child view controller, a two-finger swipe up will only read through the elements contained within that child, and will never discover those of its parent. In order for this to behave correctly, the child and parent view controller should implement the accessibilityElements property of the UIAccessibilityContainer informal protocol. The value of this property is an ordered list of elements that are to be considered by VoiceOver.

A solution to the UIPageViewController situation might be for the superview to add its currently visible child view controller, and then its own page control and any other elements to be considered to the accessibilityElements array. This will allow VoiceOver to iterate through its child view controller’s accessibilityElements array, as well as its own elements, in the desired order. Then, the child view controller should add its own elements to accessibilityElements array, as well as elements exposed by its parent view controller that you want to be considered as part of the accessibility hierarchy. This will essentially allow the child view controller to have the same accessibility hierarchy as its parent. Remember: the order that the elements appear in your accessibilityElements array is the order in which VoiceOver will read them.
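
A minimal sketch of the container side of that arrangement, with hypothetical names; the container rebuilds its accessibilityElements whenever the visible page changes:

import UIKit

class CarouselViewController: UIViewController {
    let pageViewController = UIPageViewController(transitionStyle: .scroll,
                                                  navigationOrientation: .horizontal)
    let pageControl = UIPageControl()

    // Call this whenever the visible child view controller changes.
    func updateAccessibilityElements() {
        guard let currentPage = pageViewController.viewControllers?.first else { return }
        // The container itself must not be an accessibility element, and
        // VoiceOver reads the array in order: the visible page, then the page control.
        view.isAccessibilityElement = false
        view.accessibilityElements = [currentPage.view as Any, pageControl]
    }
}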

You may note the murkiness of this design — the child view controller typically ought not to be made aware of properties of its parents in order to remain modular. It is an unfortunate byproduct of the current limitations of UIAccessibility. Thankfully, it should not affect your UITest’s accessibility hierarchy, as you should typically have access to your app’s entire hierarchy. However, the goal in mentioning this property is to combat situations where UI testing development may come at the cost of your app’s accessibility.

In summary, any element you wish to query for a UI test must be given an accessibilityLabel and/or an accessibilityIdentifier, with isAccessibilityElement set to true. There are then different situations where you will want to expose or hide your elements to accessibility in order to ensure the best accessibility experience, as well as further customizing your UI elements’ actions and behaviors. Above all, the most UI-testable app in iOS is an accessible one.

XCUITest

Setting up a UI test in Xcode is easy. You can either choose to “Include UI Tests” when you first create your project, or you can add a UI testing target from the file menu in a preexisting project. Once you have named said target, your first UI test file will be generated, named [YourAppName]UITests.swift (or [YourAppName]UITests.m, if you’re into that).

This is not a primer on how to set up UI/Unit Tests, so I will skip most of the fundamentals of setUp or tearDown. However, you will notice that your UI test file’s setUp method contains a default implementation, where self.continueAfterFailure is set to false, and has the line XCUIApplication().launch(), which launches the app for the test runner to be able to run the tests. These typically should not change, though there are several tricks that you can do in setUp to make your tests more flexible, and friendly for continuous integration, as we do here at Gannett. We will cover some of these later.
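
A minimal sketch of such a setUp, with a hypothetical launch-environment flag thrown in (more on launchEnvironment later):

import XCTest

class NewsUITests: XCTestCase {
    let app = XCUIApplication()

    override func setUp() {
        super.setUp()
        // A UI test failure usually invalidates every step that follows it.
        continueAfterFailure = false
        // A flag the main app can read from ProcessInfo to serve static content.
        app.launchEnvironment["UI_TESTING"] = "1"
        app.launch()
    }
}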

From your Xcode file menu, select New -> Target, and select iOS UI Testing Bundle

XCUIElement

The primary two classes you will be working with are XCUIElement, and XCUIElementQuery. Even the main App object, XCUIApplication, is a subclass of XCUIElement.

Any XCUIElement can be queried for elements, and by querying a XCUIApplication instance, you can query the entire accessibility hierarchy for the app’s current state.

XCUIElements can have many different types, which roughly correlate 1-to-1 with their corresponding UIKit classes. That is, a UIButton in your app is interpreted as XCUIElement.ElementType.button in your UI test. A full list of XCUIElement types can be found here, but here are the ones you will be working with most frequently:

  • .button (UIButton)
  • .staticText (UILabel)
  • .other (UIView)… perhaps the most unintuitive one
  • .cell (UICollectionViewCell or UITableViewCell)

This might lead you to the sneaking (and true) suspicion that none of your UIKit elements exist as UIKit elements in a XCUITest; they only exist as XCUIElements. You cannot, as of yet, test for a XCUIElement’s background color, alpha value, or several other UIKit properties. You can however test if an element exists (element.exists) or if the element is hittable (element.isHittable). You can also access its frame, size class, and enabled state. You can sometimes “implicitly” test if a button or other control behaves correctly by virtue of the fact that if you tapped it and the test is not able to find the next element, maybe the control isn’t doing what it’s supposed to.

XCUIElementQuery

Already we have made much mention of querying for XCUIElements. This is done by way of XCUIElementQuery. Your main entry point for querying elements is through your XCUIApplication instance, which can be created by simply doing let app = XCUIApplication(), or even assigning it as a member of your test class so you can just reference self.app without redeclaring it everywhere. Once you have done this, you can make a query according to any of the XCUIElement types (see XCUIElementTypeQueryProvider documentation). For example, query the buttons in the accessibility hierarchy by using self.app.buttons. This will return you a XCUIElementQuery object, which you can then use to search an XCUIElement’s immediate children or entire list of descendants for the element you want.

As I mentioned before, you can query for XCUIElements using the accessibilityLabel and accessibilityIdentifier properties that you set in your code. The beauty of being able to set either of these properties is that you can query either of these properties in the same way. Feel free to look at the XCUIElementQuery documentation for all of the ways that you can query an element by these identifiers, but the shorthand way that you should use most often is as follows:

self.app.buttons["Next button"]

Each XCUIElementQuery object can be queried via subscript syntax in this manner. What’s more is that the string given to the subscript can be either the accessibilityLabel or the accessibilityIdentifier. XCUIElementQuery will be able to search the accessibility hierarchy for either of these identifiers, and you should only have to worry if the element exists. In certain situations, however, you may have to deal with duplicates, or you may not know exactly what label to query for (such as if you parameterized your accessibilityIdentifier as we did in our collection view and table view cells). Here are some alternative ways of querying that might come in handy in those cases:

  • While your app should ideally never have multiple UI elements with the same accessibilityLabel or accessibilityIdentifier, in cases where this might happen, you should use self.app.buttons.matching(identifier: "Identifier").element(boundBy: index).
    This returns the button in the app with the label/identifier “Identifier” at the given index. Here, the zero-indexed variable index can be any value that you want, as long as it is less than the number of elements in the query. Giving 0 as a value for the index would return the XCUIElement that is the highest (and usually furthest up on the screen) in the accessibility hierarchy.
  • self.app.buttons.matching(predicate).element(boundBy: index) gives the button XCUIElements that match the variable predicate, which is a NSPredicate object. This predicate can be of the form “label BEGINSWITH 'cell'” or “label CONTAINS 'button'”; whatever you choose, your query will return a XCUIElementQuery object with elements matching the given predicate. Once again, .element(boundBy: index) will give you a single element within that query, assuming index is less than the number of elements contained in the query. A short sketch of both querying styles follows this list.
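
Here is that sketch, reusing the hypothetical “cell N” identifiers from the table view example earlier and assuming self.app is the XCUIApplication instance described above:

// Disambiguate duplicates by index: the first "Save" button in the hierarchy.
let firstSaveButton = self.app.buttons.matching(identifier: "Save").element(boundBy: 0)
XCTAssertTrue(firstSaveButton.exists)

// Query by predicate: the third cell whose identifier begins with "cell".
let predicate = NSPredicate(format: "identifier BEGINSWITH 'cell'")
let thirdCell = self.app.cells.matching(predicate).element(boundBy: 2)
XCTAssertTrue(thirdCell.exists)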

One final note I’d like to make in this section is regarding Xcode’s UI Test recorder. While it is ostensibly functional, the code that it outputs is very clunky, unintuitive, and difficult to read. It may be a good tool to use to begin writing simple tests, but I would highly recommend learning these more concise ways of querying for and interacting with elements in your UI tests.

Interacting with the App in a UI Test

An example of a UI Test in progress.

Once you have retrieved your XCUIElement via your query, you can now interact with it as if you were a user holding a device. Here again is the XCUIElement documentation, where you can find a full list of ways to interact with your elements. However here is a list of the most common interactions we have encountered while testing the USAToday app:

  • .tap(): performs a single-finger tap on an element.
  • .swipeLeft(), swipeRight(), swipeUp(), swipeDown(): swipes the element in some direction. The swipe itself is of unknown velocity and pressure, but will suffice in instances where you need to swipe to the next view controller in a UIPageViewController, or the next cell in a paginated UICollectionView.
  • .typeText(String): Once you have tapped a text field and the text cursor appears, call this method to type a string. Then, use the method typeKey(_ key: String, modifierFlags flags: XCUIElement.KeyModifierFlags) to hit a keyboard key to dismiss the keyboard. See XCUIKeyboardKey docs for the full list of keys you can access.
  • .press(forDuration: TimeInterval, thenDragTo: XCUIElement): This method is important because it is how we primarily deal with scrolling a precise amount. If the element you are calling this method on is embedded in a UIScrollView, you can call element.press(forDuration: 0, thenDragTo: endPoint) to scroll to a precise location. This functions like a one-finger drag: if the XCUIElement you give as an argument is above the element you are calling the method on, the “finger” will drag up, and the scroll view will scroll down. You will find the XCUICoordinate class very useful here, and the method element.coordinate(withNormalizedOffset: CGVector). This will give you an XCUICoordinate that is offset by the element’s frame, times the x and y values of the input CGVector object that you pass in. In other words, if your CGVector object has a y value of 1, it will scroll the entire height of the element.
    It’s important to note that you cannot create an XCUICoordinate without an XCUIElement. XCUICoordinates are defined in relation to XCUIElements, so you must use an XCUIElement method to generate the coordinate. However, because the self.app variable we define is a XCUIApplication instance, you can effectively access the entire window using this instance, and can use .coordinate(withNormalizedOffset:) deftly enough to access wherever you need to on the screen. (A short sketch of this coordinate-based scrolling follows this list.)
  • .waitForExistence(timeout: TimeInterval) -> Bool: This method can be used when you expect an element to appear on the screen but need to wait for something like an animation, or a video ad, or simply because of load time. This method was introduced in Xcode 9, though we have used similar API to test features that involve waiting through video ads.
  • addUIInterruptionMonitor(withDescription: String, handler: @escaping (XCUIElement) -> Bool): This method is used for dealing with iOS’s system alerts, including alerts for allowing location or notifications. Be sure to register the monitor prior to any actions that would summon a system prompt, or to be safe, register it before calling self.app.launch(). In the method’s handler, query for the alert’s button that you want to press, and then tap it to dismiss the alert; note that the handler only fires the next time the test interacts with the app.
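
Here is the promised sketch of coordinate-based scrolling, along with an interruption monitor; the alert text is hypothetical:

// A one-finger drag from 80% down the window to 20% down: the finger drags up,
// and the scroll view scrolls down by roughly 60% of the screen height.
let start = self.app.coordinate(withNormalizedOffset: CGVector(dx: 0.5, dy: 0.8))
let end = self.app.coordinate(withNormalizedOffset: CGVector(dx: 0.5, dy: 0.2))
start.press(forDuration: 0, thenDragTo: end)

// Dismiss a location-permission alert whenever one blocks an interaction.
addUIInterruptionMonitor(withDescription: "Location permission") { alert in
    guard alert.buttons["Allow"].exists else { return false }
    alert.buttons["Allow"].tap()
    return true
}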

Testing in a UI Test

You can run a UI test by either pressing Command+U (this will run all of the targets that you have configured to run for your scheme’s Test configuration), or by navigating to the Test navigator, the fourth chiclet from the right in the Xcode left panel, and then clicking the play button that appears when you hover your mouse over a test target, file, or method in that view.

Tap the play button to run all of the tests included in that target, file, or test method.

While navigating the app, it is also possible to test properties of these XCUIElements. Most commonly we test for the existence and hittability of our UI elements, using basic XCTest asserts. You can read about the full list of XCTest asserts in the “Test Assertions” section here; we will not cover them in this UI testing guide.

Just know that you can assert an element’s existence or hittability by calling XCTAssertTrue(element.exists) or XCTAssertTrue(element.isHittable). You can make assertions using any of XCUIElement or XCUIElementQuery properties, so long as they evaluate to a boolean expression.
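
Put together, a small test might look like this minimal sketch, again with hypothetical identifiers:

func testTappingFirstCellShowsArticle() {
    let firstCell = self.app.cells["cell 0"]
    XCTAssertTrue(firstCell.waitForExistence(timeout: 5))
    firstCell.tap()

    // If the tap worked, the article screen's share button should be on screen.
    let shareButton = self.app.buttons["Share article"]
    XCTAssertTrue(shareButton.exists)
    XCTAssertTrue(shareButton.isHittable)
}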

Applitools

Where XCTest assertions have not sufficed, whether that be because of inconsistency during XCUITest’s infancy, or because of erratic behavior by the simulator or machine running the tests, we have used Applitools to supplement our UI testing coverage. Applitools is a third-party visual testing vendor that performs pixel-by-pixel examinations of screenshots we upload to their servers, ensuring visually that our app is in a state that we expect it to be, and alerting us to any differences.

We access Applitools through the XCUIEyes framework that is embedded into our UI testing target via CocoaPods, and simply create a XCUIEyes instance in code. When we want to take a screenshot, we call the method [self.eyes checkWindowWithTag:@"Screenshot name here"], so long as we are within a pair of [self.eyes open] and [self.eyes close] calls (our Applitools test development has been in Objective-C thus far). Once the test is finished running, you can navigate to your Applitools console in the web browser, and check how the server determined your test performed.

An example of how your Applitools dashboard may look.

Making Your UI Tests CI-friendly

In order to add additional confidence to our testing cycles, we integrate our UI test suite with our Jenkins CI workflow, running our UI tests every night. We are able to do this not only for USAToday, but for any other of our market schemes that we choose. This is thanks largely to the flexibility we have built into our UI Test configurations: we are able to pass parameters into our UI test based on how our Jenkins job is configured.

Here is an example: every night, we run our UI tests for USAToday on iPhone X, iPad Air, 9.7 inch iPad Pro, and iPhone 8, on the latest iOS version. We also run the same batch of UI tests for Democrat & Chronicle (our Rochester, NY market) for the same list of devices. We do this while being able to pass in our device and market names for logging purposes, and supporting Applitools integration with Jenkins by passing our Applitools batch ID to our UI test.

Obviously for this to be possible in the first place, the two apps need to be similar enough to be able to utilize the same UI tests. However, we are able to configure the environment in which our tests run through a variety of means:

  • XCUIApplication launchEnvironment: the launchEnvironment property of XCUIApplication is a [String:String] dictionary that is provided to the main app via ProcessInfo.processInfo.environment. This way we can set flags telling the main app if it is running in the context of a UI test, and if so, display static content that will make our tests less variable and run with more consistency, for example. We also use it to set a global timestamp for our article cards to appear the same way, to set a static feed so that we can generate the same content every time for our tests, and to disable certain features such as drag-and-drop (because the dragging mechanism to scroll down a scroll view is compromised by drag-and-drop).
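
A minimal sketch of that handoff, with hypothetical keys:

// In the UI test target, before launch:
self.app.launchEnvironment["UI_TESTING"] = "1"
self.app.launchEnvironment["STATIC_FEED"] = "1"
self.app.launch()

// In the main app, at startup:
if ProcessInfo.processInfo.environment["UI_TESTING"] == "1" {
    // Serve static content, freeze timestamps, disable drag-and-drop, etc.
}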

User-defined variables: In Xcode, select your project file from the file navigator to the left. Select your UITest target from the left list, and select Build Settings in the nav bar at the top. Near the bottom of this screen should be a settings section titled “User-Defined”. Here, you can define variables that can be manipulated by Xcode command line tools, and set prior to the test being run. We primarily use these to store strings passed in by our Jenkins jobs that need to be interpreted by our tests so we know what we are testing, such as the market and device being tested. Note that you need to set default values for these in your project file before Jenkins sets them. Here it may be useful to set values so that your tests can differentiate between a remote test triggered by Jenkins versus a local test triggered by a developer.
However, in order to extract these strings from the preprocessor macro, you need to use two #defines in the following way:
#define MACRO_NAME(n) #n
#define MACRO_STRING(n) MACRO_NAME(n)
The first #define leverages the C “stringizing” operator to generate a C string out of the macro value. The second #define is a necessary layer of indirection: without it, the stringizing operator would be applied before the macro argument is expanded, so you would get the literal text “MACRO” instead of its value.
After this, you can use String(format: "%s", MACRO_STRING(MACRO)) to convert your macro value into a Swift string.

  • xcodebuild: This command is provided to you when you install Command Line Tools for Xcode. This command allows you to build, clean, or, more importantly in this case, test your apps. If you run xcodebuild test, after specifying the scheme, project/workspace file, and a whole host of other parameters, you can run your UI tests from the command line, or via Jenkins/other CI. You can even narrow down which portion of the test suite you want to run by providing -only-testing:UITests/Tests/Test01. Then, you can set values for the user-defined variables you previously defined in your project file, inline in the same command. Your whole xcodebuild command might look something like this:

xcodebuild test -workspace App.xcworkspace -scheme "$SCHEME" -configuration Debug -destination platform="iOS Simulator",name="$DEVICE",OS=$OS -only-testing:UITests/Tests/test01 DEVICE_NAME="${DEVICE}" BATCH_ID=${APPLITOOLS_BATCH_ID} MARKET=${MARKET}

Here, we are testing the app within the App.xcworkspace file at the given scheme, on some device with some OS, testing the UI test named “test01”. Then, we set values for our user-defined variables named “DEVICE_NAME”, “BATCH_ID”, and “MARKET”, and we can subsequently see their values being used at the test’s conclusion. You will see a lot of output as xcodebuild builds your app for testing, after which you should see your test’s results displayed on the screen. Jenkins can then interpret your job to be stable, or unstable, if any of the tests failed.

Our local UI testing script summons the iOS simulator, then proceeds to build the project using xcodebuild.

Debugging

When either your CI server or Xcode has failed your test, if you’ve been watching your tests run the whole time and can see exactly what happened, great. Even if this is the case, the error that Xcode gives for your test failure might not be completely self-explanatory. Here are some steps you can take toward debugging.

Finding your test’s snapshot

After a test run has completed, you will find your app’s latest runs in the rightmost chiclet menu item in the left Xcode panel. Click on any of the rows labeled “Test”, and you should see a long list of steps that your app took while performing its test, including searching for elements, waiting for the app to idle, and performing interactions. At the bottom of this list should be the step your test failed to perform, highlighted in red. You may need to expand the steps further by clicking the drop-down arrows, until you see a block of text highlighted in red, with explanatory failure text and hopefully an accessibility hierarchy. Just above this may be a line that reads “Automatic Screenshot”; if you click the small eye icon that appears as you mouse over the text, you will see a screenshot of what the app looked like when the action failed to happen. This way, you may not even have to be watching your UI Test as it runs (though of course, you should).

I was unable to tap a button because of a ratings prompt!

The Accessibility Inspector

Use the Accessibility inspector to look at what your UI elements’ accessibility values are before you even need to run the test. Use this to make sure that the accessibilityLabel you assigned for your UI element is being found, or if the element is being considered an accessibility element at all. However, you’ll note that you can’t see the accessibilityIdentifier on this view. Nonetheless it is very useful especially to determine if your element is visible to accessibility in the first place. If you find you can’t highlight the view you’re looking for, chances are the superview is intercepting the accessibility focus, and you need to set its isAccessibilityElement to false.

The Console (lldb)

Breakpoints and print statements are as useful as ever in a UI Test. You can set breakpoints in both your UI test and in your app code, and Xcode will pause execution for any/all of them. One thing to note, however, is that because XCUITest runs in a separate process from the app, if Xcode is paused on a breakpoint in your UI Test, you can still interact with the app on your own (good for screwing up the state of your test…).

Once you are stopped on a breakpoint, you can invoke po statements in your Xcode console to print the accessibility hierarchy by simply doing po self.app, or print a property for a particular element that you already have, such as po button.exists. Note that when you print any element, the console will print the list of elements that was queried to find that element. When the element was unsuccessfully queried, the lowest output block will have an empty list of elements.

Being able to print the accessibility hierarchy at any point lets you determine which elements are available to accessibility, with more detail than the Accessibility Inspector. You can see each element’s label, frame, properties, and you can print out their .exists or .isHittable values if needed.

By doing “po self.app”, I am able to see the entire accessibility hierarchy of my app in its current state.

Common Pitfalls/Other Tricks

Sleep

Above all, I want to stress that even after all this time, UI Testing in Xcode seems imperfect. While we may have come a long way, there are many inexplicable inconsistencies in the app where some elements may be untappable or not exist in the accessibility hierarchy for reasons unknown. In a lot of these cases, the easy, but painfully resigned, fix is often simply:

sleep(1)

A lot of times, the app just needs some additional time for animations to complete or for accessibility to update. If your app is heavy in background operations or animations, there is a chance you may need to sleep first.

Categories

A lot of navigation within the USAToday iOS app is repetitive and can be refactored into reusable routines. In our case, we decided to make a XCUIApplication category to extend its functionality to do things like press all of the back buttons until there aren’t any, scroll down an entire page length, search down a scroll view until it finds a certain element, or navigate to some screen that we know exists.

By doing this, we strive toward an ideal where developers who write tests have to write as little of the logic for finding XCUIElements as possible. Rather, developers can leverage the common actions, only querying for elements when absolutely necessary. This should ideally lead to more developers getting involved in UI testing, as they become quicker and easier to write.
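
A minimal Swift sketch of the kind of helpers described above; the method names and the swipe limit are hypothetical:

import XCTest

extension XCUIApplication {
    // Taps "Back" buttons until none remain, returning to the root screen.
    func popToRoot() {
        while navigationBars.buttons["Back"].exists {
            navigationBars.buttons["Back"].tap()
        }
    }

    // Drags upward from the middle of the window until the element is hittable,
    // giving up after a fixed number of swipes.
    func scroll(to element: XCUIElement, maxSwipes: Int = 10) {
        var swipes = 0
        while !element.isHittable && swipes < maxSwipes {
            let start = coordinate(withNormalizedOffset: CGVector(dx: 0.5, dy: 0.7))
            let end = coordinate(withNormalizedOffset: CGVector(dx: 0.5, dy: 0.3))
            start.press(forDuration: 0, thenDragTo: end)
            swipes += 1
        }
    }
}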

Don’t know which scroll view to use? Use self.app!

I mentioned that XCUIApplication inherits from XCUIElement. Therefore it can be interacted with like any other XCUIElement. Where this is useful is when you are faced with a full-screen scroll view that has no reason of its own to have an accessibilityLabel or accessibilityIdentifier. When this is the case, you can call self.app.swipeLeft()/.swipeRight(), or drag between two coordinates built with self.app.coordinate(withNormalizedOffset:), and arbitrarily perform a scroll or swipe from the center of the screen. It will save you loads of time from querying for a scroll view that is behind many layers of cells and other subviews.

Conclusion

UI testing carries many of the same perceptions that unit testing does: it can be cumbersome to write, and oftentimes you may spend more time writing tests for a feature than you did writing the feature itself.

If nothing else, this post was meant to assure you that given the right foundations and setup in the code, writing UI test code can be systematic and easy. The more you flesh out your UI test common actions and procedures, the easier it should be for developers to write tests. The more familiar you are with the querying style and the actions to perform on an element, the quicker it should be to translate how you physically use the app into a UI test.

With XCUITest, we can be quite expressive in how we query for elements and interact with them. With the right configurations, we can simplify the variability in our tests and derive value from them which will shore up our confidence in the apps we build.
