iOS Accessibility

Adrian Russell
Just Eat Takeaway-tech
11 min read · Apr 26, 2021

While many put a great deal of thought into designing their applications to be as visually pleasing as possible, the same cannot always be said for making them accessible to those with particular needs.

A truly beautifully crafted platform should not only look good but must also be a pleasure for all to use. Over the years, our iOS applications have been constantly updated and fettled to produce the best visual experience for our users. The experience of users relying on Voice Over, however, had often been overlooked and had fallen into a sorry state.

App Store reviews and user complaints made it abundantly clear that we had to finally address this issue and make our applications usable with Voice Over. So the initial push was made. To understand what needed to be corrected, we first had to understand how Voice Over works and audit our applications to find the extent of the issues.

How Voice Over works

Voice Over is the built-in iOS screen reader. Rather than tapping directly on interface elements, users employ on-screen swiping gestures to navigate between them. A single element is focused at a time, and Voice Over reads out the details of that element. Not every view is automatically exposed to Voice Over. Rather, views must be explicitly marked as being accessible and have particular attributes set to determine how they are announced and how a user can interact with them.

Accessible elements are identified by a few simple attributes specified by the UIAccessibility protocol:

  • a set of traits that informs Voice Over what the element does and how to describe it
  • a label that identifies the purpose of the element to the user
  • an optional value, for elements that have a changeable value, such as a slider
  • an optional hint that gives the user better context about the element

For example, suppose you have a custom slider view that is being used to set the font size of the text in a document. It could have the following attributes:

let slider = CustomSlider()
slider.isAccessibilityElement = true
slider.accessibilityTraits = .adjustable
slider.accessibilityLabel = "Font size"
slider.accessibilityValue = "14 point"
slider.accessibilityHint = "Only affects the selected text."

Voice Over would read out this slider as: “Font size. 14 point. Adjustable, swipe up or down with one finger to adjust the value. Only affects the selected text.”

The Audit

Now that we understood the basics of how Voice Over worked, we needed to do something we almost never did: use our applications with it. So we turned on Screen Curtain (so we couldn’t cheat and look at our devices) and embarked on an audit of all parts of our applications. Over the course of several dozen hours, we tried to work our way through every possible user journey to discover which ones worked, which worked but were too difficult to follow without being able to see the screen, and which were simply impossible to perform. It was a confusing and often frustrating experience but, in the end, we had an understanding of the obstacles facing some of our users. Ultimately, we identified over one hundred issues that needed to be addressed.

Example issues

Let’s have a look at a few of the issues we uncovered and how we fixed them.

Confusing label content

We have a number of places where we are showing dates. These dates are usually shown in the format dd/MM/yyyy. This is a very common format, and it is perfectly clear to sighted users that 17/07/2020 means “the seventeenth of July 2020”.

This breaks down for users navigating with Voice Over. 17/07/2020 will be read out as “one seven slash zero seven slash two thousand and twenty”. This is confusing to anyone listening to it but, thankfully, there is an easy way to rectify the situation.

All accessible elements have an accessibilityLabel attribute. If this attribute is set for a label, it will be read out by Voice Over instead of the displayed value. Knowing this, we can create a string that contains a fully readable version of the date. Using DateFormatter.localizedString(from: date, dateStyle: .long, timeStyle: .none) with the above date produces a fully spelled-out date such as “17 July 2020”, which Voice Over will happily read out.

let date: Date // the date to be shown in the label
let visualDateFormatter: DateFormatter // the formatter producing the dd/MM/yyyy text shown in the label
let label = UILabel()
label.text = visualDateFormatter.string(from: date)
label.accessibilityLabel = DateFormatter.localizedString(from: date, dateStyle: .long, timeStyle: .none)

Modal dialogs don’t block the content behind them

Our apps use custom modal dialogs in a number of places. When shown, they cover all other content and prevent any other interaction until they are dismissed.

When our custom modal dialogs were presented, however, Voice Over would still allow navigation through all the views behind them.

The reason came down to how we presented those dialogs. We showed them in their own UIWindow, displayed above all other content. Voice Over treated that window as just a sibling view of all the other content on the screen.

Effectively, the window was simply appended to the end of the content over which it was displayed. The result was that Voice Over would read out all the content behind it, allow interaction with it, and only then read out the content of the dialog.

We can fix this by setting the accessibilityViewIsModal property of the dialog’s window. This tells Voice Over that the window is being displayed modally and that it should ignore all of the window’s sibling views. The end result is that only the content of the modal dialog will be read out by Voice Over.
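
As a minimal sketch of that fix, assuming the dialog is presented in its own window as described above (dialogViewController is a hypothetical stand-in for our real dialog controller):

let dialogWindow = UIWindow(frame: UIScreen.main.bounds)
dialogWindow.windowLevel = .alert
dialogWindow.rootViewController = dialogViewController
dialogWindow.makeKeyAndVisible()

// Tell Voice Over the window is modal, so its sibling views are ignored.
dialogWindow.accessibilityViewIsModal = true

// Move Voice Over focus to the newly presented dialog.
UIAccessibility.post(notification: .screenChanged, argument: dialogWindow)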

Logically grouped items are not grouped

We have numerous places within the app where multiple views are composed together to make a visually grouped element. However, Voice Over will read them out separately, creating quite a dissonant experience.

In one such example, a view that visually appears as a single button with two lines of text was treated by Voice Over as three distinct elements, read out as:

  • “Button” (note that there is no label; it is just a button)
  • “Review your order”
  • “Help others find great restaurants”

To fix this, we need to group the elements together as a single accessible element:

view.isAccessibilityElement = true
view.shouldGroupAccessibilityChildren = true
view.accessibilityTraits = .button
view.accessibilityLabel = "Review your order"
view.accessibilityHint = "Help others find great restaurants"

Voice Over will now treat this as a single accessible element and will read out “Review your order. Button. Help others find great restaurants” — much clearer to the user than before.

Poorly labeled icon buttons

We have a number of buttons that simply show an icon rather than text. Where possible, Voice Over will attempt to hint to the user at what the button’s action might be. For example, we have ‘close’ buttons which show a cross icon. Voice Over would read out “Cross. Button. Possibly Close”, which is helpful, but not the best experience for the user. We discovered that we had not been properly setting the accessibilityLabel for those buttons; doing so corrected the issue and improved usability.
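
The fix is a one-line sketch per button (the image name here is illustrative):

let closeButton = UIButton(type: .custom)
closeButton.setImage(UIImage(named: "cross-icon"), for: .normal)

// Describe the action, not the artwork: "Close", not "Cross".
closeButton.accessibilityLabel = "Close"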

Menu navigation was unreliable and would skip and jump items

Our menu pages are based upon table views. This gives us a lot of useful behaviour such as performant scrolling and simple management of list items.

Within our table view cells, we embed a view that contains all the other views. This gives us the flexibility, if we wish, to switch to using collection views or to manage displaying the views manually.

This, however, presented a serious problem for Voice Over. Because our accessible elements were contained within a view inside the cell, the cells themselves were marked as not accessible so that their subviews could be. This is fine for cells that are already being shown: being in the view hierarchy, their accessible subviews are present for Voice Over to see. Cells that are yet to be shown, however, are not in the view hierarchy, so Voice Over is not aware of them, and it will not load the next cells to check what they contain because the cells themselves are marked as not accessible.

This all results in some unfortunate behaviour. Swiping down through the accessible elements will go through all the cells that are visible on the display (or loaded in the view hierarchy), but when it reaches the bottom of those cells it will simply move on to the tab bar and never load any of the other cells. Interestingly, scrolling back up behaves in reverse: it will jump to the final cell in the table view, work its way through all the cells that fit on screen above that cell, and then jump back up to the top, missing all the middle cells.

This was clearly a problem: for longer lists, items in the middle might never be read by Voice Over, and users would have no idea they existed.

The eventual fix was to adjust which elements we were exposing as accessible. Instead of marking the table view cells as isAccessibilityElement = false and having the inner content view within the cell as the accessible element, we set the cell itself to isAccessibilityElement = true and set the cell’s accessibilityLabel to a string describing the contents of that inner view.

This is enough for Voice Over to understand that the cells are accessible and to navigate correctly through all of them in the table.
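
A sketch of that change in a cell subclass, where MenuItem and its properties are hypothetical stand-ins for our real model:

import UIKit

// Hypothetical model type for illustration.
struct MenuItem {
    let name: String
    let price: String
}

final class MenuItemCell: UITableViewCell {
    func configure(with item: MenuItem) {
        // ...populate the inner content view as before...

        // Expose the cell itself rather than its inner subviews, so
        // Voice Over can navigate to every row, loaded or not.
        isAccessibilityElement = true
        accessibilityLabel = "\(item.name), \(item.price)"
    }
}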

Inaccessible controls

Just like our menu pages, our item selection pages are based upon table views. For complex items, such as Subway sandwiches, various extras can be added. When an item is added, a trash can button appears on it so that it can be removed.

With the above fix in place, however, those buttons could no longer be reached to be pressed. The way we restored access to this functionality was with a UIAccessibilityCustomAction:

func configureAccessibilityActions() {
    let removeAction = UIAccessibilityCustomAction(name: "Remove item",
                                                   target: self,
                                                   selector: #selector(remove(_:)))
    accessibilityCustomActions = [removeAction]
}

@objc func remove(_ sender: UIAccessibilityCustomAction) -> Bool {
    // << REMOVE ITEM CODE >>
    return true
}
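
With the custom action in place, Voice Over announces “Actions available” when the cell is focused, and the user can swipe up or down with one finger to choose “Remove item” and then double-tap to perform it.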

Inaccessible custom controls

We have a custom control that draws stars to show the rating for a restaurant and allows a user to enter one. The view inherits from UIControl but contains no labels or any subviews which are accessible by default. As a result, the view was totally inaccessible and would be ignored by Voice Over; the user would never even know it existed.

We need this control both to be accessible, so it can read out the displayed value, and to allow the user to select a rating:

view.isAccessibilityElement = true
view.accessibilityTraits = .adjustable
view.accessibilityLabel = "Star rating"
view.accessibilityValue = "\(value) of \(maxValue) stars"

Controls like UIKit’s UISlider and our rating view can take advantage of the .adjustable accessibility trait, which provides the functionality for users to adjust the control with Voice Over by swiping up and down.

We need to respond to the user swiping to adjust the value. This is done by overriding the following methods:

override public func accessibilityIncrement() {
    // Called when the user swipes up on the focused adjustable element.
    // (A real implementation would also clamp the value to its valid range.)
    rating += 1
    sendActions(for: .valueChanged)
}

override public func accessibilityDecrement() {
    // Called when the user swipes down.
    rating -= 1
    sendActions(for: .valueChanged)
}

Now, when our star rating is focused, Voice Over will read out something along the lines of: “Star rating. 5 of 6 stars. Adjustable, swipe up or down with one finger to adjust the value.”

Wider Voice Over usability considerations

We must think beyond just making sure our standard user flow can be experienced through Voice Over. Sometimes, we must adapt that flow to account for Voice Over in order to provide a much richer experience.

A good example of this is our ephemeral message views, which are shown when an item is removed from the user’s basket. They appear at the bottom of the screen for a few seconds before disappearing again. A user can tap on one to undo removing the item.

When one is shown, Voice Over takes time to focus upon the element and read it out. Unless the user is very quick to interact, the message will disappear and the focus will return to a different element. This can be a very confusing experience, and not one we want to subject users to.

The solution is to change this behaviour only when Voice Over is enabled on the device. Instead of showing the ephemeral message view, we show a UIAlertController with an explicit action to dismiss it and another that allows the removal to be undone.
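
A sketch of that branch, where the alert title is illustrative and undoRemoval() and showEphemeralMessage() are hypothetical stand-ins for our real handlers:

if UIAccessibility.isVoiceOverRunning {
    // Replace the timed message with an alert that waits for the user.
    let alert = UIAlertController(title: "Item removed",
                                  message: nil,
                                  preferredStyle: .alert)
    alert.addAction(UIAlertAction(title: "Undo", style: .default) { _ in
        undoRemoval()
    })
    alert.addAction(UIAlertAction(title: "Dismiss", style: .cancel))
    present(alert, animated: true)
} else {
    showEphemeralMessage()
}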

Voice Control considerations

iOS 13 introduced Voice Control, which allows a user to control the device with speech.

Consider a cell whose accessibilityLabel property is set to “Meal — Large. £9.69”. To select this item with Voice Control, the user would have to speak that whole text, including the price.

This is not the best experience for a user, but we do not want to change the accessibilityLabel to remove the price as we need it for Voice Over. The solution is to use the accessibilityUserInputLabels property of the cell.

let cell: UITableViewCell
cell.accessibilityLabel = "Meal — Large. £9.69"
cell.accessibilityUserInputLabels = ["Meal — Large"]

This keeps the Voice Over experience intact while also allowing Voice Control users to speak just the dish name to select it.

Culture change and the road ahead

After addressing all the issues discovered in our audit, we now have an application that will work with Voice Over and Voice Control to a relatively high standard. We also have a much clearer understanding of the different ways in which people can interact with our applications. Subsequent feedback from users has shown that they are now having a more successful and less confusing experience using our applications.

It is, however, vitally important to ensure we do not slip back into lazy habits and forget about accessibility when adding or rewriting features.

We have decided the best way to avoid this is by shifting our team culture to keep accessibility as a primary focus of our user experience. During our fortnightly chapter meetings, where we show everyone the new features we are working on, we now mandate, where applicable, a demonstration of those features working with Voice Over. We no longer consider a feature complete and ready for release until it has been shown to be accessible to all users.

Although we have made great strides forward, there is still much work to be done. We have focused on pushing our Voice Over support to an even higher standard, but we must maintain that momentum and keep pushing to adopt the other accessibility functionality provided by iOS, such as dynamic content sizing.
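
As a small taste of that remaining work, a minimal sketch of adopting dynamic content sizing for a label:

let label = UILabel()

// Use a text style so the font follows the user's preferred text size.
label.font = UIFont.preferredFont(forTextStyle: .body)

// Update automatically when the user changes the setting.
label.adjustsFontForContentSizeCategory = true

// Allow the text to wrap at larger sizes.
label.numberOfLines = 0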

If we want everyone to use and love our platform, we must make it easy to be used and loved by everyone.

Have you checked our open vacancies?
