How we overhauled VoiceOver accessibility in our iOS applications

Stephen Brown
Reach Product Development
8 min read · Oct 9, 2020

In the UK alone, there are almost 2 million people living with sight loss and countless more who require assistive technology to interact with the digital world.

With that in mind, the Apps team decided it was time to review our iOS applications to check we were proud of the product and ensure it was inclusive for everybody.

Sadly, we found we had dropped the ball in some areas.

What had we done wrong?

Our apps were, for the most part, usable with VoiceOver, but some of our more complex layouts and interactions did not translate well to voice navigation. Context was often lost, and ideas that could be conveyed easily with icons became muddled fragments of unrelated information.

Below is an example of a teaser from our article lists.

VoiceOver would read through each element in this teaser in the order they are displayed on the screen, resulting in the following being read:

  • Coronavirus
  • Scotland bans pubs selling alcohol indoors for 16…
  • Restrictions are even tighter in the five areas…
  • Comment icon
  • Bookmark icon
  • 2h

This is problematic for two reasons:

First, it takes a user six swipes to get the context of an article before even deciding to read it or move on to the next article.

Second, much of this information is irrelevant or meaningless out of context, and the user may not even realise they are swiping through a single article.

‘2h’, for example, is supposed to tell the user that the article was published two hours ago. This becomes worse if the article was published ‘2m’ ago, which is read as ‘2 meters’.

It didn’t take long for us to decide we needed to make some wholesale changes to how our applications are interpreted by VoiceOver.

The tools at your disposal

Luckily for us, Apple built UIKit with accessibility in mind and provides some great tools with the UIAccessibility protocol.

There are six key features within UIAccessibility that we took advantage of when redesigning our VoiceOver experience.

isAccessibilityElement is a boolean variable that tells iOS whether the element can be accessed by assistive applications. It is great for hiding text or icons that provide no information or are not vital to conveying the context of the situation.

accessibilityLabel is a string that describes the element and is the first thing VoiceOver reads for it. Labels are the backbone of VoiceOver and should describe the element (e.g. ‘Back Button’) or provide the content of a label.

accessibilityHint is a string that provides a description of the result of interacting with the element and is read by VoiceOver. It is great for providing context to buttons e.g. ‘Opens the article detail’ or ‘Bookmarks the article’.

accessibilityTraits is a set of traits for the element. They describe what the element is and what capabilities it has, such as a button, an image, or a link. Proper use of traits is critical, as it tells VoiceOver how the user can interact with an element.
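The first four properties cover most everyday cases. As an illustrative sketch (the views and values here are hypothetical, not from our codebase) applied to the teaser icons described earlier:

```swift
import UIKit

// Illustrative sketch — the views here are hypothetical.
func configureAccessibility(commentIcon: UIImageView, bookmarkButton: UIButton) {
    // Hide a decorative icon that conveys no information on its own.
    commentIcon.isAccessibilityElement = false

    // Describe the bookmark button and the result of tapping it.
    bookmarkButton.isAccessibilityElement = true
    bookmarkButton.accessibilityLabel = "Bookmark"
    bookmarkButton.accessibilityHint = "Bookmarks the article"
    bookmarkButton.accessibilityTraits = .button
}
```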

accessibilityElements is an array that belongs to a UIView and can be set to specify which elements should be included for accessibility, as well as the order they should be iterated over. This array is particularly useful for ensuring the order of your elements provides the right context.
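A minimal sketch of reordering, assuming a teaser cell with hypothetical headline, summary, and timestamp labels:

```swift
import UIKit

// Illustrative sketch — the subviews are hypothetical.
// VoiceOver will visit the headline first, then the summary, then the
// timestamp, regardless of where they sit in the view hierarchy.
func orderAccessibilityElements(in cell: UICollectionViewCell,
                                headline: UILabel,
                                summary: UILabel,
                                timestamp: UILabel) {
    cell.contentView.accessibilityElements = [headline, summary, timestamp]
}
```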

UIAccessibility notifications provide you with a mechanism to notify UIAccessibility that you have made changes to a view. This is incredibly important if your application loads anything dynamically.

UIAccessibility extension for notifying of layout changes
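A minimal sketch of such an extension might look like the following; the helper names are illustrative, not the production API:

```swift
import UIKit

extension UIAccessibility {
    // Illustrative helpers — the method names are hypothetical.

    /// Post when content within the current view changes, optionally
    /// moving VoiceOver focus to the supplied element.
    static func notifyLayoutChanged(focusOn element: Any? = nil) {
        post(notification: .layoutChanged, argument: element)
    }

    /// Post when a new view appears that takes up a major portion of
    /// the screen, optionally moving focus to the supplied element.
    static func notifyScreenChanged(focusOn element: Any? = nil) {
        post(notification: .screenChanged, argument: element)
    }
}
```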

The .screenChanged notification gives you the option to supply an object to focus on and should be used when “a new view appears that takes up a major portion of the screen”.

We use this in scenarios where a teaser list loads dynamically, or when we present a screen such as our text resize modal. It allows us to specify where VoiceOver should focus and ensures the user starts in the correct position.

Testing

There are two ways to test your current implementation and any improvements that you make.

Accessibility Inspector

When debugging with the simulator, Xcode provides the Accessibility Inspector, which enables you to cycle through the accessibility elements on your page and view debug information. It can be accessed most easily by opening Spotlight and searching for ‘Accessibility Inspector’.

Accessibility Inspector — Inspecting an article headline

This is a great tool for quickly debugging how VoiceOver or any screen reader will interpret your application. You can cycle through elements on your page and see what accessibility values are set for each one.

VoiceOver On Device

Once you are ready to test on a real device, there is no alternative to using VoiceOver and interacting with your application the way a real user would.

VoiceOver can be activated on your device by going to Settings > Accessibility > VoiceOver and enabling it.

Alternatively, for ease of testing with iOS 14, you can add it as a back tap option, which means you can turn it on and off easily while navigating to your desired application screen manually. VoiceOver can be a little tricky to use at first, but Apple has a great starter guide on the available interactions.

What we did

We employed a couple of different approaches to ensure the user experience of each screen was conducive to achieving its desired outcome.

Teaser list

The main purpose of the teaser list is to provide a list of articles and allow the user to pick the ones that appear interesting so they can read more on the article detail page.

We decided to make each teaser a single accessibility element that provides the relevant context and enables the user to make an educated decision on whether to proceed and read that article or go to the next teaser.

Set accessibility function for a teaser cell
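A sketch of what that cell configuration might look like, reconstructed from the VoiceOver output described below (the type and property names are hypothetical):

```swift
import UIKit

final class TeaserCell: UICollectionViewCell {
    // Hypothetical reconstruction — names are illustrative.
    func setAccessibility(headline: String, accessibleDate: String) {
        // Collapse the whole teaser into a single element, so the
        // icons and relative timestamp are no longer visited one by one.
        isAccessibilityElement = true
        accessibilityTraits = .button
        accessibilityLabel = "Article. \(headline). Published on \(accessibleDate)."
        accessibilityHint = "Opens Article Detail"
    }
}
```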

Now each article is succinctly described in a single swipe action based on the headline, which results in the following being read:

‘Article. Scotland bans pubs selling alcohol indoors for 16… Published on Wednesday 7th October 2020. Opens Article Detail’

This is a massive improvement. It enables the user to quickly work through our teaser list to find an article they want to read, and tells them how to access it.

Articles

Where the teaser list should quickly provide the user with enough information to decide whether to read an article, the article detail page itself should provide all of the article information, as well as a clear set of interactions.

So rather than simplifying the available elements, we worked to add more context to them. Below are some examples of how we did that.

Images

UIImageView provides good accessibility out of the box, supplying the image trait automatically. We still needed to ensure we were providing the alt text to the user so they could understand the meaning of the image.

Set accessibility function for an article’s lead image
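A sketch of that configuration, under the assumption that tapping the image opens a full-screen viewer (the function and parameter names are hypothetical):

```swift
import UIKit

// Hypothetical reconstruction — names are illustrative.
func setAccessibility(on imageView: UIImageView, altText: String?) {
    imageView.isAccessibilityElement = true
    // UIImageView already carries the .image trait; add .button
    // because tapping opens the full-screen, zoomable viewer.
    imageView.accessibilityTraits.insert(.button)
    imageView.accessibilityLabel = altText
    imageView.accessibilityHint = "Opens the image with pinch and zoom"
}
```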

This tells the user that they are focusing on an image, the alt text is read explaining what the image is conveying, and it notifies them they can tap the image to open it, providing pinch and zoom capability.

Dates

When dealing with news articles, the published date is incredibly important information to convey to the user to ensure relevance.

Previously our dates would be read out as they were seen on the screen. For example ‘8/10/2020’ would be read as ‘eight slash ten slash twenty twenty’. This is something that is simple enough to pick up visually, but when read aloud it can be quite confusing to understand what it represents.

Instead of allowing the built-in UILabel accessibility to handle this, we again added context.

Set accessibility function for an article’s published date
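A sketch of the idea: a dedicated, verbose formatter feeds the label's accessibilityLabel. The production implementation produced ordinal dates ("the 8th of October"); this simplified version uses a plain date pattern, and all names are illustrative:

```swift
import Foundation
import UIKit

extension DateFormatter {
    /// Verbose, VoiceOver-friendly rendering, e.g. "8 October 2020".
    static let accessible: DateFormatter = {
        let formatter = DateFormatter()
        formatter.dateFormat = "d MMMM yyyy"
        return formatter
    }()
}

// Hypothetical reconstruction — names are illustrative.
func setAccessibility(on dateLabel: UILabel, publishedDate: Date) {
    // Read as "Published on …" instead of the on-screen "8/10/2020".
    dateLabel.accessibilityLabel =
        "Published on \(DateFormatter.accessible.string(from: publishedDate))"
}
```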

We started by specifying that this is when the article was published. We then extended our date implementation to provide an accessible version, which is much more verbose. The published date label is now read as ‘Published on the 8th of October, 2020’.

Article type

We have a number of different article types including news, opinion, and live events. These are all communicated visually within the article detail page.

The article type can provide important contextual information about how the article should be perceived and interacted with, so it is important for us to add this information for VoiceOver.

With this in mind, we added the article type to the beginning of the article headline.

Set accessibility function for an article’s headline label
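A sketch of that label configuration (the function and parameter names are hypothetical):

```swift
import UIKit

// Hypothetical reconstruction — names are illustrative.
func setAccessibility(on headlineLabel: UILabel,
                      articleType: String,
                      headline: String) {
    // Prefix the headline with its type, e.g.
    // "Opinion Article, What Man United and Liverpool proved…"
    headlineLabel.accessibilityLabel = "\(articleType) Article, \(headline)"
}
```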

Now the article will begin by stating the type of article, then reading the headline, e.g. ‘Opinion Article, What Man United and Liverpool proved in dismal thrashings by Spurs and Villa’.

Discover

The Discover page is used to personalise the news topics that appear in the user’s My News section. Topics can be added and removed, and followed authors and tags can be managed from here.

Discover page of the Mirror app

This screen was problematic because of the complex differences between our iPhone and iPad layouts, which ultimately resulted in an unusable experience.

Each news topic is tappable and opens a preview of articles, and each topic can be added or removed from the My News navigation by using the ‘plus’ or ‘tick’ buttons.

Set accessibility function for discover section cells
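A sketch of the cell configuration the two paragraphs below describe, with hypothetical outlet and parameter names:

```swift
import UIKit

final class DiscoverSectionCell: UICollectionViewCell {
    // Hypothetical reconstruction — outlets and names are illustrative.
    @IBOutlet private var sectionNameLabel: UILabel!
    @IBOutlet private var favouriteButton: UIButton!

    func setAccessibility(sectionName: String, isFavourited: Bool) {
        // The section name acts as a button that opens the topic preview.
        sectionNameLabel.accessibilityTraits = .button
        sectionNameLabel.accessibilityLabel = sectionName
        sectionNameLabel.accessibilityHint = "Opens a preview of \(sectionName) articles"

        // Spell out the result of tapping the favourite button.
        favouriteButton.accessibilityLabel = isFavourited
            ? "Remove \(sectionName) from My News"
            : "Add \(sectionName) to My News"
    }
}
```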

To improve this experience we again added context.

For the section name labels, we provided the button trait and an accessibility hint to make it clear that tapping this will open that topic.

For the favourite buttons, we updated their labels to explicitly state the result of interacting with the button.

With these improvements, the user is now able to cycle through the Discover screen and understand how they can interact with the elements, as well as understand their current state.

Next steps

Testing

After implementing these improvements we ran a round of internal testing to verify what we had done worked and provided a usable experience.

The obvious flaw is that we are not the end users of this feature.

With that in mind, we have scheduled a round of user research testing, with the aim of putting our improvements in front of app users who will be using these features in the real world, so we can gather feedback and keep iterating on what we have done.

Other Accessibility

Now that we are in a more comfortable place with VoiceOver accessibility, we can move into other areas.

The next focus will be looking at how we can improve the use of fonts and colours across the app to provide an improved reading experience.


iOS Developer at Reach PLC, building apps for @DailyMirror and other titles. Follow me on Twitter: @IAmStephenBrown