Lessons in iOS VoiceOver Accessibility

How we broke our app for blind users, and what we did to fix it

Photo by Jason Rosewell on Unsplash

When our founder & CEO Ethan came up with the idea for Aaptiv, he decided to go with audio-only because (1) he didn’t think people should be glued to a screen while working out, and (2) audio workouts are cheaper to produce than video. What we didn’t foresee was that Aaptiv’s audio-only approach to fitness would become a major hit with the blind community.

In fact, it was only after we released a version of the app with a redesigned homepage that we learned just how many highly engaged blind users we had. Overnight, we received hundreds of emails and tweets from blind Aaptiv members telling us that the VoiceOver functionality in the app was completely broken, and they could no longer navigate to the workouts.

Left: Home screen prior to the redesign, where VoiceOver read aloud every workout category. Right: Redesigned home screen, where VoiceOver read only the section titles “Recommended for You” and “Browse all Categories”.

We were caught off-guard by the outcry and immediately felt terrible; for many of our blind users, Aaptiv is a critical part of their fitness routine. And so, we set about familiarizing ourselves with how VoiceOver works in order to fix the bug.

We started by opening the iOS Settings app, navigating to General, then Accessibility, and turning on VoiceOver. From there, we read the brief description of the VoiceOver gestures (see screenshot) and set about making each workout category link detectable in VoiceOver.

Warning: This is not how blind users actually use VoiceOver.

We quickly realized that introducing the horizontal collection of categories into our otherwise vanilla tableView had inadvertently thrown a wrench into accessibility. For our first attempt at a fix, we simply configured the tableViewCell containing the collectionView with:

// Hide the containing cell and expose the collection view as a single element
horizontalSectionCell.isAccessibilityElement = false
horizontalSectionCell.collectionView.isAccessibilityElement = true

In our testing, VoiceOver read the workout category names aloud on a single tap, and opened the corresponding workout lists on a double tap. Three-finger swipes scrolled up and down successfully. However, when we rolled out the fix, our blind users continued to report that VoiceOver was not working. After several rounds of back-and-forth with our customer service team and community manager, we jumped on a video conference with a group of our blind members to get to the root of the problem.

Specifically, when we asked our blind users to describe in painstaking detail what they do after opening our app, every single one of them mentioned flicking left and right to “see” the next object on the page, and double-tapping to “click” on it. Meanwhile, our claim that we had fixed “triple finger scrolling” and “single taps” was met with confusion. “Umm… we never scroll with three fingers,” they replied.

Armed with a better understanding of how our blind members actually use VoiceOver, we set about fixing the feature a second time. We noticed that when flicking left and right, accessibility did not traverse the cells in our horizontal section; it focused on the collectionView as a single element. We realized we had to dig deeper inside our section cell. The following let VoiceOver move into, and out of, our section successfully:

// The cell acts as a container, so it must not be an element itself;
// VoiceOver then traverses the children listed in accessibilityElements
horizontalSectionCell.isAccessibilityElement = false
horizontalSectionCell.accessibilityElements = [titleLabel, collectionView]

VoiceOver was also not describing each cell completely, so we moved accessibility control down to the granularity of the collectionViewCell. Then, for each cell in our horizontalSectionCell, the following did the trick:

// Make each category cell its own focusable element with a descriptive label
cell.isAccessibilityElement = true
cell.accessibilityLabel = accessibilityLabel(forItem: item)

The accessibilityLabel(forItem:) helper on the last line returns a comma-delimited string for VoiceOver to read, including the item’s title, subtitle, status (“in progress”), and so on. With this in place, flicking traversed the entire horizontal section successfully and then continued its journey down the tableView.
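As a rough sketch, a label builder like the one described above might look as follows. The WorkoutItem type and its field names here are illustrative assumptions, not our actual model:

```swift
import Foundation

// Hypothetical item model; the real Aaptiv model differs.
struct WorkoutItem {
    let title: String
    let subtitle: String?
    let status: String?   // e.g. "in progress"
}

// Build a comma-delimited string for VoiceOver, skipping any missing pieces.
func accessibilityLabel(forItem item: WorkoutItem) -> String {
    [item.title, item.subtitle, item.status]
        .compactMap { $0 }
        .joined(separator: ", ")
}
```

Joining only the non-nil pieces keeps VoiceOver from reading stray commas or empty fragments when a workout has no subtitle or status.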

We released this second fix, and our blind users were delighted that VoiceOver was finally working again (though understandably exasperated that it took us so long). Then, with the crisis at hand resolved, we set about improving our QA process to catch future VoiceOver bugs, and revisited a number of upcoming features to make sure they accounted for VoiceOver functionality.

Specifically, we implemented a number of standard practices as part of our product development process:

  • Always be testing: Before our home screen snafu, we didn’t include VoiceOver QA in our normal testing process. Afterward, we made VoiceOver testing a QA step in every feature launch, especially when adding new screens to the app.
  • Annotate images and videos: A number of our upcoming feature designs include image-only modules and/or silent videos. We reworked the requirements for these features to include customer-facing titles and descriptions so that blind customers can still understand what we’re trying to communicate.
  • When in doubt, check with customers: Our initial video conference with blind users was so enlightening because they showed us how they actually use VoiceOver. When we started building keyword search, we asked one of our blind users to demonstrate a text-based search in another iOS app, a far more effective lesson than simply reading developer docs.

Ultimately, our commitment to making sure voiceover works comes back to our company mission. We believe that everyone deserves an affordable solution to help them feel healthy, strong, and confident about their bodies — whether they can see or not.


Co-authored with Brian Maci and edited by Kathleen Yanolatos. We’re hiring at Aaptiv! To see what jobs we have available, check out our Careers page.