Introduction to Accessibility on iOS

Sonnie Hiles
Wise Engineering
Apr 29, 2021

I’m a firm believer that apps should be designed and developed inclusively, with a fantastic user experience everyone can enjoy. An accessible product empowers users with disabilities, giving them independence where they’d otherwise need help.

Apple offers a wide range of assistive technologies that apps can harness within iOS, all of which fall into four categories: vision, hearing, mobility and cognition.

We usually think of users who are permanently disabled when discussing these assistive technologies. However, in their Inclusive Design Toolkit, Microsoft highlights that disability can also be either temporary or situational. They give the following examples where this could be the case for mobility:

  • You may temporarily lose the use of your dominant hand if you break your arm and need to wear a cast.
  • A new parent might not have the use of both their hands while holding and soothing their newborn.

Therefore, when apps are designed to work for someone with a permanent disability, the benefits extend to temporary and situational users as well. For example, it’s estimated that in the United States, 26,000 people a year suffer the loss of an upper extremity. That’s a lot of people, but when you also include temporary and situational disabilities, that number balloons to 21 million!

In many organisations, accessibility can become an afterthought, but this shows just how impactful first-class support for these technologies can be.

In this introduction to accessibility, I’ll focus mainly on vision, as vision-related features are the most popular assistive technologies on iOS. I’ll also touch on how these same technologies can benefit users with mobility and cognitive disabilities.

Below is an approximation of how users with differing levels of eyesight might experience an app. There are four different categories of user:

  • Full Vision — This is how most users will experience the app; they’re able to read the content exactly as is.
  • Partial Vision — The app’s structure is clear. The content is almost readable but extremely blurry, creating a partly usable but poor user experience.
  • Low Vision — Users can make out the high-level structure of the app but wouldn’t be able to use the app as the content isn’t visible.
  • No Vision — Users can’t see the app, making the visual interface meaningless to them.

Dynamic Type

Dynamic Type is an assistive technology that allows users to choose their preferred font size. It provides a wide range of values, from smaller than the default to huge accessibility sizes. When the font size changes, views grow or shrink dynamically to accommodate the text, sometimes with alternative layouts. Below is an example of the default font size next to the XXL accessibility size.
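Supporting it mostly comes down to using text styles rather than fixed point sizes. Here’s a minimal UIKit sketch; the label and style are illustrative, not Wise’s actual code:

```swift
import UIKit

// A minimal sketch of Dynamic Type support in UIKit; the label
// is illustrative, not Wise's actual code.
let titleLabel = UILabel()

// Use a text style instead of a fixed point size, so the system
// scales the font to the user's preferred size.
titleLabel.font = UIFont.preferredFont(forTextStyle: .body)

// Update the font automatically if the user changes their
// preferred size while the app is running.
titleLabel.adjustsFontForContentSizeCategory = true

// Let the text wrap onto extra lines at larger sizes.
titleLabel.numberOfLines = 0
```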

It’s by far the most popular assistive technology on iOS: 28% of customers of the Wise app have a non-standard font size set. That’s huge 🤯!

Dynamic Type has other benefits: it’s also commonly used by users who struggle with precise gestures. When the font is scaled up, so are the on-screen elements, which has the side effect of increasing the size of every element’s touch target. In the screenshots above, compare the size of the “Add” button at the default size with the XXL font size. If you had a tremor, for example, it’d be much easier to hit the latter! Avoiding accidental taps reduces frustration for these users, which is a massive improvement to the usability of an app.

If you’re interested in testing Dynamic Type for yourself, Apple has provided a fantastic guide to adding a control that lets you change your font size straight from Control Center here.
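If you’d rather stay in Xcode, you can also pin a SwiftUI preview to a specific size. A minimal sketch, where BalanceView is a hypothetical screen of my own invention:

```swift
import SwiftUI

// A hypothetical screen, used only to demonstrate the override.
struct BalanceView: View {
    var body: some View {
        Text("Add money")
            .font(.body) // text styles scale with Dynamic Type
    }
}

// Pin the preview to the XXL accessibility size, so layouts can
// be checked without touching the device's settings.
struct BalanceView_Previews: PreviewProvider {
    static var previews: some View {
        BalanceView()
            .environment(\.sizeCategory, .accessibilityExtraExtraLarge)
    }
}
```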

Coming back to our four categories of vision, with the same approximations, applying Dynamic Type means both full and partial vision users can now use the app 🎉. That still leaves low and no vision users unable to use the app. That’s where our second assistive technology comes in!

VoiceOver

VoiceOver is a gesture-driven screen reader that tells the customer exactly what’s happening on their device. Each UI element is given a label, a value (where appropriate) and a type by the developer. The user can then navigate the screen, usually by swiping right to move to the next element or left to the previous one. A speech synthesiser vocalises each element using the values set by the developer. When the user wants to interact with an element, they can double-tap to activate it or swipe up or down to adjust its value.
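In UIKit, these descriptions map onto the accessibility properties every view exposes. A minimal sketch, with an illustrative button and strings rather than Wise’s actual code:

```swift
import UIKit

// A minimal sketch of describing an element to VoiceOver; the
// button and strings are illustrative, not Wise's actual code.
let balanceButton = UIButton(type: .system)
balanceButton.isAccessibilityElement = true

// Label: what the element is.
balanceButton.accessibilityLabel = "British Pound balance"

// Value: the element's current state, where one makes sense.
balanceButton.accessibilityValue = "120 pounds"

// Traits: the element's type, so VoiceOver announces "button"
// and the user knows a double-tap will activate it.
balanceButton.accessibilityTraits = .button
```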

If you’ve never experienced this before, it might sound a little abstract. In reality, it’s very similar to tabbing through elements on a website and then interacting with them using the enter key. This simple gesture-driven interface might initially sound quite limiting, but that’s not the case. With good developer support, everything a visual user can do can also be done with VoiceOver 👏🏻. Here’s a link to a demo by Apple.

Similarly to Dynamic Type, good support for this assistive technology has additional benefits for users. The labels, values and types that power VoiceOver also enable other assistive technologies. Here’s a brief description of a few of them:

  1. Speak Screen — An alternative screen reader focused on users with cognitive disabilities who struggle with reading. It uses the same labels and can either read the whole screen top to bottom or a specific element on the screen like a large text block. In the example below (left) it’d read the content above the two buttons.
  2. Voice Control — Allows users with limited mobility to control an app using voice commands. In one of its modes, it uses the labels to display names over each interactive element. A user can then say the label to interact with it. In the example below (middle), a user could say “tap British Pound balance” to open their balance. (There’s a sketch of tuning these spoken labels just after this list.)
  3. Switch Control — Allows control of the phone via an external switch for users with extremely limited mobility. It works by cycling through high-level sections; when the user taps, it focuses in and cycles through the details. These switches are usually mounted to a wheelchair and may be activated by head taps, tongue clicks, or even breathing into a straw. With a single switch, a user can do everything a visual user can. Incredible, right? 🤯 In the example below (right), the user could tap to select a manual bank transfer.
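Since Voice Control reads the same labels, developers can also supply shorter spoken alternatives. A minimal sketch using UIKit’s accessibilityUserInputLabels (iOS 13+); the button and strings are illustrative:

```swift
import UIKit

// A minimal sketch of supplying spoken alternatives for Voice
// Control (iOS 13+); the button and strings are illustrative.
let addButton = UIButton(type: .system)
addButton.accessibilityLabel = "Add money to British Pound balance"

// Shorter phrases the user can say instead of the full label;
// the first entry is what the Voice Control overlay displays.
addButton.accessibilityUserInputLabels = ["Add money", "Add"]
```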

If you’re interested in testing VoiceOver for yourself, simply ask Siri to ‘Enable VoiceOver’. Navigating your phone is completely different with VoiceOver enabled, so I’d highly recommend reading Apple’s guide here first. Remember, if you get stuck, you can always ask Siri to ‘Disable VoiceOver’!

Now, when we review our four categories of vision and include both Dynamic Type and VoiceOver, all users can use the app 🚀. When developers support these assistive technologies, they don’t only help those with partial, low and no vision. Thanks to Apple’s fantastic accessibility team, they also empower users of the assistive technologies that build on those same values, like Speak Screen, Voice Control and Switch Control.

Accessibility can feel daunting, but it shouldn’t be. With minimal additional work, developers and designers can use these tools to build inclusive products that all customers will love.

If you’re interested in learning more about iOS accessibility, here are some excellent links:

P.S. Interested in joining us? We’re hiring. Check out our open Engineering roles.
