
Introducing React Native Accessibility Engine

Aryella Lacerda
Feb 28

Let’s first start by clarifying what accessibility even is. Despite today’s astounding access to information, it’s still a widely misdefined concept. That’s not all down to ignorance on our part.

The definition of accessibility has changed — often, for different reasons and in subtle ways — as we’ve undergone a slow but steady cultural shift in the way we look at disability. These cultural shifts affect every part of the world that a person with a disability might interact with, which definitely includes the products we create as mobile developers.

And we know that being a great developer is about more than just razor-sharp programming skills. It’s also about taking responsibility for the quality of our products, taking initiative for our learning, and empowering others to do the same.

So let’s begin.

The definition of accessibility

This is what the New Oxford American Dictionary had to tell me about accessibility:

the quality of being able to be reached or entered;
the quality of being easy to obtain or use;
the quality of being easily understood or appreciated;
the quality of being easily reached, entered, or used by people who have a disability;

Most people probably still think of the fourth and last definition. Unfortunately, it’s also the most restrictive and outdated one. MDN Web Docs explains the problem:

Accessibility is the practice of making your websites usable by as many people as possible. We traditionally think of this as being about people with disabilities, but the practice of making sites accessible also benefits other groups such as those using mobile devices, or those with slow network connections.

Even though MDN’s definition is web-oriented, it’s easily ported to mobile. Accessibility is about more than making your app accessible to people with visual disabilities. It’s also about making it equally accessible to people who can’t see color, or who are holding a young child in one arm, or who live in places with poor internet, or who suffer from motor issues, or who never had the opportunity to get an education.

The state of accessibility in React Native

React Native’s accessibility API got a facelift in 2018. There’s a very interesting article in which one of the engineers on the team explains the problems with the previous API and everything that was done to mitigate them. Today there’s a dedicated API, and it works well on both iOS and Android. We’ll see more about that later.

First, let’s get a bird’s-eye view of the types of tools available at various stages of the development process for building accessible React Native apps.

Analysis tools

Analysis tools are generally used toward the end of the development process. Once you have something that can be rendered on a simulator/emulator/device, analysis tools will do as their name implies: they’ll analyze the elements on screen for accessibility failures and return warnings or errors where appropriate.

On iOS, you can use the Accessibility Inspector. It ships with macOS and can be accessed easily via Xcode or the macOS Spotlight. Here’s a nice guide on using and interpreting the Inspector’s results.

On Android, you can use Accessibility Scanner. To use it with an emulator, download the Accessibility Scanner APK and then drag and drop it onto your emulator. Afterward, you can take a look at its getting started guide.


Development tools

ESLint’s a11y plugin is fairly well-known and available for React Native. The power of code linting is limited, however. The plugin can tell you that the accessibilityState prop takes an object, but not when the prop should be used. So while using it is highly recommended, I can’t say that it’s enough to make an app accessible.

In terms of other early-development tools, however, React Native’s otherwise mind-boggling ecosystem seems to come up short. There are plenty of tools for React Web, but few of them have been ported to React Native.

Among the tools available for React Web, you’ll find Axe, an open-source accessibility testing engine. You can use Axe to run assertions during automated testing. If a newly rendered component has some accessibility issue, the test fails and returns a report. It can be easily used with Jest or other test runners as part of your regular code quality pipeline.

An Axe test looks like this:
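(The original code embed didn’t survive, but a typical Axe assertion, written with the jest-axe bindings for Jest, looks something like this — the rendered `button` element is just a placeholder:)

```jsx
import React from 'react';
import { render } from '@testing-library/react';
import { axe, toHaveNoViolations } from 'jest-axe';

expect.extend(toHaveNoViolations);

it('has no accessibility violations', async () => {
  // Render any React (Web) component into a DOM container...
  const { container } = render(<button aria-label="Like">♥</button>);

  // ...and let Axe audit the resulting HTML.
  expect(await axe(container)).toHaveNoViolations();
});
```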

However, Axe only supports HTML-based languages, which unfortunately excludes React Native. 😕

The company that maintains Axe, Deque, offers similar services for native iOS and Android apps, which might theoretically work for React Native, except they’re separate paid services and you’ll probably need some knowledge of native development to make them useful. Not every React Native developer relishes the thought of crossing over into native territory.

There’s also a Storybook add-on built on top of Axe. If you use Storybook for React Web, all you have to do is install the add-on and receive a detailed report of your components’ accessibility issues, right in the Storybook panel.

Since the plugin is built on top of Axe, though, it’s not easily ported to React Native. I’ve looked, but so far haven’t found a React-Native-compatible replacement for Axe. If you know of any, please let me know!

React Web is also fortunate in one other regard: web accessibility’s maturity. The Web Content Accessibility Guidelines (WCAG) have been around since 1999. (Do you remember what computers were like in 1999?) That’s when the W3C released WCAG 1.0. WCAG 2.0 was released in 2008 and succeeded by WCAG 2.1 in 2018.

That’s 22 years of testing and learning and improving, only a few of which overlap with the birth of smartphones, mobile-first websites, and mobile apps.

The first public working draft of the WCAG 3.0 was released on the W3C website on January 21, 2021. If you are an evaluator, developer, designer, project manager, policy maker, person with disabilities, or any other interested party — take a look! If you can, contribute. ♥️

This means that it’s harder to find good, updated, trustworthy content from which to learn mobile accessibility. If you Google react native accessibility, you’ll find the official documentation and a handful of interesting guides explaining the various accessibility props. Some still use the pre-2018 API and most of the others simply lack context and practical examples.

It’s perfectly fine to know what a prop does but how am I to know when and how to use it? 🤔

Let’s take a look at the accessibility API through an example hopefully full of whens and hows.

A brief exploration of the accessibility API via a practical example

Let’s consider one of the most basic building blocks of any app: the button.

What we have here is just a button implemented as a functional component. If we run an audit on this beauty using macOS’s Accessibility Inspector, this is what we get:
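(The original embed is missing; here’s a minimal reconstruction of the kind of component being audited. The icon asset and exact styles are assumptions, but the 30pt x 30pt size matches the Inspector’s warning:)

```jsx
import React from 'react';
import { TouchableOpacity, Image, StyleSheet } from 'react-native';

// Hypothetical icon asset; any heart/like icon would do.
import likeIcon from './assets/like.png';

const LikeButton = ({ onPress }) => (
  <TouchableOpacity style={styles.button} onPress={onPress}>
    <Image source={likeIcon} style={styles.icon} />
  </TouchableOpacity>
);

const styles = StyleSheet.create({
  button: {
    width: 30, // the 30pt x 30pt hit area the Inspector flags
    height: 30,
    borderRadius: 15,
    backgroundColor: 'red',
    alignItems: 'center',
    justifyContent: 'center',
  },
  icon: {
    width: 16,
    height: 16,
    tintColor: 'white',
  },
});

export default LikeButton;
```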

Set an appropriate size for interactive components

Let’s consider the first warning: hit area is too small.

The inspector helpfully tells us the current size of the TouchableOpacity view: 30pt x 30pt. This confuses some people. 30pt x 30pt may seem like a perfectly fine size for a circular button! Indeed, many a floating button or tab bar button is about this size or smaller. If this is your experience, try using an app one-handed while holding a squirming infant. Or try using an app inside a car moving over rough terrain (provided you aren’t the driver, of course).

From the iOS Accessibility Guidelines:

Give all controls and interactive elements a hit target that measures at least 44x44 pt. People with limited mobility need larger hit targets to help them interact with your app. Controls that are too small can be frustratingly difficult for all users to hit.

From the Android Accessibility Guidelines:

Your app’s UI is easier to use if it contains controls that are easier to see and tap. We recommend that each interactive UI element have a focusable area, or touch target size, of at least 48dp x 48dp. Larger is even better.

A button’s touch area or touch target size is the area within which a user’s touch will be recognized. Usually, this area corresponds to the button’s boundaries (in our case, anywhere within the red circle) but we can also extend the touch area beyond the component’s boundaries.

#1: Increasing the size

This is the simple solution. Increasing the button’s size will automatically increase its touch area. Here, we set the height and width to 48pt in order to stay within both platforms’ guidelines.
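(In code, that’s just a style tweak; the surrounding style properties shown here are illustrative:)

```jsx
import { StyleSheet } from 'react-native';

const styles = StyleSheet.create({
  button: {
    width: 48,  // was 30; meets iOS's 44pt minimum and Android's 48dp recommendation
    height: 48,
    borderRadius: 24,
    backgroundColor: 'red',
    alignItems: 'center',
    justifyContent: 'center',
  },
});
```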

Now, when we rerun the audit from the Accessibility Inspector, we see the warning has disappeared:

#2: Increasing the touch area

The second solution involves extending the button’s touch area beyond its borders. That means that touches that land slightly outside the button’s borders will still be recognized. You can control how far this area extends using the hitSlop prop.

There are two things we should remember here:

the touch area never extends past the parent view’s bounds;
if a touch lands on two overlapping sibling views, the one with the higher Z-index takes precedence.
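hitSlop takes per-edge offsets in points. A tiny helper (hypothetical, written only to illustrate the arithmetic) shows how slop combines with a component’s size:

```javascript
// Hypothetical helper: computes the effective touch target of a component,
// given its size and a React Native-style hitSlop object
// ({ top, bottom, left, right }, all optional).
function effectiveTouchTarget(size, hitSlop = {}) {
  const { top = 0, bottom = 0, left = 0, right = 0 } = hitSlop;
  return {
    width: size.width + left + right,
    height: size.height + top + bottom,
  };
}

// A 30pt x 30pt button with 9pt of slop on every side reaches 48pt x 48pt:
const target = effectiveTouchTarget(
  { width: 30, height: 30 },
  { top: 9, bottom: 9, left: 9, right: 9 }
);
console.log(target); // { width: 48, height: 48 }
```

On the component itself, that’s just `hitSlop={{ top: 9, bottom: 9, left: 9, right: 9 }}` on the `TouchableOpacity`.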

Now, when we rerun the audit from the Accessibility Inspector, we get this:

The warning hasn’t disappeared! 😰

That’s because the inspector doesn’t take the hitSlop into account when auditing components. And even though we can successfully press the button even when our touch lands slightly outside the red area, we don’t visually know this is possible until we try it. For these reasons, prefer increasing the button’s size instead of extending its touch area whenever possible.

Set a label for all non-text elements

Let’s move on to the last warning: element has no description.

What this means is that, when the screen reader reaches the button, it’ll inform its user: “Description for element unavailable.”

Extremely helpful, that.

This is what’s happening behind the scenes: the accessibility API is looking for some label with which to describe this element. When a button contains text, then that text implicitly becomes its label.

So what happens when the button is an icon button like this one, something that’s very common in today’s mobile designs? In this situation, we must add an accessibility label manually!

According to the iOS Accessibility Programming Guide, an accessibility label is a short, localized word or phrase that succinctly describes the element, without identifying the element’s type.

Let’s adjust our code accordingly:
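(A sketch of the adjustment; only the new prop matters here, and the `styles`/`likeIcon` names are illustrative:)

```jsx
<TouchableOpacity
  style={styles.button}
  onPress={onPress}
  // Read aloud by the screen reader in place of a text child.
  accessibilityLabel="Like"
>
  <Image source={likeIcon} style={styles.icon} />
</TouchableOpacity>
```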

When we run another audit, this is what we get:

Hooray! 🎉 Our work is done… right?

Not quite.

When the screen reader gets to this button, it will inform its user: “Like”. That’s it. No context. Is “like” static text, an icon, a button? Let’s assume, for the sake of argument, that the user somehow guesses this component is a button. Is it clear what the user is liking? If the button is disabled, is the user going to be informed?

Let’s address each of these problems.

Set the role of the component

This allows the user to understand how to interact with a given component. They’ll know that they can touch buttons, adjust sliders, switch tabs, press links, toggle switches, check checkboxes, etc. The full list of available roles can be found in React Native’s Accessibility API docs.

In our case, we should add an accessibilityRole prop to our button.
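(Again, a sketch with illustrative `styles`/`likeIcon` names:)

```jsx
<TouchableOpacity
  style={styles.button}
  onPress={onPress}
  accessibilityLabel="Like"
  // Tells assistive technology how this element can be interacted with.
  accessibilityRole="button"
>
  <Image source={likeIcon} style={styles.icon} />
</TouchableOpacity>
```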

Now, the screen reader will inform its user: “Like, Button”!

When the result of an interaction isn’t clear enough, add a hint

Obviously, our button component doesn’t actually do anything because it’s an example. Generally, when we have interactive components, their placement and labels provide enough context to allow us to understand what they do.

(If not, this accessibility problem might actually be a design problem. 👀)

But in case clarification is required, we can provide the user with a hint via the accessibilityHint prop.

According to the iOS Accessibility Programming Guide, a hint is a brief, localized phrase that describes the result of performing an action on an element.

ESLint’s a11y plugin recommends that you add an accessibilityHint whenever you add an accessibilityLabel. However, the iOS APG suggests that you only use accessibilityHint when absolutely necessary.

We can edit our code accordingly:
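(A sketch with illustrative `styles`/`likeIcon` names:)

```jsx
<TouchableOpacity
  style={styles.button}
  onPress={onPress}
  accessibilityLabel="Like"
  accessibilityRole="button"
  // Describes the result of the action; read last, and only if
  // the user hasn't disabled hint-reading.
  accessibilityHint="Likes the song"
>
  <Image source={likeIcon} style={styles.icon} />
</TouchableOpacity>
```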

Now, the screen reader will inform its user: “Like, Button… Likes the song”.

For reasons unknown, the Accessibility Inspector won’t read out the hint, but the device will. It’s also important to note that users can disable hint-reading in their screen reader settings, so as much as possible, try to rely instead on a clear and accessible design.

Expose interaction state to the user

But what if our button is disabled? What about other components that can be selected, checked, expanded, or that might be temporarily busy? These are all examples of interaction state, which aren’t automatically exposed to the screen reader.

For example, setting the disabled prop on our button doesn’t automatically prompt the accessibility API to recognize this state (I was confused, too). We have to add it manually via the accessibilityState prop, which takes an object with a few different keys, one of which is disabled.

You can check out all of the available states in the official documentation.
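(Putting it together, a sketch of the full button with its state exposed; the `styles`/`likeIcon` names are illustrative:)

```jsx
const LikeButton = ({ onPress, disabled = false }) => (
  <TouchableOpacity
    style={styles.button}
    onPress={onPress}
    disabled={disabled}
    accessibilityLabel="Like"
    accessibilityRole="button"
    accessibilityHint="Likes the song"
    // The disabled prop alone isn't announced; mirror it here explicitly.
    accessibilityState={{ disabled }}
  >
    <Image source={likeIcon} style={styles.icon} />
  </TouchableOpacity>
);
```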

Now, when the button is disabled, the screen reader will inform its user: “Like, Button, Not Enabled… Likes the song”.

Wrapping it up

This is in no way a comprehensive guide to mobile accessibility. In fact, it’s not even a comprehensive guide to making the button component accessible: we skipped over concerns like color contrast and what the accessible prop does. But hopefully, it was enough to illustrate that mobile accessibility isn’t something we can learn in an afternoon or from a single guide.

Wouldn’t it be great if we had a tool that could point out accessibility concerns as we develop our components? Something that would allow us to learn about accessibility in steps instead of all at once? Something that could tell us the best practices and offer resources when we have questions?

Introducing the React Native Accessibility Engine

That’s the goal of React Native Accessibility Engine.

In short, it provides an engine capable of traversing a component tree and making accessibility-related assertions as it goes. If it finds something that needs correcting, it throws an error which can then be caught during the early development phase. It should fit easily into our existing testing pipelines and do the heavy lifting for us, in terms of making our apps more accessible.

Here’s a taste of what it can do:
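(The embedded example didn’t survive, but a Jest test might look roughly like this. The `check` call reflects the library’s early API, which may have changed, so consult the project’s README for the current one; `LikeButton` is a placeholder component:)

```jsx
import React from 'react';
import AccessibilityEngine from 'react-native-accessibility-engine';

import LikeButton from './LikeButton';

it('passes the accessibility engine checks', () => {
  // check() traverses the rendered component tree and throws a
  // descriptive error if any accessibility rule is violated.
  expect(() => AccessibilityEngine.check(<LikeButton />)).not.toThrow();
});
```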

If you’ve got any questions or tips, leave a comment below. I certainly don’t know everything there is to know about accessibility, so if you’re an evaluator, developer, designer, project manager, policy maker, person with disabilities, or any other interested party — take a look! And if you can, contribute. ♥️

References

React Native 2018 Accessibility Updates
Axe: Accessibility testing engine
Deque Labs
Real-time Accessibility Testing with Storybook
iOS Accessibility Guidelines
Android Accessibility Guidelines
ESLint React Native A11y plugin
React Native Accessibility Engine

React Brasil

Everything about the React world
