Designing and Developing your Android Apps for Blind Users (Part 2)

Alastair Craig
6 min read · Jun 26, 2019


As developers, what should we be doing to help optimise the experience for visually impaired users? What can we do to help improve the testing approach to create that experience?

Here I detail ten points to help you on your journey to making your app more accessible.

1. Use view.setContentDescription liberally

Here we have an issue with a screen that shows a masked phone number as part of a card activation flow.

Without a content description set, the screen reader will read the text literally.

By setting the content description of the view we can optimise the experience by getting the screen reader to announce something with more context.
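As a minimal sketch (the mask format and helper name are assumptions, not from the original app), the masked number can be turned into a clearer announcement before it is set on the view:

```java
// Hypothetical helper: turn a masked number such as "*** *** 4321" into a
// phrase that reads the visible digits out one by one.
static String maskedNumberContentDescription(String masked) {
    String visibleDigits = masked.replaceAll("[^0-9]", "");
    StringBuilder spoken = new StringBuilder("Number ending ");
    for (char digit : visibleDigits.toCharArray()) {
        spoken.append(digit).append(' ');
    }
    return spoken.toString().trim();
}

// Usage:
// phoneNumberView.setContentDescription(
//         maskedNumberContentDescription(phoneNumberView.getText().toString()));
```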

Here we see a 16 digit card number entry field.

The screen reader will read out the long number in groups of thousands.

In this example, because the text is dynamic, we set the content description on the view's node info in order to announce the updated text.

public static void readOutCardNumberDigitsIndividuallyAndInGroupsOfFour(final TextView textView) {

    textView.setAccessibilityDelegate(new View.AccessibilityDelegate() {

        @Override
        public void onInitializeAccessibilityNodeInfo(View host, AccessibilityNodeInfo info) {
            super.onInitializeAccessibilityNodeInfo(host, info);

            final String cardNumber = cardNumberContentDescription(textView.getText());

            // Clear the literal text and announce the formatted number instead.
            info.setText(null);
            info.setContentDescription(cardNumber);
        }
    });
}

The EditText can then be passed as the parameter of readOutCardNumberDigitsIndividuallyAndInGroupsOfFour(), since EditText extends TextView.
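The helper used in the snippet above isn't shown in the article; a plausible sketch (the exact grouping rule is an assumption based on the method name) is:

```java
// Hypothetical implementation of cardNumberContentDescription: spaces out each
// digit so the screen reader announces them individually, with a comma pause
// after every group of four.
static String cardNumberContentDescription(CharSequence text) {
    String digits = text.toString().replaceAll("[^0-9]", "");
    StringBuilder spoken = new StringBuilder();
    for (int i = 0; i < digits.length(); i++) {
        if (i > 0 && i % 4 == 0) {
            spoken.append(", ");
        } else if (i > 0) {
            spoken.append(' ');
        }
        spoken.append(digits.charAt(i));
    }
    return spoken.toString();
}
```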

2. Use native components where possible

Here we have an example of a TextView with a click action that opens another UI component.

But because it’s a TextView there won’t be a button announcement as feedback for the user. This can create an inconsistent user experience.

When focus lands on a Button view, the screen reader will always announce “button” after it has read out the text on the button.

Here we have an example of a custom radio button group that has not extended a radio button or group class.

Here there is no feedback telling the user whether an item is selected or not selected.

In this case, because the custom view was tightly coupled, we set the content description manually to announce whether the radio button is selected or not selected.
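A minimal sketch of that manual approach (the helper and label names are illustrative, not from the original app):

```java
// Build a description that includes the selection state for a custom radio item.
static String radioItemContentDescription(String label, boolean selected) {
    return label + ", " + (selected ? "selected" : "not selected");
}

// Usage, called whenever the custom view's state changes:
// itemView.setContentDescription(radioItemContentDescription("Monthly statement", isSelected));
```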

3. Think about how visual signifiers like colored text can be translated in the content description

Visually, you understand that red text is a signifier that something has gone wrong. But in order to convey this information to a blind user, the error state should be explicitly stated.

By setting the content description of the text view, we can prefix the word “Error:” so that a blind user will understand the context of the message.
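A sketch of that prefixing (the helper name is an assumption):

```java
// Prefix "Error: " so the screen reader conveys what the red text conveys visually.
static String errorContentDescription(CharSequence message) {
    return "Error: " + message;
}

// Usage:
// errorView.setContentDescription(errorContentDescription(errorView.getText()));
```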

4. Add “Heading” announcements to titles and subtitles

Something that was stressed to us by the RNIB (Royal National Institute of Blind People) UX researcher was that the word “Heading” should be suffixed onto every title and subtitle. This helps a visually impaired user build a mental map of the screen content.
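A lightweight way to do this (the helper name is illustrative) is to suffix the word onto the content description; note that on API 28+, and via AndroidX's ViewCompat.setAccessibilityHeading, a view can also be marked as a heading natively:

```java
// Suffix ", Heading" onto a title's text so the screen reader announces it
// as a heading.
static String headingContentDescription(CharSequence title) {
    return title + ", Heading";
}

// Usage:
// titleView.setContentDescription(headingContentDescription(titleView.getText()));
```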

5. Add announcements when there is an update on the UI that is not reflected in the spoken feedback

When the UI updates on screen, the user should be made aware of that through some form of feedback.

In this example, spoken feedback is switched on, but when the number is changed by scrolling, there is no announcement that the value in the center has changed.

Changes on the screen should be announced, usually via a listener.

Luckily the View class can call: announceForAccessibility(CharSequence text)

The device will immediately announce the CharSequence passed as the parameter. In this example, that is the new value of the number picker.
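As a sketch (the listener wiring is an assumption, but setOnValueChangedListener and announceForAccessibility are real framework APIs), announcing a NumberPicker change might look like:

```java
// Announce the newly selected value each time the picker changes.
numberPicker.setOnValueChangedListener((picker, oldVal, newVal) ->
        picker.announceForAccessibility(String.valueOf(newVal)));
```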

6. Check your RecyclerViews

If your RecyclerViews have multiple view types, you should test them with the spoken feedback turned on. Devices running Lollipop and Marshmallow operating systems can sometimes read out all the views of the same view type at once.

This problem can be fixed by setting android:importantForAccessibility="no" as an attribute on the RecyclerView in the layout XML file.
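The same attribute can also be set programmatically if the layout can't easily be changed (recyclerView here stands in for the affected list):

```java
// Equivalent of android:importantForAccessibility="no" in code: hides the
// container itself from accessibility services without hiding its children.
recyclerView.setImportantForAccessibility(View.IMPORTANT_FOR_ACCESSIBILITY_NO);
```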

7. Switch on the text output for spoken feedback

In the Voice Assistant or Talkback settings, there are developer options and from here you can toggle on the text output. This is a great way to help you test the app as you won’t have to have the sound on all the time — you also won’t annoy the colleagues sitting next to you.

(Screenshots: the developer settings screens on Samsung devices, and on Pixel and Nexus devices.)

8. Use an accessibility strings xml file

We found that as we added in more and more content descriptions for views that our main strings file started to swell, so we created a separate strings file to hold all the accessibility content for the spoken feedback. It’s made it easier to manage for us, and if you’re thinking about looking into accessibility for your app, I’d advise you to do the same.

9. Check your Analytics for how many people have the spoken feedback on

If you use analytics in your app, you should consider adding a metric to check how many of your customers are using Voice Assistant or TalkBack. In the example below, we send an analytics event based on the isSpokenFeedbackEnabled() method in the Application class.

private AccessibilityManager accessibilityManager;

public boolean isSpokenFeedbackEnabled() {
    return hasAccessibilityFeatureTypeEnabled(AccessibilityServiceInfo.FEEDBACK_SPOKEN);
}

private boolean hasAccessibilityFeatureTypeEnabled(int type) {
    List<AccessibilityServiceInfo> enabledServices =
            accessibilityManager.getEnabledAccessibilityServiceList(type);

    return enabledServices != null && !enabledServices.isEmpty();
}
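One way to wire this up (the analytics call is a placeholder, not a real API) is to resolve the manager in the Application's onCreate and log the result:

```java
@Override
public void onCreate() {
    super.onCreate();
    accessibilityManager =
            (AccessibilityManager) getSystemService(Context.ACCESSIBILITY_SERVICE);

    // Placeholder analytics call: substitute your own tracker here.
    analytics.logEvent("spoken_feedback_enabled",
            String.valueOf(isSpokenFeedbackEnabled()));
}
```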

10. Test the entire app and note down the bugs

Go through your app and try to note down and create bug tickets for every issue you find.

You will probably find quite a lot as we did.

And when you feel more confident navigating with Voice Assistant/TalkBack on, try to use the app without looking at the screen. This will really help you understand the UX requirements.

What’s next for us?

Many of our features are now 100% accessible and almost all Accessibility issues with the app that were presented to us by the RNIB have been fixed.

All new features are now designed to a set of guidelines. From the report given to us by the RNIB, we have produced a set of accessibility app guidelines as a standard to develop to.

Once the RNIB issues are fixed in both Android and iOS apps we can apply for accreditation. This involves getting the app tested by a group of visually impaired users and being approved by them.

Finally, lots of the issues we have covered in this article are currently being packaged up into a library we're developing. This is currently in the inner-sourcing stage, and we hope to open source it as soon as possible to help you improve the experience in your app for visually impaired users. I will announce when this is released on my Twitter account here:
