Developing and testing accessible apps in Flutter

Darshan Kawar · Flutter Community · Oct 1, 2019

Cover image source: https://www.gravenhurst.ca/en/services-and-info/accessibility.aspx

I recently worked on accessibility and implemented its missing pieces in the apps I am working on.

While doing so, I learned and observed quite a few things about accessibility in Flutter on both platforms, which I am going to share in today’s article.

Approach to accessibility during development

We all know how important accessibility is to any app: it gives users with special needs the opportunity to access and use the app and therefore helps reach a wider user base.

One good approach while testing accessibility for any app, especially for visually impaired users, is to turn ON the screen reader, close your eyes, and start navigating through the app. This way, the app exposes the small things developers may not have thought of, such as labels or tooltips they missed adding to widgets while implementing accessibility.

I used the same approach to get an idea of how Flutter handles the following aspects:

  • Accessibility on both platforms
  • The ways to implement it
  • How to use those ways
  • Where the apps under test stand from an accessibility perspective
  • Testing

Accessibility in Flutter

Flutter gives developers a jumpstart by identifying and reading most of the widgets on screen as is. Meaning, if a non-text widget doesn’t have a label, tooltip, or text property specified, Flutter reads that widget based only on its type. For instance, an image would be read as {image} and an IconButton would be read as {button}.

For an end user with visual limitations, this would not be enough, because they would not know what kind of image is on screen or what type of button it is. They would expect the app to read something like {Company logo image} instead of {image}, or {edit button} instead of {button}.

This is where developers need to add the required properties to the corresponding widgets and use the various accessibility widgets, so that widgets are read as expected and users get a smooth, seamless experience navigating the app.

Let’s now see how accessibility in Flutter works by default with a demo app.

Demo app

As seen above, we have a login screen with an image at the top, followed by two TextFormFields with hintText, a password obscure icon, a Switch widget, and a RaisedButton.
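
Since the screenshot isn’t reproduced here, a rough sketch of this screen before any accessibility work (the asset path, hint texts, and button text are assumptions) could look like this:

import 'package:flutter/material.dart';

// Rough sketch of the demo login screen before adding any accessibility
// properties. Names and values are illustrative.
class LoginScreen extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    return Scaffold(
      body: Padding(
        padding: const EdgeInsets.all(16),
        child: Column(
          mainAxisAlignment: MainAxisAlignment.center,
          children: <Widget>[
            Image.asset('assets/images.png', height: 100, width: 100),
            TextFormField(
              decoration: InputDecoration(hintText: 'Username'),
            ),
            TextFormField(
              obscureText: true,
              decoration: InputDecoration(
                hintText: 'Password',
                suffixIcon: IconButton(
                  icon: Icon(Icons.remove_red_eye),
                  onPressed: () {},
                ),
              ),
            ),
            Switch(value: false, onChanged: (value) {}),
            RaisedButton(
              child: Text('Login'),
              onPressed: () {},
            ),
          ],
        ),
      ),
    );
  }
}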

Now, if we turn ON the screen readers on both platforms and navigate through the widgets on screen, we notice the following behavior:

Note: I used an Android Nexus 5X (8.0) and an iPhone 6s (11.2.1) for testing.

As we see from the table, some of the widgets are not read as expected (highlighted in blue). Let’s now see how we can make use of accessibility properties to fix the above behavior.

Starting with the image at the top: since I used Image.asset, it provides a property named semanticLabel which, as the name suggests, is specifically meant for accessibility. Here we give the image an appropriate label so that the screen readers can read it properly.

Image.asset('assets/images.png', height: 100, width: 100, semanticLabel: 'Company logo'),

Coming to the TextFormFields with hintText: as the table above shows, the screen readers don’t read the hintText of the first TextFormField when we select that widget (highlighted in red).

This seems to be an issue with the accessibility framework at the moment and I’ve raised an issue.

As a workaround, if we use labelText instead of hintText, the screen readers read the first TextFormField properly as {username editbox}, but the password TextFormField is read as {Password password editbox}.

One might wonder why the word password is read twice. This is because the screen reader first reads the labelText Password, followed by password editbox (the field type).
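
The workaround itself is just a change in the InputDecoration; a short sketch of the two fields (field names assumed from the demo) looks like this:

// labelText instead of hintText so the screen readers announce the fields.
TextFormField(
  decoration: InputDecoration(labelText: 'Username'),
),
TextFormField(
  obscureText: true,
  decoration: InputDecoration(labelText: 'Password'),
),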

Next, we can use semanticLabel for the red_eye icon, so that screen readers read it as expected.

suffixIcon: IconButton(
  icon: Icon(Icons.remove_red_eye, semanticLabel: 'Password obscure'),
  onPressed: () {},
)

By using the semanticLabel property, we added the missing accessibility elements to the screen. Now we can re-run the screen readers and observe that all the widgets that were highlighted in blue are read properly.

Let’s now see some of the use-cases to understand how we can make use of the accessibility widgets provided by Flutter.

Semantics:

  1. In the above login screen, if we want to get the image from the network instead of using Image.asset, we would use CachedNetworkImage. This widget doesn’t have a semanticLabel property, so by default the screen readers will read it as {unlabeled image} / {image}.

To fix this, we wrap the widget with the Semantics widget and use its label property. Let’s see how:

Semantics(
  label: 'Company logo',
  child: CachedNetworkImage(
    imageUrl: 'https://picsum.photos/250?image=9',
    height: 100,
    width: 100,
  ),
),

2. Another instance where we can leverage the Semantics widget: if an app has a custom search text field implemented using autoCompleteTextField or any other plugin, by default the screen readers won’t read this widget.

In this case, we can simply wrap the field with Semantics and provide a label value so that the screen reader is able to read it.
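
A minimal sketch, where MyCustomSearchField is just a placeholder for whatever widget the plugin provides:

Semantics(
  label: 'Search',
  textField: true, // expose it to screen readers as a text field
  child: MyCustomSearchField(), // placeholder for the plugin's widget
),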

MergeSemantics:

Let’s say there’s an account details section in the app that displays the user’s name, email address, and company, i.e., static data that doesn’t change over time, like below:

Here, by default, the user has to tap on every widget to know what it is, but wouldn’t it be more helpful if the entire section were read to the user when they select it? Enter MergeSemantics.

We can wrap the Column with this widget, and all the static data will be read one after another in a single tap.

MergeSemantics(
  child: Column(
    mainAxisAlignment: MainAxisAlignment.spaceEvenly,
    children: <Widget>[
      ListTile(
        leading: Icon(Icons.account_circle, semanticLabel: 'name'),
        title: Text("John Doe", style: TextStyle(color: Colors.blue)),
        onTap: () {},
      ),
      ListTile(
        leading: Icon(Icons.email, semanticLabel: 'email'),
        title: Text("johndoe@test.com", style: TextStyle(color: Colors.blue)),
        onTap: () {},
      ),
      ListTile(
        leading: Icon(Icons.business, semanticLabel: 'company name'),
        title: Text("ABC Inc.", style: TextStyle(color: Colors.blue)),
        onTap: () {},
      ),
    ],
  ),
)

The screen reader will now read the section in one go as:

{name: John doe | email: johndoe@test.com | company name: ABC Inc.}

ExcludeSemantics:

Similarly, if we don’t want a particular element on screen to be read to the user, we can wrap that widget with ExcludeSemantics as below:

ExcludeSemantics(
  child: Text("Won't read this"),
),

Below are a couple of widgets that are frequently used in apps, along with their behavior in Flutter from an accessibility point of view.

Toast message:

A toast message is read on Android by default, i.e., when the toast message is triggered, TalkBack reads the message as it appears on the screen. We don’t need to explicitly select/tap on the toast message for TalkBack to read it.

But on iOS, we need to explicitly select the message by tapping on it when it appears so that VoiceOver can read it.

SnackBar:

The behavior is the same as for toast messages on both platforms.
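
If that iOS behavior isn’t acceptable, one option (a sketch, not something used in the demo app) is to explicitly announce the message to the screen reader via SemanticsService.announce at the moment the SnackBar or toast is shown:

import 'package:flutter/material.dart';
import 'package:flutter/semantics.dart';

// Show a SnackBar and ask the platform's screen reader to speak the
// message right away, instead of waiting for the user to focus it.
void showAccessibleSnackBar(BuildContext context, String message) {
  Scaffold.of(context).showSnackBar(SnackBar(content: Text(message)));
  SemanticsService.announce(message, TextDirection.ltr);
}

On newer Flutter versions, ScaffoldMessenger.of(context).showSnackBar replaces the Scaffold.of call, but the announce call stays the same.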

Testing accessibility in Flutter

Testing accessibility in the app is as important and critical as testing regular features and functionality before releasing it to production.

If we want to test accessibility from an integration-testing point of view using Flutter Driver, we can do so by making use of the following methods:

setSemantics:

This is the first step if we are to test the accessibility of our app under test: we need to enable semantics in order to use the methods listed below. To enable semantics, we call this method inside setUpAll(), which connects to the Flutter driver before running any tests, as shown below:

setUpAll(() async {
  driver = await FlutterDriver.connect();
  await driver.setSemantics(true);
});

bySemanticsLabel:

Like byText, byType, and byValueKey, which are used to uniquely identify elements on the screen, Flutter Driver provides bySemanticsLabel which, as the name suggests, is used to uniquely identify elements that have a semantic label defined. In our demo app, since we used semanticLabel for the image, we declare a SerializableFinder as below:

final imageLabel = find.bySemanticsLabel('Company logo');

getSemanticsId:

We use this method to assert that the element found by the imageLabel finder has a semantics node, by using isNotNull as the matcher:

expect(await driver.getSemanticsId(imageLabel), isNotNull);

getSemanticsId() retrieves the semantics node id for the element returned by the finder. If we want to retrieve the node id of the image, we can do so as below:

int id = await driver.getSemanticsId(imageLabel);
print(id);

A complete test that identifies the image by its semantic label and then taps on it is shown below, along with the test result:

test('validate image accessibility', () async {
  final imageLabel = find.bySemanticsLabel('Company logo');
  int id = await driver.getSemanticsId(imageLabel);
  print(id);
  expect(await driver.getSemanticsId(imageLabel), isNotNull);
  await driver.tap(imageLabel);
  print('tapped on image');
});

Like this, we can write comprehensive integration tests to validate the accessibility of the app under test, which helps not only to maximize test coverage but also to build a solid, robust app.
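
Widget tests offer similar hooks. As a minimal sketch (the login screen is reduced here to just a labeled logo placeholder), flutter_test also provides find.bySemanticsLabel once semantics is enabled with ensureSemantics():

import 'package:flutter/material.dart';
import 'package:flutter_test/flutter_test.dart';

void main() {
  testWidgets('logo exposes its semantic label', (WidgetTester tester) async {
    // Semantics is off by default in widget tests; enable it for this test.
    final handle = tester.ensureSemantics();

    // Stand-in for the real login screen: just a labeled logo placeholder.
    await tester.pumpWidget(
      MaterialApp(
        home: Semantics(
          container: true,
          label: 'Company logo',
          child: const SizedBox(width: 100, height: 100),
        ),
      ),
    );

    // Widget-test counterpart of flutter_driver's bySemanticsLabel.
    expect(find.bySemanticsLabel('Company logo'), findsOneWidget);

    handle.dispose();
  });
}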

With the accessibility widgets and semantic properties Flutter provides, implementing the remaining pieces of accessibility to give users with special needs a smooth experience becomes easy for developers. Testing support in the form of widget tests and integration tests makes the entire effort of building an accessible app seamless.

Implementing accessibility during the development process is a good practice that helps an app stand out from other apps, and we, as developers, should strive to make our apps inclusive. As we saw above, it’s not too difficult because Flutter provides solid support for it.

That’s all for now. I hope you liked what you read here. Thanks for reading and feel free to comment below your thoughts or any suggestions/feedback on this article.

I am available on Twitter, LinkedIn, and GitHub.

