What Designers Can Learn From Google About Accessibility
This year at I/O17, Google’s Astrid Weber issued a challenge to teams:
“Think creatively about how to find people to test with. Try and find one
person with accessibility needs in your next research study and see what
you learn.”
Astrid is a lead UX researcher at Google, and as a UX designer myself, I found her call to arms to be one of the most exciting moments of this year’s conference.
One argument against adding accessibility features (yes, they really exist) that I’ve heard recently goes: “Disabled people only make up 10% of the population, and with such a wide variety of conditions requiring different accommodations, how do you pick which disabilities to design for? And what’s the point? Isn’t that just designing for edge cases?”
Well no, my straw-manning friend! You need to remember the #1 tenet of accessible design: building accessibility into your design helps everyone.
If I design for someone with a motor disorder who can’t hold their phone, all users benefit from improved hands-free functionality. If I select a contrast, colour palette and font size that assist someone with low vision, that’s also going to help you read your screen in bright sunlight. Everyone wins.
It’s a safe bet that at some point in your life, whether you have broken an arm, are recovering from an operation, or just realised you need glasses, you’ve used accessibility features already. And even if you’re able-bodied now, who says you’ll stay that way? With an ageing population and people living longer, accessibility is only going to get more important.
So, onward towards inclusion! Here are some of the key announcements around accessibility that came out of this year’s Google I/O.
Do you know what’s super exciting? Automated testing for accessibility. Google have created an automated accessibility test framework and developed an open-source library that can be integrated with Espresso or Robolectric. It provides run-time evaluation of real Android UI constructs.
Essentially, this means that once you enable the accessibility test framework in your build tool, you’ll start to see where your builds fail against accessibility metrics, which is really useful!
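As a rough sketch of what enabling those checks looks like with Espresso (package and class names here come from the Espresso accessibility artifact; this is instrumentation code that runs on a device or emulator, not on the plain JVM, and the test class itself is hypothetical):

```java
// Hypothetical instrumentation test class showing how the Accessibility
// Test Framework hooks into Espresso. Requires the
// espresso-accessibility dependency in your build.
import androidx.test.espresso.accessibility.AccessibilityChecks;
import org.junit.BeforeClass;

public class LoginFlowTest {

    @BeforeClass
    public static void enableAccessibilityChecks() {
        // Once enabled, every ViewAction performed through Espresso also
        // runs the framework's accessibility checks (touch target size,
        // contrast, missing labels, ...) and reports violations.
        AccessibilityChecks.enable().setRunChecksFromRootView(true);
    }

    // ... your existing Espresso tests run unchanged, with accessibility
    // evaluation layered on top of each interaction.
}
```

The nice design choice here is that you don’t write separate accessibility tests: the checks piggyback on the UI tests you already have.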
Of course, you shouldn’t skip your manual testing, especially when it comes to accessibility and usability. I/O17 offered some excellent tips for manual testing of accessibility features, the first being to get familiar with the accessible experience yourself. For example, to test a text-to-speech reader:
- Turn it on
- Close your eyes
- Test the most common flows
Ask yourself: can you do and find everything you could with your eyes open?
Manual testing helps you find missed text labels, or a sidebar menu where the information architecture doesn’t make sense. Make sure your labels are describing functions, not features. Don’t label something ‘an arrow icon’. Label it with its action.
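To make that concrete, here’s a minimal layout sketch (the ids, drawable and string names are hypothetical) showing a label that describes the action rather than the icon:

```xml
<!-- The contentDescription tells a TalkBack user what the control does,
     not what it looks like: "Navigate back", never "arrow icon". -->
<ImageButton
    android:id="@+id/back_button"
    android:layout_width="48dp"
    android:layout_height="48dp"
    android:src="@drawable/ic_arrow_left"
    android:contentDescription="@string/navigate_back" />
```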
Want some real talk? I recommend going to the play store and looking at the comments on the Google Talkback app. Read the difference in the reviews from able-bodied people and people with disabilities, and you’ll get a bit of insight into exactly how little able-bodied people really understand the disabled experience. You need to be able to bridge that lack of knowledge in order to truly help people to use your products.
Google’s accessibility scanner is currently at v1.1.2 and is already proving to be a game changer for developers and people who have accessibility needs. It’s a downloadable app that you can use to scan applications to find issues with accessibility.
As you can imagine, this is an invaluable tool for developers who are trying to incorporate accessibility features, but also for someone wanting to know if they would be able to use an app.
Simply turn it on, open the app you want to scan, then press the accessibility scanner button.
Once running, the app will suggest accessibility improvements such as increasing contrast, increasing touch targets, or adding content descriptions.
This is a great way to introduce yourself or your teams to the basics of accessibility, particularly as an addition to testing.
Accessibility in Android ‘O’
Google is really throwing down the gauntlet around accessibility, and some of their new tools and updates to existing tools are really setting the bar high for developers when it comes to implementing accessibility features. Once 8.0 goes to full release, there will be even more accessibility tools to try.
TalkBack is an accessibility service that helps blind and visually impaired people use their devices. Alongside a few tweaks and performance fixes, Google has recently added a new service called Select to Speak.
Select to Speak lets you tap items to hear them spoken aloud, instead of waiting for everything to be read out, improving things like the article reading experience.
Another thing to look out for with Android O is the update to menu labelling. If you’re vision impaired, having your screen read out to you is great, but what happens if you can’t work out what the labels mean?
Moving to more descriptive categories means users of screen readers have a better understanding of what functions lie behind different menu options.
They’ve also added new gestures that will be assignable to tasks. So swiping up, down, left and right can all now be assigned a function. This will only be available on O devices, and developers will need to request a special flag to use fingerprint gestures.
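A sketch of how an accessibility service might listen for those fingerprint gestures on O (this runs inside an AccessibilityService on a device; the service must declare the fingerprint-gesture capability and hold the fingerprint permission, and the scroll actions in the comments are just illustrative):

```java
// Inside an AccessibilityService subclass, after the service connects.
// Assumes the service has requested FLAG_REQUEST_FINGERPRINT_GESTURES.
FingerprintGestureController controller = getFingerprintGestureController();
controller.registerFingerprintGestureCallback(
        new FingerprintGestureController.FingerprintGestureCallback() {
            @Override
            public void onGestureDetected(int gesture) {
                switch (gesture) {
                    case FingerprintGestureController.FINGERPRINT_GESTURE_SWIPE_UP:
                        // e.g. scroll the current view forward
                        break;
                    case FingerprintGestureController.FINGERPRINT_GESTURE_SWIPE_DOWN:
                        // e.g. scroll the current view back
                        break;
                }
            }
        }, new Handler());
```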
Also coming in O: accessibility volume control. You’ll be able to use a slider to adjust the volume of the reader’s speech separately from the volume of media, so your reader isn’t talking loudly over the top of whatever you’re listening to!
To make it even more impressive, text-to-speech is about to have multilingual support added, meaning that you can now have your text-to-speech read to you AND switch between languages in the same content block seamlessly.
Google demoed this tool by having the app read out an email written in four languages (French, Polish, Ukrainian and Chinese), and it had to be seen to be believed!
Developers can use LocaleSpan to wrap the string to manage that change.
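A minimal sketch of that wrapping (device-only Android code; the string, indices and `textView` variable are hypothetical):

```java
// LocaleSpan marks a run of text as belonging to a specific locale, so a
// text-to-speech engine can switch voices mid-sentence.
SpannableString text = new SpannableString("Hello, bonjour tout le monde!");
text.setSpan(new LocaleSpan(Locale.FRENCH),
        7, text.length(),               // span covers the French portion
        Spanned.SPAN_EXCLUSIVE_EXCLUSIVE);
textView.setText(text);
```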
Accessibility resources and training
Want to get your team across accessibility?
The following resources were curated and recommended by Google staff as a way to get awareness levels up and allow self-driven teaching around accessibility.
A new accessibility section has recently been added to the Android developer pages.
This outstanding resource helps developers to understand why accessibility is so important, offers some training and upskilling, and helps developers get amongst the nitty gritty of building accessibility services.
For designers wanting to make their work more universally accessible, there’s a design-specific section in the Material Design Guidelines.
Google-led document resources
Google have put together a valuable free learning resource about accessibility.
This resource will allow more people to:
- Learn what accessibility means and how it applies to web development
- Learn how to make websites accessible and usable for everyone
- Learn how to include basic accessibility with minimal development impact
- Learn what HTML features are available and how to use them to improve accessibility
- Learn about advanced accessibility techniques for creating polished accessibility experiences
Created by technical writers and SMEs specifically for people wanting to learn more about accessibility, this is such a clear and usable resource. I highly recommend it.
Vox accessibility guidelines
Vox media started this project back in May 2016 to implement some company-wide structure and standards around accessibility. The result is a fantastic collection of tips, guidelines, and a checklist to help you build accessibility into your process, no matter what.
A11ycasts with Rob Dodson
Not a reader? Check out the A11ycasts podcast on YouTube! Rob Dodson is a passionate advocate for accessible design and has the dev skills to back it up. He’s created a series to help developers learn the fundamentals of accessible design and releases a new video every two weeks or so.
If you’re reading this, you might have heard of the Web Content Accessibility Guidelines, or WCAG. WCAG 2.0 is a set of recommendations for making web content more accessible. WebAIM publishes a checklist based on the standard: a nice quick resource for checking off what you’re doing well and what you can improve on.
Amazing gadgets and other hardware treats
I can’t resist these — they were in the sandbox tent and I live for checking out this kind of stuff at conferences.
Selfie stick utilising TalkBack 5.2, the native Android screen reader
If your eyesight is less than perfect, you might have discovered that using a selfie stick can be hard! With this setup, you use your front-facing camera with TalkBack 5.2 enabled, and it gives you spoken feedback describing what it can see on the screen (e.g. “one face, centred”), allowing you to take a decent selfie even when you can’t see your screen.
Unless you personally know someone with a motor or movement disorder, you have probably never considered how difficult it is for someone with even a mild tremor to eat their daily meals.
Liftware has created two lines of spoons and forks, called ‘Level’ and ‘Steady’. ‘Steady’ electronically stabilises the utensil on the journey from your plate to your mouth, making it ideal for people with tremors. The ‘Level’ range helps people with a limited range of motion hold their utensils at the correct angle for eating. The impact these products have on people’s lives cannot be overstated. People who could previously only eat with assistance can suddenly feed themselves again. People who felt embarrassed to eat in public can suddenly enjoy restaurants without anxiety. Freedom aside, even a small amount of restored dignity can do wonders for people who struggle to perform tasks they used to do without issue.
Check out this video of the Liftware Level in action.
Thanks for joining me on an accessibility journey through I/O17. Let’s all work together to create a more accessible future!
Angela Cox is a UX Designer at Outware Mobile Sydney.
Originally published at www.outware.com.au.