Inclusive design improves UX for everyone
…so let’s make it our default
You’re sitting in the quiet section of a train, and there’s a TV show you really want to watch.
…but you forgot your headphones.
But then you remember: you can watch the show with subtitles! Suddenly, you can carry on using your device as normal.
Well, that’s what inclusive design can do. It’s not just for people with disabilities: inclusive design can improve the lives of everyone it touches, and their experiences with the products they use.
Ways to design inclusively
Inclusive design can come in all different forms, and it isn’t just restricted to the web.
Don Norman, ex-Apple VP and author of The Design of Everyday Things, puts it very nicely:
Curb cuts were meant to help people who had trouble walking, but it helps anyone wheeling things: carts, baby carriages, suitcases.
And it’s the same case on the web. Closed captions help when you’ve forgotten your headphones, higher-contrast text helps when you’re out in the sun, larger buttons help when you’re using your phone with one hand (the list is endless).
Online, it is recommended that you have good contrast between text and its background. For example, the WCAG 2 AA standard requires a contrast ratio of at least 4.5:1 for normal text and 3:1 for large text (at least 18pt, or 14pt bold).
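That ratio comes from WCAG’s relative-luminance formula for sRGB colours. Here’s a minimal sketch of the calculation (the function names are my own):

```javascript
// Relative luminance of an sRGB colour, per the WCAG 2 definition.
function luminance([r, g, b]) {
  const linear = (c) => {
    const s = c / 255;
    return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
  };
  return 0.2126 * linear(r) + 0.7152 * linear(g) + 0.0722 * linear(b);
}

// Contrast ratio: (L_lighter + 0.05) / (L_darker + 0.05).
function contrastRatio(a, b) {
  const [hi, lo] = [luminance(a), luminance(b)].sort((x, y) => y - x);
  return (hi + 0.05) / (lo + 0.05);
}

// Black text on a white background gives the maximum possible ratio, 21:1.
contrastRatio([0, 0, 0], [255, 255, 255]); // ≈ 21
```

Any foreground/background pair scoring below 4.5 fails AA for normal-sized text.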
However, this doesn’t only help those with poor vision. For example, how many times have you sat in a class or lecture and been bombarded with slides like this:
Not great is it? It’s not comfortable to read, and so mentally you’re beginning to switch off. Now imagine if we took the advice of the WCAG and made the slide look like this:
That’s a bit nicer.
Another aspect of legibility is font sizing. Generally, 18pt and 14pt are acceptable minimum sizes for title text and subtext respectively. This translates to 24px and roughly 18.7px. For body text, something like 12pt (16px) is usually the minimum readable size.
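Those conversions follow from CSS’s reference pixel: 96 CSS pixels per inch versus 72 points per inch, so one point is 96/72 ≈ 1.333px. As a quick sketch:

```javascript
// Convert typographic points to CSS pixels (96px per inch, 72pt per inch).
const ptToPx = (pt) => pt * (96 / 72);

ptToPx(18); // 24    (minimum for title text)
ptToPx(14); // ≈18.7 (minimum for subtext)
ptToPx(12); // 16    (minimum for body text)
```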
Recently I was refactoring some front-end code for a project, and before the redesign the font size was around 14px. The UI looked nice, but with these accessibility constraints in mind I worked to make it comply, and after the refactor it looked even better: it was far more legible, and the main information a user would need was comfortable to scan quickly.
Now picture using this tool outdoors on a sunny day: you’d be really glad that the font sizes were slightly larger and the text colours had higher contrast. These accessible and inclusive design changes have really improved your experience.
When filling out a long form online, I tab between input fields to speed up the process. It makes what can be a tiring and cumbersome task slightly quicker, and I can then move on with my life.
However, tabbing between elements wasn’t just developed for you to fill out forms quicker.
Some people cannot use a conventional mouse or trackpad to navigate web pages. They rely on physical buttons instead, typically the Tab and Enter keys.
By default, web browsers are pretty good at deciding what a user can tab to and what they can’t. If you use a button element, for example, the browser will automatically let you tab to it. If you use a div, however, the browser will ignore it.
That’s great, but sometimes a button doesn’t cut it for our design and we want to use a more custom, clickable element.
A good rule of thumb is that if an element has a click event, the user should be able to tab to it. HTML makes this pretty easy too: all you have to add to your element’s attributes is tabindex with the value "0". For example:
<div tabindex="0" onclick="doSomething()">Click me!</div>
What’s also important to note is that some frameworks don’t interpret a press of the Enter key as a click, so you may need to do something like this in Angular (to name one):
<div tabindex="0" (click)="doSomething()" (keyup.enter)="doSomething()">Click me!</div>
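Outside any framework, the same behaviour can be wrapped in a small vanilla-JavaScript helper (the function name is my own; this is a sketch, not a library API):

```javascript
// Make any clickable element keyboard-accessible: put it in the tab
// order and let the Enter key trigger the same handler as a click.
function makeKeyboardClickable(element, handler) {
  element.tabIndex = 0; // equivalent to tabindex="0" in the markup
  element.addEventListener('click', handler);
  element.addEventListener('keyup', (event) => {
    if (event.key === 'Enter') handler(event);
  });
}

// Usage: makeKeyboardClickable(document.querySelector('#clickMe'), doSomething);
```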
Going back to the train example, imagine if, even when you can’t listen to something, you could still understand what’s being said.
Well, unless you’re a great lipreader, you’re most likely going to have to rely on closed-captioning.
Closed-captioning became a reality in the early 1970s, after a failed experiment to send timing data along with TV signals. Now, closed-captioning is fairly commonplace. Think about when you’re watching YouTube or Netflix, you’re probably only one or two clicks from being able to enable captions.
A great recent innovation in closed captioning comes from Google. At its latest I/O event, it revealed that a ‘Live Caption’ feature will accompany the Android Q release, meaning closed captioning is done natively by the Android operating system, turning spoken words into on-screen text. You won’t even have to be connected to the internet.
Google noted that it worked closely with the Deaf community to develop this technology, and its CEO Sundar Pichai said:
“You can imagine all the use cases for the broader community, too, for example, the ability to watch any video if you’re in a meeting or on the subway without disturbing the people around you.”
Now, think about when you’re on a slow connection.
Imagine you’re reading an article online and enjoying it, but the images aren’t loading because of the weak connection. Well, it’s now pretty common to see some text describing an image when it can’t load for whatever reason. This is called alt-text, and it’s really easy to add to your site.
However, it’s not only for us sad people on the equivalent of dial-up connection speeds. It’s mainly aimed at those who can’t see the image well, so their screen reader can describe it to them.
Daniel Göransson has written a great piece on how to create the perfect alt-text, but the key points are:
Describe the image in context. For example, for a photo of a group of people out in the rain, a weather site might use the alt-text “Stormy weather in city.”, while a social platform might say “Group of friends enjoying themselves in the rain.”
Don’t include unnecessary/annoying information. For example, the name of the photographer is not important to someone trying to understand what the image looks like, nor keywords you’re trying to rank higher for on Google.
Don’t say it’s an image. Screen readers will already be telling users this. If you start with “image of…” the screen reader will recite something like “image image of…” to the user. This is simply annoying to anyone using a screen reader.
Keep it concise, and end with a period. You don’t need to go overboard with your description, and if you end with a period, the screen reader will leave a short pause before continuing on to the rest of the content.
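Putting those guidelines together, the markup for the weather-site example might look like this (the filename and wording are hypothetical):

```html
<!-- In context, concise, no "image of…" prefix, ends with a period. -->
<img src="storm.jpg" alt="Stormy weather in city.">
```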
“The one argument for accessibility that doesn’t get made nearly often enough is how extraordinarily better it makes some people’s lives. How many opportunities do we have to dramatically improve people’s lives just by doing our job a little better?” — Steve Krug
As designers, developers, and human beings, we should be thinking about everyone when creating a new product for the world. Inclusive design shouldn’t be an afterthought; it should be our default. It adds so much value for everyone, not just those who rely on it.