Designing Accessible Chatbots: An Introduction

Guy TONYE
Published in Voice Tech Global
6 min read · Sep 7, 2021

Imagine coming across the graphic below.

Three small square images that are blurry. (This is intentional.)
Which of these three do you prefer?

If I asked, “Which of these three do you prefer?”, your first thought would probably sound something like:

  • “I can’t see what’s in the image.” or
  • “Are these images supposed to be so blurry?” or
  • “Yikes! I can’t tell what I’m supposed to be looking at!”

This is the exact same challenge that someone who is visually impaired faces when using a chatbot that relies on visuals to provide a conversational experience.

In this article, we will discuss how conversation designers can apply the three steps from the Inclusive Design Framework to make better chatbot experiences for everyone.

Illustration for this post: a braille table next to the post title in a cursive font

Step 1: Recognize exclusion

To better understand the challenges that visually impaired users face when using a chatbot, we have to step into their shoes. This involves changing the way we think about browsing websites and interacting with chatbots.

Try navigating to the chatbot using the keyboard only

Web accessibility guidelines emphasize keyboard compatibility, especially for users who cannot see a mouse pointer on the screen.

You can test this out for yourself on this website.

Try to reach the chatbot using the TAB key, then press ENTER to open the bot. This is how someone who is visually impaired would focus on the chat element in order to use it.

As a conversation designer, you want to make sure that when chatbot functionality is added to a website, a user can reach it with the keyboard only.
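If your site's chat launcher is a clickable element (such as a div with a click handler) rather than a native button, it may be invisible to keyboard users. Below is a minimal sketch of how an engineering team might fix that; the selector and label are hypothetical, and if you control the markup, a native button element gives you this behavior for free.

```typescript
// Minimal sketch: make a chat launcher reachable and operable by keyboard.
// The selector and label are hypothetical; adapt them to your widget.
const launcher = document.querySelector<HTMLElement>('.chat-launcher');

if (launcher) {
  launcher.tabIndex = 0; // Put the launcher in the natural TAB order.
  launcher.setAttribute('role', 'button'); // Announce it as a button.
  launcher.setAttribute('aria-label', 'Open chat'); // Give it a clear accessible name.

  // Open on ENTER or Space, mirroring native button behavior.
  launcher.addEventListener('keydown', (event: KeyboardEvent) => {
    if (event.key === 'Enter' || event.key === ' ') {
      event.preventDefault();
      launcher.click(); // Reuse the existing click handler to open the bot.
    }
  });
}
```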

Try using a screen reader to test a chatbot

This second test involves using a screen reader to navigate a website.

As you can guess from the name, screen readers read out information about elements on a page, specifically the element that the user is currently “on”, i.e., the one they have TAB-bed to.

If there is text on the screen, the screen reader will read it out loud for the user. If there is an image, the screen reader will read the alt text of that image.
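This is why images inside a chat window need meaningful alt text. Here is a minimal sketch of what that looks like when rendering a bot message; the message shape and function name are hypothetical.

```typescript
// Minimal sketch: render a chatbot image message with alt text so a screen
// reader has something meaningful to announce. The types are hypothetical.
interface ImageMessage {
  imageUrl: string;
  altText: string; // e.g. "A muddy forest trail after rain"
}

function renderImageMessage(message: ImageMessage, container: HTMLElement): void {
  const img = document.createElement('img');
  img.src = message.imageUrl;
  // Without this, screen readers fall back to the file name or just say "image".
  img.alt = message.altText;
  container.appendChild(img);
}
```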

You can enable screen reader capabilities by:

  • Turning on VoiceOver on a Mac
  • Installing NVDA on Windows (please donate if you can to support the maintenance of the application)
  • Installing ORCA if you have a Linux distribution

Once you turn one of these on, you can use our simulation page again.

Here’s a bonus test. This time, while navigating with the TAB key, dim your screen or cover it with a piece of paper. Then try to find the chatbot and have a conversation with it using only your keyboard.

Site navigation and web accessibility are not things conversation design can change on its own. But these two tests will help you recognize how a site can exclude users and point you toward a better chatbot conversation.

Running these tests on your own sites can also help you make the case to your webmaster or engineering team if you’re thinking about adding a chatbot to your website.
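If you want to hand that team something concrete, the keyboard test can even be automated as a regression check. Here is a minimal sketch using Playwright; the URL and the accessible name “Open chat” are assumptions and should match your actual site.

```typescript
import { test, expect } from '@playwright/test';

// Minimal sketch: automate the "reach the chatbot with TAB only" check.
// The URL and the accessible name 'Open chat' are assumptions for illustration.
test('chat launcher is reachable by keyboard alone', async ({ page }) => {
  await page.goto('https://example.com');

  const launcher = page.getByRole('button', { name: 'Open chat' });

  // Press TAB repeatedly, as a keyboard user would, until the launcher has
  // focus (give up after 20 presses so the test cannot loop forever).
  for (let i = 0; i < 20; i++) {
    if (await launcher.evaluate((el) => el === document.activeElement)) break;
    await page.keyboard.press('Tab');
  }

  await expect(launcher).toBeFocused();
  await page.keyboard.press('Enter'); // Open the bot without touching the mouse.
});
```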

Step 2: Learn from many

The tests above can help uncover issues in a product or online platform, but they won’t uncover our own biases.

This is why it’s important to include the excluded groups of users throughout the process of building a site or product.

During the research and user testing phases, it is imperative that conversation designers diversify who they interview and who they ask to test their product. Specifically, you want to include visually impaired users in the process so they can evaluate the bot conversation and help identify where it mismatches the way they access your chatbot.

As an example of what you can discover, here are three things we’ve learned from our own user testing:

  1. Emojis can quickly get annoying because screen readers read their literal descriptions. For example, if your chatbot response is “we finally launched our website 🚀🚀🚀”, with three rocket emojis at the end, the screen reader will read it as: “we finally launched our website rocket rocket rocket.”
  2. A bot response is always spoken verbatim by screen readers; there is no way to add nuance or context. Abbreviations, for example, will be read as is: in the sentence “The Gov. decided to pass the bill.”, it’s not clear whether “Gov.” means “government” or “governor”. (A sketch addressing these first two findings follows this list.)
  3. Sending multiple small messages can be challenging if there is no obvious numbering that indicates the order of the messages.
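Some of these findings can be mitigated in the bot’s response pipeline. Below is a minimal sketch that collapses repeated emojis and expands ambiguous abbreviations before a response is sent; the abbreviation map is a hypothetical example, and writing out full words in the first place remains the more robust fix.

```typescript
// Minimal sketch: make a bot response friendlier to screen readers.
// The abbreviation map is a hypothetical example.
const ABBREVIATIONS: Record<string, string> = {
  'Gov.': 'Governor', // Pick one meaning deliberately instead of leaving it ambiguous.
  'Dept.': 'Department',
};

function adaptForScreenReaders(text: string): string {
  // Collapse runs of the same emoji: "🚀🚀🚀" would be read as
  // "rocket rocket rocket", so keep a single occurrence.
  let adapted = text.replace(/(\p{Extended_Pictographic})\1+/gu, '$1');

  // Expand abbreviations that a screen reader would otherwise read literally.
  for (const [abbr, full] of Object.entries(ABBREVIATIONS)) {
    adapted = adapted.split(abbr).join(full);
  }
  return adapted;
}

// "we finally launched our website 🚀🚀🚀" -> "we finally launched our website 🚀"
console.log(adaptForScreenReaders('we finally launched our website 🚀🚀🚀'));
```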

There are many more learnings to uncover through conversations with users or through digital ethnography, the study of how people behave in real digital environments. There are also many free resources available, such as this Facebook group for blind and visually impaired users.

Step 3: Solve for one and extend to many

Steps 1 and 2 are helpful for uncovering where an experience is already broken. Many of the issues these tests uncover can be addressed with existing design techniques.

Let’s go back to our first example of the blurry images.

The issue with many chatbot conversations is that they are designed to rely on the user looking at the screen.

To remedy this, we can change the prompt to ask something that the user can answer without having to see an image or visual.

In other words, we can ask the user for a response that relies on information that can be read.

Let’s say, for example, we were to provide details and phrase the question as: “Which would you prefer: hiking in the mud, hiking in the sand, or hiking in the snow?”

This allows the user to answer the question and complete the experience without needing to see the images, or even the screen. It works for those who are visually impaired and, at the same time, enhances the overall user experience.

  • For users who can see the screen, the images supplement the experience and can trigger an emotional response. In our example, with the images alone (even if they were perfectly clear), it would be difficult for the user to know how to respond: with a number? A general description?
  • For users who don’t understand what the images show, the rephrased question gives them the missing context.

We call this technique “designing for voice only.”

The core principle is to create a conversational experience that can be completed end-to-end by voice, without a screen.

Working within the constraints of a voice-only design helps ensure that user groups with different abilities can all have a great experience with the product. From this perspective, visual elements (images or chatbot buttons) can be added to supplement the experience, but they are never essential.
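One way to make this constraint concrete when building is to model every bot turn so that the text alone carries the full prompt and any visual is an optional supplement. A minimal sketch, with hypothetical type and field names:

```typescript
// Minimal sketch: a bot turn whose text is self-sufficient, with visuals as
// optional supplements. Type and field names are hypothetical.
interface BotTurn {
  // Must be answerable without looking at the screen.
  text: string;
  // Optional extras; each image still carries alt text for screen readers.
  images?: { url: string; altText: string }[];
}

const hikingPrompt: BotTurn = {
  text: 'Which would you prefer: hiking in the mud, hiking in the sand, or hiking in the snow?',
  images: [
    { url: '/img/mud.jpg', altText: 'A muddy forest trail' },
    { url: '/img/sand.jpg', altText: 'Dunes under a clear sky' },
    { url: '/img/snow.jpg', altText: 'A snow-covered mountain path' },
  ],
};
```

If a design review asks “could this turn be completed over the phone?”, the text field is what has to pass that test.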

Why accessible chatbots matter

David Dame, a director of accessibility and mentor in our course program, says it best:

I have cerebral palsy, but my money doesn’t. If you want me to buy your product then design one that I can use. If it’s created with a11y in mind, I will show you the money.

Resolving accessibility issues is often seen as a good deed, but in reality, designing for people with disabilities is designing for all of us. Adjusting your design and build process can open your conversational products to a larger audience: one that includes everyone, not just people who are not visually impaired.

This guide contains more techniques to help you identify broken experiences and implement technical and design solutions to make your product more accessible.

Inclusive design covers accessibility and more. In our Advanced Conversational Experience Design Course, we dedicate an entire week to this topic. Find the full syllabus here.

If you liked this story, please give us one of Medium’s clapping hands 👏👏🏻👏🏼.

We are always looking for examples of how we can make conversation design more accessible. Feel free to share in the comments if you have built or know of an accessible chatbot in real life.

Guy TONYE
Voice Tech Global
Software engineer, Google Cloud Platform Certified Data Engineer, Co-Founder @ VoiceTechGlobal