Cutting the Curb: The Power of Accessibility Research

Accessibility research at Facebook is shaping advances for people with disabilities — and everyone else. Here’s how and why to incorporate it into your UX practices.

Yao Ding
Meta Research
9 min read · May 18, 2021

--

The cover image for the page, showing a variety of people standing side by side.

“Usability and accessibility are twins separated at birth. Same goals but like two brothers in a fable, they took different paths: Accessibility took a legal rights path. This gave it power, but not a lot of love. Usability took a user research path. This gave it deep insights, but not a lot of power.

What happens when these two meet? Can we get deep insights and great power?”

— Whitney Quesenbery, Better Accessibility Needs User Research

The answer to Whitney Quesenbery’s question is undoubtedly yes. Every day at Facebook, accessibility user research is helping deep insights meet great power. We believe user-centered, inclusive research and design can — and must — elevate product experiences for all people. In keeping with the slogan “Nothing about us without us,” accessibility solutions must be based on deeper user understanding and validation through user research, rather than just following guidelines and ticking boxes.

Recent advances at Facebook demonstrate how effective this approach can be. This article will focus on how accessibility research helped shape one of those advances, Automatic Alt Text, and provide some guidance for incorporating accessibility research in standard UX practices.

Recent accessibility advances

Guided by accessibility research, Facebook has been making great strides in improving product experiences for users with disabilities. Since 2018, for example, we’ve been making continuous improvements to keyboard control and navigation for users with physical impairments who mostly rely on keyboards and alternative input devices (which mimic keyboards) to access Facebook. In September 2020, we launched auto-generated video captions on IGTV and FB Live for people with hearing loss. Our long-term goal is to roll out auto-generated captions to most video content and audio/video calls.

But the advance we’ll dive into here is Automatic Alt Text (AAT). In January 2021, we rolled out a major update to AAT that uses AI and computer vision technology to automatically generate image descriptions that can be read out by screen readers, conveying the meaning of an image to users with visual impairments. AAT can now identify over 1,200 objects and concepts — 10 times more than it could upon its first release in 2016.

Illustration of a woman holding a phone to her ear, listening to alt text read-out.

Accessibility research for Automatic Alt Text

Very few of the 2 billion photos shared every day across Facebook products come with alt text. As social media becomes increasingly visual, we believe AI-powered AAT can fill this gap and enable users with visual impairments to enjoy their experience with Facebook the same way others do.

Understanding pain points and user needs

Before AAT’s initial launch, we had to answer some fundamental questions. How much do photos matter to blind users? How useful would they find AAT? A large-scale quantitative analysis of 50,000 blind users showed that they are just as active and productive on Facebook as non-disabled users — in fact, their posts receive more comments and likes on average. But blind users upload, comment on, and like fewer photos, and receive fewer comments and likes on their own photos.

An in-depth interview study helped us capture a clearer picture. Although blind users are interested in visual content, they often feel frustrated and even excluded or isolated because they can’t fully participate in conversations centered around photos and videos.

Identifying what people value most

After AAT had been providing image descriptions for several years, we wanted a better sense of how useful those descriptions were. What did people want to know about images that our descriptions weren’t telling them? How much information was too much, or not enough? We conducted a series of studies to answer these questions.

As the visual recognition engine became increasingly powerful, AAT could potentially recognize millions of objects and concepts that could describe a photo. We conducted a MaxDiff survey to rank-order common visual concepts. The three concepts that blind users appeared to value most were human interactions (hugging, kissing, etc.), landmarks (Eiffel Tower, Taj Mahal, etc.), and scenes (inside of an elevator, ranch, train station, etc.). These findings helped us prioritize high-value concepts in computer vision engine development and in AAT rendering.
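
For readers unfamiliar with MaxDiff (best-worst scaling), the simplest analysis is count-based: each concept is scored by how often it was chosen as most valuable minus how often it was chosen as least valuable, normalized by how often it was shown. Here is a minimal Python sketch of that scoring with made-up concepts and responses; our actual survey instrument and analysis were more involved.

```python
from collections import defaultdict

# Each MaxDiff task shows a small set of concepts; the respondent picks
# the most valuable ("best") and least valuable ("worst") of the set.
# The responses below are hypothetical, for illustration only.
responses = [
    {"shown": ["hugging", "Eiffel Tower", "train station", "dog"],
     "best": "hugging", "worst": "dog"},
    {"shown": ["kissing", "Taj Mahal", "ranch", "dog"],
     "best": "kissing", "worst": "dog"},
    {"shown": ["hugging", "Eiffel Tower", "ranch", "car"],
     "best": "Eiffel Tower", "worst": "car"},
]

best, worst, shown = defaultdict(int), defaultdict(int), defaultdict(int)
for r in responses:
    best[r["best"]] += 1
    worst[r["worst"]] += 1
    for concept in r["shown"]:
        shown[concept] += 1

# Count-based best-worst score, normalized by exposure.
scores = {c: (best[c] - worst[c]) / shown[c] for c in shown}
for concept, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{concept:15s} {score:+.2f}")
```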

Evaluating UX to inform iterations

In our standard user-centered design process, we always evaluate a product with users with disabilities to inform product and feature iterations. A moderated remote usability study on AAT revealed some surprising findings. Participants were split on the tradeoff between more information and more accurate information.

This led us to consider a setting that lets users choose between higher accuracy with fewer details and medium accuracy with more details. We also found that blind users value position information most — more than size or people/objects. As a result, we changed our initial design to order the three main information categories as position > size > objects. We also redesigned AAT’s entry point in response to user feedback that it was not very discoverable for first-time users.

A cell phone screenshot of a news feed post showing a picture, with a speaker icon indicating the screen reader highlighting the picture and announcing, “May be an image of 1 person, standing, and Machu Picchu.”
The automatic alt text lets the user know the image may show 1 person, standing, and Machu Picchu.
A cell phone screenshot of a post’s actions menu containing 5 items: like, react, comment, post menu, and generate detailed image description. The last item is highlighted by the screen reader cursor.
A cell phone screenshot of the detailed image description page. The page reads: “May be an image of 5 people, including Jay Youmens, people playing musical instruments, people standing, 2 hats, and 5 drums. Position information: at the top 2 hats, in the middle 5 people and a drum, at the bottom 4 drums. Size information: primary elements 3 people, secondary elements 2 people and 2 drums, minor elements 2 hats and 3 drums. Elements by category: people 5 people, activities playing musical instruments.”
In addition to the automatic alt text — “may be an image of 5 people, including Jay Youmens, people playing musical instruments, people standing, 2 hats, and 5 drums” — the user can explore the detailed image description page, covering position, size, and identified elements by category.
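
To make that ordering concrete, here is a minimal Python sketch of how a detailed description like the one above could be assembled, position first, then size, then elements by category. The data structure and wording are illustrative assumptions, not AAT’s actual implementation.

```python
# Hypothetical grouping of a photo's recognized concepts, ordered the way
# the study suggested users value them: position > size > elements.
detections = {
    "Position information": {
        "at the top": ["2 hats"],
        "in the middle": ["5 people", "a drum"],
        "at the bottom": ["4 drums"],
    },
    "Size information": {
        "primary elements": ["3 people"],
        "secondary elements": ["2 people", "2 drums"],
        "minor elements": ["2 hats", "3 drums"],
    },
    "Elements by category": {
        "people": ["5 people"],
        "activities": ["playing musical instruments"],
    },
}

def render_detailed_description(groups):
    """Join each category's entries in the fixed priority order above."""
    sections = []
    for category, entries in groups.items():  # dicts preserve order (3.7+)
        body = "; ".join(f"{label} {', '.join(items)}"
                         for label, items in entries.items())
        sections.append(f"{category}: {body}")
    return ". ".join(sections) + "."

print(render_detailed_description(detections))
```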

Integrating accessibility into standard UX research

Our AAT research is one of many examples of answering accessibility-specific questions using traditional UX research methods. Accessibility research should also be an extension of standard UX research — same questions, same methods, but with a wider range of participant abilities and needs.

Imagine a bell-shaped curve that depicts the distribution of user abilities. Standard UX research is often focused on the mass of the population in the middle, while accessibility research includes people closer to the ends of the curve. More often than not, improvements made for people on the edges benefit all people — a phenomenon called the “curb-cut effect.” By pushing the envelope of image/video understanding, speech recognition, and other mainstream technologies, accessibility research is elevating product experiences for everyone, not only people with disabilities.

Here are some tips for integrating accessibility into everyday UX research.

Conduct accessibility audits before engaging people with disabilities

Just as we wouldn’t field-test a product with dead links and broken buttons, we don’t field-test one with major accessibility issues. We know we won’t get high-quality data on a product that isn’t reasonably accessible.

Fortunately, a set of globally accepted accessibility guidelines can help determine readiness: the Web Content Accessibility Guidelines (WCAG). Before research is conducted, experienced specialists should perform an accessibility audit to identify functional accessibility issues. If a product conforms to at least WCAG 2.1 Level AA (the industry standard), it will be reasonably accessible and ready for accessibility research. Below Level A conformance, you’re likely to have difficulty obtaining good research results.
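
Automated checks are no substitute for a specialist audit, but they can surface obvious blockers early. As one small illustration, the Python sketch below (using BeautifulSoup, not any Facebook tooling) flags images that lack an alt attribute, a common failure of WCAG success criterion 1.1.1; a real audit also covers keyboard access, contrast, focus order, and much more.

```python
from bs4 import BeautifulSoup  # pip install beautifulsoup4

# A toy page standing in for whatever HTML you are about to field-test.
html = """
<main>
  <img src="team.jpg" alt="Five people playing drums on a beach">
  <img src="logo.png">           <!-- no alt attribute: fails WCAG 1.1.1 -->
  <img src="divider.png" alt=""> <!-- empty alt: decorative, acceptable -->
</main>
"""

soup = BeautifulSoup(html, "html.parser")
for img in soup.find_all("img"):
    # An absent alt attribute is an error; alt="" is a deliberate signal
    # that the image is decorative and should be skipped by screen readers.
    if not img.has_attr("alt"):
        print(f"Missing alt text: {img.get('src')}")
```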

Gather disability community feedback

Facebook collects disability community feedback from a variety of sources, including, but not limited to, the Facebook Help Center, the Facebook Accessibility Page, the Facebook Accessibility Twitter account, email, user studies, NGOs and advocacy groups, and app store reviews. Community feedback has been instrumental in desk research to identify research gaps and validate assumptions.

Recruit inclusively and accessibly

Recruitment is often the most challenging part of accessibility research. How do we reach the desired population? How do we define both inclusive and achievable recruitment criteria? How do we design accessible screeners that don’t create unintentional exclusion? Here are a few tips:

  • Open up the screeners. Don’t exclude participants who indicate needing a screen reader or larger text, or who want to bring their own laptop because they need assistive software. Even better, intentionally screen in a few participants with disabilities by asking about assistive technology use — for example, “Do you use any assistive products to use computers on a daily basis?”
  • Reach out to NGOs, advocacy groups, charities, clinics, disability organizations, or networks related to the research area. Some of them are open to research collaborations and would help spread the call for participation, or directly connect you to the desired community.
  • Consult research service providers who specialize in accessibility. Experienced research vendors might already be maintaining participant pools, or be able to reach the target population via recruitment partners, significantly reducing recruitment time.
  • Consider setting up internal research panels. If resources permit, internal panels can further shorten the participant recruitment timeline.
  • Thoughtfully screen participants to understand their abilities and needs. Our participants’ wide spectrum of abilities and needs makes it almost impossible to effectively screen them with one simple “type of disability” question. For example, visual impairments can include no light perception, loss of peripheral vision, color blindness, glare light sensitivity, loss of depth perception, etc. Best practice is to ask how functional limitations affect the participant’s daily life and use of technology; also ask about the use of assistive technology to accommodate their functional limitations. (Note: In some countries, disability is considered medical information, so it may not be ethical to ask directly about it, and even information that is volunteered may require special handling.)
  • Provide screeners in multiple accessible formats. Depending on the target population, various formats may be preferred or required. Best practice is to be prepared to provide alternative formats upon request: phone screen, print and large print, plain-text email, accessible Word document, accessible web-based questionnaire, sign language, simplified version, and a researcher available to answer questions.

Use appropriate language

As in all user research, successful accessibility research often begins with effective communication and building rapport with participants, which requires treating participants with respect and using appropriate language. A general rule is to use “people-first” language (e.g., “people with disabilities” instead of “disabled people”) unless the individual indicates another preference. We can also learn preferred terminology by asking questions like:

  • Which terminology do you prefer?
  • How do you want me to address you?
  • Do you identify as blind or as having low vision?
  • Can you explain your visual impairment to me?

Accommodate for various needs and preferences

The biggest lesson we’ve learned from years of accessibility research is to be flexible and accommodating. All people, not only those with disabilities, have varied needs and preferences. Catering to a wider range of needs and preferences may add some complexity to planning and logistics, but it pays off with a more diverse sample, as well as broader and deeper insights.

  • Allow enough time. It’s always better to allow participants to complete a task at their own pace. This may include sending out consent forms days before the study, extending survey deadlines a bit, and scheduling longer testing sessions to allow time for technology setup, task completion, and bio breaks.
  • Ask for preferences and accommodate them as best you can. A participant may prefer to use their own computer because of a display color theme or screen reader configuration that can’t be replicated on the lab computer. Other examples include using a teleconferencing tool with more accurate auto-captioning, or conducting an interview over email or chat because the participant feels more comfortable that way than in person or over the phone.

Pay participants well and accessibly

Last but not least, participants in accessibility research should be paid fairly, with a choice of payment methods that work for them. Electronic research remuneration has become popular, but no single option is accessible to all; some options require the participant to sign up for an account in order to get paid. Best practice is to offer multiple forms of compensation and be prepared to make further accommodations upon request. According to Ethnio, their participant incentive experience rating jumped from 2.7/5 to 4.8/5 after they switched from a single method (Amazon or PayPal) to multiple choices (bank deposit, virtual cards, etc.).

Illustration of a person tapping their phone screen, hearing alt-text read to them about a series of images.

A long way to go

While we’re proud of the recent progress we’ve made, we recognize that there’s a tremendous amount of work yet to be done. Finding ways to tap into the power of accessibility research in all of our work is one way to boost our ability to deliver more satisfying, more meaningful experiences to every person who relies on our products.

We’re excited to announce the 2021 Facebook Accessibility Summit to coincide with Global Accessibility Awareness Day (#GAAD). This online event will be streamed on the Facebook Accessibility Page on May 19th from 9:15–10:30am PDT. The theme is Making Digital Accessibility Accessible: Awareness, Action and Advocacy. Captioning will be provided.

Author: Yao Ding, Accessibility Researcher at Facebook

Contributors: Chris Langston, Research Manager at Facebook; Mike Shebanek, Head of Accessibility at Facebook; and Carolyn Wei, UX Researcher at Facebook.

Illustrator: Drew Bardana
