Planning for Web Accessibility Testing

Ugi Kutluoglu
6 min read · Sep 25, 2018


Ask ten accessibility professionals what accessibility testing is, and you’ll get ten different answers. Some will emphasize usability. Some will tell you it’s a code quality problem. Others may say it’s all about compliance and checklists. I am one of those professionals. I’ve been doing this for years, and yet I still find it hard to answer this question.

No, it’s not because we don’t have standards. In fact, we have one of the best guidelines ever written in the history of Information Technology. We have decades-old companies in the industry whose sole job is to test and fix accessibility issues. We have a lively conference and meet-up scene, complete with super-star speakers and mentors. Yet we fight on Internet chat channels over whether an element should have been coded as a button or a link. So folks, have a seat and enjoy your popcorn. We are an interesting bunch to watch, to say the least.

Web Accessibility is a hybrid discipline. You need to wear many hats, especially if you are on your own. You are expected to be a tester, a developer, and a designer all at the same time. That’s probably why it’s still somewhat of a niche field to work in.

In this article, I’ll tell you how I plan for accessibility testing at my day job. As you browse the web for accessibility solutions, you’ll see tons of conflicting ideas, advice, and code. After all, if accessibility professionals had to pick a motto, it’d probably be “It depends”. So, above all else, remember to take everything in this article with a grain of salt.

Creating an Accessibility Test Plan

Test plans are important — especially if there’s a chance that someone other than you will be running the same test in the future. There’s no need to go into much detail, but, from my perspective, they are good for a couple of other reasons in addition to their obvious benefits.

First, test plans help you gauge what needs to be tested and how long the test is going to take. They save time, because you will not have to hunt for the same information again and again. They help you split the work effectively if you need to. But, they can also be handy if you get into trouble. When your client (or manager) asks why a certain feature was not tested, you can safely point to it and say “is this your signature?” Needless to say, always make sure to get your test plans verified and signed.

A simple accessibility test plan…

The Essentials

An accessibility test plan isn’t terribly different from any other software test plan. Like any good plan, it should include, at a minimum, an overview of the project, scope, schedule, approach, resources, and success criteria.

Start with an identifier: This can be the name of the project, the component to be tested, or a serial number. An identifier is something that will help you locate the file among hundreds of others five months later.

Give a little background: It is always a good idea to provide a brief overview of the project or the component that’s going to be tested. Include what the component does, what has changed since the last update, and why.

The Metadata: This is probably the feature of a test plan that can save you the most time. Use a table to list essential information about the project. This can include the URL, name of the developer, name of the project manager, the team responsible for the project, tags to use when filing a defect, and links to other resources.
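
For instance, a metadata table might look something like this (the project, names, and tags are all placeholders):

```
| Field           | Value                              |
| --------------- | ---------------------------------- |
| URL             | https://example.com/checkout       |
| Developer       | (developer name)                   |
| Project manager | (PM name)                          |
| Team            | (team name)                        |
| Defect tags     | a11y, checkout                     |
| Related docs    | (links to specs, designs, tickets) |
```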

The Component List

This is where the test plan deviates from a regular plan. Assuming that the test plan is intended to cover WCAG (Web Content Accessibility Guidelines) success criteria, a component list based on the filters in the WCAG Quick Reference will help you a lot.

If your page has the following elements, make sure they are present in your component breakdown so that you can easily locate relevant success criteria and execute tests.

  • audio (audio playback with or without visible controls)
  • buttons
  • captcha
  • captions (image and table captions)
  • carousels
  • controls (anything other than buttons and links, e.g., sliders, resize handles)
  • errors (form errors, warning messages)
  • forms (forms and conventional form elements)
  • headings (H1–H6)
  • hidden-content (visually hidden content that’s available to assistive tech)
  • iframes (and frames if you’re old school)
  • images and images-of-text
  • labels (form labels)
  • links
  • lists
  • menus (drop-downs and application menus)
  • modals
  • moving-content (marquee, anyone?)
  • navigation
  • progress-steps
  • skip-to-content
  • structure and sections (sections, headers, footers)
  • tables (data tables)
  • text (paragraph text)
  • video (video playback with or without visible controls)

I personally like a nested list of components. It makes it easier for me to follow and complete the relevant test steps for each component. I start with the main container, and then add every major section and component as a nested list. Each line starts with the name of the component and its type (e.g., “Country Selection Drop-Down”).

If a component has multiple variations, states, or if you’re running an A/B test, it’s a good idea to note these on the plan as well.
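
For instance, a component breakdown for a hypothetical checkout page might start like this:

```
Checkout Page (main container)
├── Site Header (navigation)
│   ├── Logo (image/link)
│   └── Country Selection Drop-Down (menu)
├── Order Summary (table)
└── Payment Form (form)
    ├── Card Number (labeled input)
    └── Pay Now (button)
```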

To WCAG, or not to WCAG…

If you’re testing for WCAG conformance, adding additional test steps is a bit redundant. Simply refer to the relevant WCAG success criterion to keep your test plan neat and tidy.

If you’re not exclusively testing for WCAG and would like to add test steps, here are some tips for writing those cases.

  • Preconditions: Do I need to enable any special settings or use assistive technology to complete this test?
  • Steps to Take: What steps should I take to complete this test?
  • Expected Result: How should the component behave? What should the output of the Assistive Technology be like?
  • Measure of Success: What are the criteria for marking the test as passed?

A Sample test case for Keyboard Use
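
Filled in, such a test case might read like this (the component and the exact steps are illustrative):

```
Test: Keyboard Use — Country Selection Drop-Down
Preconditions:      None; keyboard only, no assistive technology required.
Steps to Take:      Tab to the drop-down, press Enter or Space to open it,
                    use the arrow keys to move between options, press Enter
                    to select, press Escape to close without selecting.
Expected Result:    The drop-down shows a visible focus indicator, opens,
                    moves focus through the options, and returns focus to
                    the trigger when closed.
Measure of Success: All steps can be completed with the keyboard alone,
                    and focus is never lost or trapped.
```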

Methodology

Once the essential information and test cases are in the test plan, it’s time to declare what platforms and assistive technologies you’re going to test with.

Remember, WCAG is platform agnostic, so it won’t actually tell you which browser or assistive technology to use for testing. Here are some common combinations to consider:

  • Safari and VoiceOver on macOS
  • Firefox and NVDA on Windows
  • JAWS and Internet Explorer on Windows
  • Safari and VoiceOver on iOS
  • Talkback and Chrome on Android
  • 200% Text Zoom on Firefox with ZoomPage Extension
  • IE with Windows High Contrast Mode enabled
  • Dragon NaturallySpeaking with IE on Windows
  • Switch Access on Android

When to Automate?

Some accessibility checks can be repetitive and tedious. Nobody likes to check 300 links on a web page for color contrast. Luckily, automation tools can help us with that. You can never run an automated test and deem a web page accessible, but automated tools can be super helpful for checking things like color contrast, whether your images have text alternatives, and whether an element’s ARIA attributes meet the standards.
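
The contrast check itself is just arithmetic: WCAG defines a contrast ratio in terms of the relative luminance of the two colors. Here is a minimal Python sketch of that formula, so you can see why a tool has no trouble checking 300 links in a blink:

```python
def _linearize(channel_8bit):
    """Convert an 8-bit sRGB channel (0-255) to its linear-light value."""
    c = channel_8bit / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb):
    """Relative luminance per WCAG 2.x (0.0 for black, 1.0 for white)."""
    r, g, b = (_linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG contrast ratio between two sRGB colors, ranging from 1:1 to 21:1."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

# Black text on a white background yields the maximum possible ratio.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # prints 21.0
```

WCAG 2.x asks for at least 4.5:1 for normal-size body text (success criterion 1.4.3), so a tool simply computes this ratio for every text node and compares.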

WCAG Section on a test plan

Automation has its caveats. For example, WCAG requires that text have adequate color contrast at all times, but none of the automated tools that I’ve used can check hover states. So an automated tool will not report a failure if the link that you’re testing fails the color contrast requirement on mouse hover. You have to test that manually.

My favorite automation library is axe. You can install it as a browser plug-in or integrate it into your favorite testing framework. There are also Tenon and Pa11y.
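
To give a flavor of the kind of check these tools automate (this toy snippet is nowhere near what axe actually does), here is a small Python checker that flags `img` elements with no alt attribute at all:

```python
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    """Collect <img> tags that lack an alt attribute entirely.

    Note: alt="" is a legitimate way to mark a decorative image,
    so only a completely missing attribute is flagged.
    """
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "alt" not in attrs:
            self.missing.append(attrs.get("src", "(no src)"))

checker = MissingAltChecker()
checker.feed('<img src="logo.png" alt="Acme logo">'
             '<img src="divider.png" alt="">'
             '<img src="chart.png">')
print(checker.missing)  # prints ['chart.png']
```

A real engine like axe runs against the live DOM in a browser rather than raw HTML, so it also catches images injected by scripts — but the rule logic is conceptually this simple.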

I hope that provides some insight into how I create an accessibility test plan. It’s not perfect, but it works. Oh, I almost forgot. You can view my sample accessibility test plan on Scribd. Please feel free to let me know if you have suggestions, want to share your test plans, or if you have any concerns.


Ugi Kutluoglu

Web developer with a focus on accessibility and usability.