Comparing 3 Top Automated Accessibility Testing Tools: WAVE, Tenon, and Google Lighthouse

Automated tools for testing accessibility have evolved to the point where you don’t have to look hard to find dozens of thorough, reliable tools that test for adherence to the Web Content Accessibility Guidelines (WCAG). There are browser extensions, command-line tools, tools that integrate with continuous integration systems…

At every level of integration, these tools provide a good starting point for testing, point to patterns of errors that humans can then look out for, and can catch issues before they go to production.

Consider the following scenarios:

  1. A content editor removes the contents of an h2 tag, unaware that the tag remains, causing confusion for screen reader users.
  2. The theme of a website contains a ‘skip to content’ link at the top that, out of the box, links to a ‘content’ div. The developer renames that div, breaking the link. Now keyboard-only users have no way to skip to the content and must tab through every navigation link.
  3. A designer prefers placeholder text to a field label, unaware that this may present an unlabeled field to screen reader users.

These issues (and many more) are very easy to miss — and they are very common. If you’re not using an automated testing tool, your site probably has similar issues.

I spent several weeks comparing these three tools (WAVE, Tenon, and Google Lighthouse) in preparation for a presentation at DrupalCon Nashville. Each has a browser extension and an array of options for testing on the command line, with continuous integration possibilities. All three are effective for testing a wide range of issues; the differences lie mainly in the user experience and in the scope of the tools.

Note: I used the DrupalCon Nashville website to do a live demo of the tools. It worked well because our audience was familiar with the site and a few accessibility errors are present. Such errors are quite common on event sites, which are generally hastily developed and not fully tested (no shame intended for the site’s developers). I am including a few screenshots from that demo here.

Browser Extensions

The browser extension is the simplest application of these tools. You don’t need any technical expertise to use one: just browse to the page you want to test and click a button. Each tool generates a report that flags accessibility issues, describes them, and gives guidance on how to fix them. Another advantage of browser extensions is that they are free. Here is a brief look at the three I compared:

WAVE annotates errors and warnings in-line, so you can see instantly which element on the page is causing each issue, and a panel at the bottom shows annotated HTML that points directly to the offending code. Clicking an error icon pops up information about the error, with additional detail available in the sidebar: what it is, which WCAG guidelines cover it, how it affects various types of users, and a recommendation on how to fix it. Although the colourful error and warning icons seem to belong to a bygone era, WAVE is full-featured and fun to use. Developed by WebAIM at Utah State University in 2001, it is a venerated tool.

Screenshot of WAVE browser extension testing

Tenon’s browser extension delivers a report with a summary of the issues; for each error it shows a code snippet that helps you locate it in your source, along with the WCAG guideline it pertains to and a link to a recommended fix. The Tenon report is geared slightly more toward developers. You can run a number of reports for free, but if you click ‘recommended fix’ for any issue, Tenon will prompt you to log into or create an account. An account gives you access to a robust dashboard where you can store and retrieve the results of all of your testing, run reports that show percentages of error types over time, and more. To use these features, however, you will need to buy credits.

Screenshot of Tenon browser extension testing

The Google Lighthouse accessibility audit is one of a suite of audits that you can run within Chrome Developer Tools, along with Performance, Progressive Web App, Best Practices, and SEO. The Lighthouse accessibility audit is based on Deque’s aXe core rules engine. Running the audit generates a comprehensive report that gives information on all of the tests that passed in addition to the ones that failed. You can optionally save the report in JSON format to view later or to send to someone else — Google Lighthouse has an online report viewer that you can use by simply dragging a report onto it, making the report accessible from anywhere, without an account.

Screenshot of Google Lighthouse Chrome developer tool auditing

Beyond Browser Extensions

The limitation of browser extensions is that they always involve manual control: a person has to navigate to each URL and interpret the results. If you are in a hurry to publish, testing of this kind can get skipped, and testing an entire website this way would be a lengthy process. But there is another way!

The major tools, including all three discussed here, expose APIs that can be driven from the command line and, from there, wired into continuous integration. If you have the means to take your automated testing to the next level, a continuous integration solution can give you all of the benefits with a fraction of the time spent looking at report results.

With a continuous integration solution, for example in a Travis or Jenkins build, there are several options for triggering an alert when errors arise: an email, a Travis status check on your GitHub pull requests, or failing the build outright. Any of these helps ensure that errors are remedied before the code goes live.
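The build-failure option can be sketched as a small gate script. This is a hypothetical example, not a feature of any of the three tools: it assumes Node, the `lighthouse` CLI, and Python 3 are available on the CI machine, and that the report follows the `categories.accessibility.score` layout of recent Lighthouse JSON output.

```shell
# Hypothetical CI gate: run Lighthouse, then fail the build when the
# accessibility score drops below a threshold. The audit itself would be:
#   lighthouse https://example.com --only-categories=accessibility \
#     --chrome-flags="--headless" --output=json --output-path=report.json

# Extract the 0-1 accessibility score from a Lighthouse JSON report and
# exit non-zero (failing the build) when it is below the threshold.
check_accessibility_score() {
  local report="$1" threshold="$2"
  local score
  score=$(python3 -c "import json, sys; print(json.load(open(sys.argv[1]))['categories']['accessibility']['score'])" "$report")
  awk -v s="$score" -v t="$threshold" 'BEGIN { exit (s + 0 >= t + 0) ? 0 : 1 }'
}

# In the build script:
# check_accessibility_score report.json 0.9 || { echo "Accessibility score below 0.9"; exit 1; }
```

Failing the build is the bluntest of the options above; posting the score on the pull request instead is a gentler variant of the same idea.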

The reasons to choose one tool over another come down to major capabilities, such as spidering, and the scope of your project. If you have an enterprise project that needs one or more sites audited regularly, with robust reporting, WAVE or Tenon could handle the task. Both WAVE and Tenon charge for API credits.

Tenon has many options for integration, including a feature called the Projects API, which allows you to track an unlimited number of projects in your account. You can post a project to the API from the command line, optionally choosing ‘spider’ or ‘api’ as the project type. With ‘spider’, you submit a base URL, and Tenon crawls the site and adds all of its URLs to the report. With ‘api’, you specify either a list of URLs or the URL of a sitemap. Creating a new project initiates either a crawl of your site or analysis of your list of URLs (or the ones contained in the sitemap). A crawl could be triggered from a CI build, and the resulting reports tracked over time.
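As a rough sketch, submitting a single page to Tenon from the command line looks like the following. The endpoint and the `key`/`url` parameter names reflect Tenon’s documented single-page API at the time of writing; the Projects API layers the project types described above on top of this, so check the current documentation for its exact parameters before relying on them.

```shell
# Hedged sketch of a single-page Tenon API request with curl. Assumes your
# API key is exported as TENON_API_KEY; verify the endpoint and parameter
# names against Tenon's current documentation.
test_page_with_tenon() {
  local url="$1"
  if [ -z "$TENON_API_KEY" ]; then
    echo "TENON_API_KEY is not set; skipping live request" >&2
    return 1
  fi
  curl -s https://tenon.io/api/ \
    --data-urlencode "key=$TENON_API_KEY" \
    --data-urlencode "url=$url"
}

# Example (requires an account and key):
# test_page_with_tenon "https://example.com/" > tenon-report.json
```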

WAVE has a standalone API that analyzes pages in a self-contained headless browser and can be licensed to run on your own server, so it could potentially be integrated into a CI pipeline and run there on any number of pages.

Google Lighthouse is available as a node module that can be used on the command line or within a continuous integration environment, and has the distinct advantage of being free, which makes it a good candidate for projects with no budget for accessibility. Its limitations are that it has no spidering or extended reporting capabilities. It analyzes a single URL, and delivers a report on it. But it can run using headless Chrome, which means that it can perform a full analysis within a Travis or Jenkins build. You could run it on any number of pages, such as a handful of representative URLs (a landing page, a form, a blog page…). If any errors arose, they could be indicative of larger patterns to look out for.
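Auditing a handful of representative URLs with Lighthouse might look like the sketch below. It assumes the `lighthouse` npm package is installed (for example via `npm install -g lighthouse`) and that headless Chrome is available; the URLs and the `report_name` helper are placeholders of my own, not part of Lighthouse.

```shell
# Sketch: audit a few representative URLs in one CI step, writing one JSON
# report per page. Guarded so it degrades gracefully when the `lighthouse`
# CLI is not installed.

# Derive a report filename from a URL, e.g.
# https://example.com/blog -> example.com_blog.json
report_name() {
  echo "$1" | sed -e 's|^https\?://||' -e 's|/$||' -e 's|[/:]|_|g' -e 's|$|.json|'
}

for url in "https://example.com/" "https://example.com/contact" "https://example.com/blog"; do
  out=$(report_name "$url")
  if command -v lighthouse >/dev/null 2>&1; then
    lighthouse "$url" --only-categories=accessibility \
      --chrome-flags="--headless" --output=json --output-path="$out"
  else
    echo "lighthouse not installed; would write $out for $url"
  fi
done
```

Each resulting report can then be viewed later in the online report viewer mentioned above, or parsed programmatically in the build.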

An Important Caveat

Before you leap to the conclusion that an automated accessibility tool is going to do all of the work that is required to make your website accessible, we need to talk about what these tools cannot do. If your aim is to provide a truly accessible experience for all users, a deeper understanding of the issues and the intent behind the guidelines is required.

The advantages and limitations of automated tools: a few examples

Automated Tools Will Suffice for the following:

  • Does the image have alt text?
  • Does the form field have a label and description?
  • Does the content have headings?
  • Is the HTML valid?
  • Does the application UI follow WCAG guideline X?

Human Intervention Is Required for the following:

  • Is the alt text accurate given the context?
  • Is the description easy to understand?
  • Do the headings represent the correct hierarchy?
  • Is the HTML semantic?
  • Does the application UI behave as expected?

While automated tools can discover many critical issues, there are even more issues that need human analysis. Vague or misleading alt text is just as bad from a usability standpoint as missing alt text, and an automated tool won’t catch it. You need someone to go through the site with a sharp and trained eye.
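To make the distinction concrete, here is roughly the kind of shallow check an automated tool performs. This is a deliberately crude sketch of my own, not how any of these tools is actually implemented:

```shell
# Flag <img> tags with no alt attribute at all. Note what this misses: an
# image with alt="image" or alt="photo" passes this check, even though its
# alt text is useless. That gap is exactly where a human reviewer comes in.
find_missing_alt() {
  grep -o '<img[^>]*>' "$1" | grep -v 'alt=' || true
}

# Usage:
# find_missing_alt page.html
```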

In the end, fully testing for accessibility requires a combination of automated and manual testing. Here is a great article on key factors in making your website accessible. As noted at the outset, automated tools give you a starting point, reveal patterns of errors for humans to look out for, and catch issues before they reach production.

Whether you are a developer, a QA tester, or a site owner, and whatever your technical skill level is, automated accessibility testing tools can and should become an indispensable part of your testing toolkit.