Which accessibility testing tool should you use?

Paul Stanton · Pulsar · May 17, 2018

In preparation for Global Accessibility Awareness Day, the Pulsar team has been on an accessibility kick recently, doing various things to improve the accessibility of the Pulsar design system and the software it serves.

I took some time to experiment with a handful of popular accessibility testing extensions and tools that we use to validate the accessibility of our user interfaces. These tools give you a good accessibility foundation before you move on to user-centric testing with real people and/or full-blown accessibility audits. Think of them in much the same way as the W3C markup validators, which have always been one of the most basic checks we all run before go-live.

Browser extensions (I’m using Chrome for the purposes of this post)

  • WAVE Web Accessibility Evaluation Tool
  • Lighthouse
  • aXe Browser Extension
  • WCAG Accessibility Audit Developer UI
  • SiteImprove Accessibility Checker

Other tools

  • Tenon.io (free and paid plans available)

Starting our test

I needed a real user interface to test, something fairly small with a handful of form elements and interactions, so I started where all our Continuum users start: our sign-in screen. There’s a small amount of interaction design here (users move between a few different forms in place), but not an awful lot of markup, so for the purposes of this post it’s a good example of how we can work through the various issues flagged by the tools.

Our sign-in UI, about to be put through its paces

First, let’s see what each tool tells us before we make any changes. If a tool gives me the option to filter results, I’ll set it to show anything related to the Web Content Accessibility Guidelines 2.0 at AA compliance level, although in our real-world testing we’re also interested in Section 508 compliance for the US market.

WAVE Web Accessibility Evaluation Tool

This is usually the first browser extension developers think of when accessibility testing comes up but, as I’ll show in this post, it’s probably not the first one you should reach for.

Hitting the WAVE button in Chrome’s extensions toolbar displays the WAVE panel as a column inside your window; here it shows me 1 error and 9 alerts. Previously we’ve only focused on resolving ‘errors’ in our internal testing, but we’re now widening our scope to include ‘alerts’ too.

WAVE overlays an icon for each issue on the UI, but it gets confused by absolute positioning and doesn’t show any other information about the related element, such as markup or ID/class attributes. This is somewhat problematic if you have issues on UI partials that are not yet visible, such as hidden forms, and it takes some hunting around with dev tools to pinpoint the element causing the error. Alternatively, as suggested in the comments below by Charles Hall (lead developer of WAVE), you can toggle the ‘no styles’ mode, which gives you the plain old HTML view of your UI but still shows the WAVE icons next to the related elements. This also has the handy effect of displaying any UI partials that are visually hidden with CSS.

Inspecting the red icon in the UI leads me to the related element (the `.signin-brand` image, which is missing an `alt` attribute).
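For context, the before and after for that brandmark image looks something like this (the filename is illustrative, not the real asset):

```html
<!-- before: flagged by WAVE because there is no text alternative -->
<img class="signin-brand" src="continuum-brandmark.svg">

<!-- after: a short, meaningful alt; a purely decorative image could use alt="" instead -->
<img class="signin-brand" src="continuum-brandmark.svg" alt="Continuum">
```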

Lighthouse

If you’re running an up-to-date version of Chrome, you probably already have Lighthouse because it’s built right in! Open DevTools, go to the ‘Audits’ tab and hit the ‘Perform an audit’ button, and you’re given a list of the audits that Lighthouse can perform. We’re only going to run the accessibility audit, to save time.

Lighthouse is a bit more lenient than WAVE for this sample UI, only complaining about the alt and tabindex issues. It does show affected markup but doesn’t do any highlighting or jumping to the affected elements in devtools. It’s also the slowest test covered here.

Interestingly, Lighthouse uses axe-core (which we talk about next) for its accessibility audit, but I suspect it doesn’t run the full set of ~70 tests that the aXe extension does. I need to look into this a bit more…

aXe Browser Extension

I’m a big, big fan of aXe. The extension adds a new tab to Chrome’s DevTools with a big blue ‘Analyze’ button; once you hit that you’re shown a very nice list of issues (I’ve filtered to just show violations here, but there’s also a list of other things to review), each with really useful related information.

The highlight button does a much better job of showing the element related to the violation

Each issue shows the related markup clearly, and hitting ‘inspect node’ jumps right back into the Elements tab of DevTools with the element highlighted.

aXe rates the impact of a11y issues differently to WAVE: in this example the alt-text issue is critical, the tabindex issue is serious and the others are moderate. It’s worth noting that this presentation compels me to resolve all of the violations, because it tells me there are nine of them, rather than WAVE telling me there’s one error and nine alerts.

aXe also lists things for review, which don’t specifically cause a violation of accessibility guidance but may need to be considered based on the actual context of the element within the UI. The galaxy image behind our sign-in UI is actually a video, which doesn’t particularly need captions or an audio-description track, but it perhaps needs marking as purely presentational. I can’t do this yet because we use a third-party library to inject the <video> element, but it’s something for me to review in the future.
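If we did control that markup, marking the video as decorative might look something like this (attributes and filename are illustrative, since ours is injected by a third-party library):

```html
<!-- aria-hidden removes the purely decorative background video from the accessibility tree;
     tabindex="-1" keeps it out of the keyboard tab order -->
<video class="signin-background" autoplay muted loop playsinline
       aria-hidden="true" tabindex="-1">
  <source src="galaxy.mp4" type="video/mp4">
</video>
```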

Interestingly, aXe is smart enough to know that the blue box containing our form has opacity (0.9) with an image behind it. The colour contrast issues are flagged because the tool can’t guarantee that the background colour meets the required contrast level against the foreground text (it does, but it’s useful to be reminded to check).
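To see why the tool hedges, here’s a stripped-down, hypothetical version of that panel (the colours and class name are made up): the text colour is known, but the effective background depends on whatever shows through the translucent box.

```html
<!-- the computed contrast of the white text depends on what is visible behind the 0.9 opacity -->
<div class="signin-panel" style="opacity: 0.9; background-color: #204a87; color: #fff;">
  Sign in to Continuum
</div>
```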

WCAG Accessibility Audit Developer UI

It only checked four things, and passed them all. In the bin with you.

I uninstalled this one.

SiteImprove Accessibility Checker

SiteImprove is very popular in the waters in which we sail (local & central government, higher education, not-for-profit etc…) so it’s very useful to have a SiteImprove tool to verify the accessibility of our UIs.

This tool gives a lot of relevant information about issues, though I find its list a bit harder to scan: you need to click into an issue to see the detail for that issue, then back out to the main list to see the rest. I do prefer aXe’s master/detail view, but SiteImprove does the best it can within its single-column constraint.

Very similar results to aXe, although that aria-atomic is a new one! (review issues are filtered out)

Tenon.io

Tenon is different: it’s a web service you can use much like the W3C HTML Validator we all know and love, but for accessibility. Simply give it a link or paste in the markup of your UI and it’ll generate a report for you. There are multiple (paid) ways to integrate Tenon with your build tooling or CI servers, but that’s meat for another blog post.

It’s slower than the in-browser tests, and the main caveat when passing a URL to the web version is that your site/UI needs to be publicly reachable by Tenon. For now I’m just going to use ngrok to create a temporary public URL to my localhost and give that link to Tenon.

Pretty much the same results as the others, nicely presented with markup examples.

The scores on the accessible doors?

Barring one shambolic effort, all of the tools gave fairly consistent results for this (admittedly limited) UI test. Ranking them came down to how the information is presented and what each tool gives me to resolve the issues it raises. Realistically there’s no silver bullet, and we’ll likely use most if not all of these tools to sanity-check each UI, but the first one I reach for in nearly all cases is the aXe browser extension in Chrome. The information is clear and well organised, and the highlighting and DevTools integration make it the best of the tools I’ve tested so far. I’m also very interested in integrating axe-core or Tenon.io into our CI process in the near future, for more automated ongoing testing.
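As a first step towards that, axe-core can be dropped straight into a page and run from the console or a test harness. A minimal sketch might look like this (the CDN URL and version are assumptions, not what we actually use):

```html
<script src="https://cdnjs.cloudflare.com/ajax/libs/axe-core/4.8.2/axe.min.js"></script>
<script>
  // run the default axe-core rule set against the whole document
  axe.run().then(function (results) {
    // in CI these violations could be asserted against by a test runner
    console.log(results.violations);
  });
</script>
```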

So in terms of results, my personal order of usefulness is:

  1. aXe
  2. SiteImprove
  3. Tenon
  4. WAVE
  5. Lighthouse*

*I wouldn’t write off Lighthouse as irrelevant; as long as Google keeps adding to the audits it performs, it will continue to improve.

The fixes

So I know I have things to fix and, to be honest, I’m not going to bore you with the blow-by-blow. I’ll only use aXe during my initial pass and then see what the other tools tell me.

The fixes, in brief, involved:

  • Adding alt text to the ‘Continuum’ brandmark image
  • Adding a <main> wrapper around the content to act as an ARIA landmark region
  • Changing the ‘Pulsar’ text into an h1 heading (with CSS fixes to maintain its size)
  • Removing a bit of JavaScript that added incrementing tabindex values (1, 2, 3 etc.) in favour of simply 0 or -1
  • Adding id attributes to the ‘forgot password’ and ‘sign in’ forms to act as targets for the related skip links
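Putting those together, a rough sketch of the corrected structure might look like this (class names, ids and the image filename are illustrative; Continuum’s real markup differs):

```html
<main>
  <h1 class="signin-title">Pulsar</h1>
  <img class="signin-brand" src="continuum-brandmark.svg" alt="Continuum">

  <!-- ids give the skip links something to target; natural DOM order means
       no positive tabindex values are needed -->
  <form id="sign-in" action="/signin" method="post">
    <!-- … -->
  </form>

  <form id="forgot-password" action="/forgot-password" method="post">
    <!-- … -->
  </form>
</main>
```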

With those fixes in place, let’s take the UI for another spin through the testing tools.

aXe = No violations 🎉

WAVE = no errors, no alerts 🎉

Lighthouse = no errors 🎉

SiteImprove 💥

OK, so SiteImprove still isn’t happy…

First, let’s drill into the contrast issue. Earlier we saw how aXe couldn’t figure out the colour of our background because of the opacity of the container, but that’s not what SiteImprove is complaining about…

SiteImprove thinks that the background of all these elements is white. In all honesty I’m not sure why, but my first thought was that perhaps it’s ignoring the video, the fallback image isn’t loading, and what’s left is the background colour of the underlying body element, which is, of course, white. Setting an explicit colour on the container underneath the video keeps SiteImprove happy in this instance, and it also covers any situation where both the video and the fallback image fail to load, keeping the white text visible. A worthwhile fix.
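The fix itself is a one-liner; something along these lines (the class name and colour are illustrative):

```html
<style>
  /* explicit dark background so the white text stays readable even if both
     the video and the fallback image fail to load */
  .signin-backdrop {
    background-color: #101a2c;
  }
</style>

<div class="signin-backdrop">
  <!-- the third-party library injects the <video> element in here -->
</div>
```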

Next up, bypass blocks.

SiteImprove wants me to add a top-level skip-to-content link, which we could do with across our entire UI, so that’s an easy one to check off.
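A skip link along these lines would satisfy that check (the class name and target id are illustrative):

```html
<body>
  <!-- usually visually hidden until focused, so keyboard users see it first -->
  <a class="skip-link" href="#main-content">Skip to main content</a>

  <main id="main-content">
    <!-- … -->
  </main>
</body>
```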

The ‘Enter your username and password’ alert is an element with role=”alert”

Lastly, that aria-atomic thing: we use an aria-live region with role="alert" (the ‘Pulsar’ text), which changes to display warnings or errors as the user attempts to sign in; any change to the content here is announced by screen readers.

Adding aria-atomic="true" makes sure the screen reader announces the whole region if any part of it changes. We don’t actually need it in this case, but SiteImprove doesn’t know we’ll always update the whole region, so it’s good to reassure the tool and to guarantee the behaviour in screen readers.
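The resulting live region looks something like this (a simplified sketch, not the exact Continuum markup):

```html
<!-- role="alert" makes this a live region; aria-atomic="true" tells assistive
     technology to announce the whole region whenever any part of it changes -->
<div id="signin-status" role="alert" aria-atomic="true">
  Enter your username and password
</div>
```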

So a few more quick tweaks and we’re done! SiteImprove is happy!

Tenon.io 💥

Lastly, a few more nitpicky bits from Tenon…

The first two relate to field labels not being unique. Both our sign-in and forgotten-password forms ask for a username, and Tenon gives us the correct advice about wrapping them in a fieldset; however, after doing so it still flags this as an error at the time of writing. I believe this is a bug and have fed it back to Tenon support.

*Update* Since publishing, Tenon have confirmed this is a bug and will fix it in an upcoming release.
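For reference, the kind of grouping Tenon recommends looks roughly like this (ids, names and legend text are illustrative):

```html
<form id="sign-in" action="/signin" method="post">
  <fieldset>
    <!-- the legend distinguishes this "Username" label from the one below -->
    <legend>Sign in</legend>
    <label for="signin-username">Username</label>
    <input type="text" id="signin-username" name="username">
  </fieldset>
</form>

<form id="forgot-password" action="/forgot-password" method="post">
  <fieldset>
    <legend>Forgotten password</legend>
    <label for="forgot-username">Username</label>
    <input type="text" id="forgot-username" name="username">
  </fieldset>
</form>
```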

The last one complains about href="#" in some example code for this UI. Tenon complains loudly about absolutely every instance of a hash symbol used as a href, so if you want to test prototype, sample or demo code you’ll find yourself mopping these up all over the place. If you’re using <a href="#">something like this</a> as a trigger for JavaScript behaviour, you should consider using a <button> instead, or if it absolutely has to be a link then it probably needs a real destination as a fallback for when the JavaScript doesn’t load, rather than an empty hash. There’s never really a good reason to ship href="#" in production code, so it’s good that Tenon catches this.
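In other words, something along these lines (class names and the target id are illustrative):

```html
<!-- prefer a real button for behaviour that only exists in JavaScript -->
<button type="button" class="js-toggle-details">Show details</button>

<!-- if it must be a link, give it a real destination as a no-JS fallback -->
<a class="js-toggle-details" href="#details">Show details</a>
```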

Findings

After experimenting with this range of browser-based accessibility testing tools, the most important takeaway is that no single tool gave me the complete list of issues found by the others; you really do have to test with multiple tools.

I’ve pretty much adopted the aXe browser extension as my primary testing tool; once I’ve resolved any issues there I move on to SiteImprove, Tenon.io and then WAVE. This clears up a great deal of accessibility issues in our user interfaces and clears the decks for us to perform device testing with assistive technologies such as screen readers. More on that soon.

And once more, for those skimming to the end of the article to find the answer sheet, my personal order of usefulness is:

  1. aXe
  2. SiteImprove
  3. Tenon
  4. WAVE
  5. Lighthouse

If there’s any other tools or methods you use to good effect, please do let me know!
