12 Issues Automated Web Accessibility Checkers Can’t Detect

Colleen Gratzer
Published in Design Domination
Sep 10, 2021

If you’re relying on automated web accessibility tools, you’re doing accessibility wrong. Just because a website passes an automated checker doesn’t mean it’s accessible. It could be filled with major accessibility errors. Find out 12 issues automated web accessibility checkers can’t detect.

I’m here to preach: If you’re relying on automated web accessibility tools to test the accessibility of your websites, you’re approaching accessibility all wrong! You’re putting your clients — and yourself — at risk.

Stick around to find out 12 issues automated web accessibility checkers can’t find.

What an Automated Web Accessibility Checker or Web Accessibility Testing Tool Is

First, let me get into what an automated web accessibility checker is. It might also be referred to as a web accessibility tester or testing tool.

It’s a website, browser extension, plugin or other tool that scans a website for accessibility issues. These tools check for certain issues such as:

  • insufficient color contrast,
  • the presence of Alt-text,
  • the presence of form field labels,
  • skipped heading levels, and
  • empty hyperlinks and buttons.


The Problem With Automated Web Accessibility Tools

The problem with automated web accessibility tools isn’t the tools themselves.

The problem is that many web designers and developers know little or nothing about accessibility, yet they rely solely on these automated tools to test the accessibility of their websites.

If you’re one of them and you think that simply addressing the issues found by automated checkers puts you in the clear, think again. You’re not!

First of all, automated checkers have limitations. They can only detect about 25% to 30% of accessibility issues. The remaining 70% to 75% require manual checking by a person.

If a website passes an automated checker, it doesn’t mean the site is accessible.

In fact, a website can pass one or more automated checkers and still be an inaccessible website filled with major accessibility issues. Not all checkers find the same errors, but, again, they can only detect a small percentage of potential issues.

I’ll tell you: I recently did a website accessibility audit where an automated checker found only two accessibility errors outside of contrast errors. My report after doing the audit was 140 pages long. There were a few introductory pages in the report, as well as some explanations of errors that took up more than one page, but for the most part, there was one error listed per page. So there were well over a hundred errors.

Automated accessibility testers can also give false positives: you might think you need to fix something that doesn’t actually need fixing. In some cases, that could mean reworking your design unnecessarily, which costs you and your client time and money.

Automated accessibility testing tools also can’t tell you if certain things were done properly.

Errors Automated Web Accessibility Tools Can’t Detect

Let’s take a look at some of these accessibility errors automated tools cannot detect. This is by no means an all-inclusive list, just a small one, but these issues greatly impact how a user with a disability will experience the site, if they’re able to at all.

1. Poor Alt-text

For instance, an automated checker can tell if Alt-text is present on an image. But it cannot tell if that text is accurate, descriptive enough, too long or if it should be empty instead.

For example, the Alt-text could just be the filename. Does the image have Alt-text? Yeah. But it’s not descriptive.

Maybe the Alt-text is completely wrong, and it says “Banana” instead of “Apple.”

Maybe it says “Apple” but should be much more descriptive. It depends on how it’s being used.

Maybe there’s an infographic on the page whose Alt-text only says what the infographic is about. That may or may not be OK. In most cases, it’s probably not acceptable, but it depends on what other content is or isn’t present on the page.

On the flip side, a decorative image that shouldn’t have Alt-text might have it. I see this from time to time. Someone’s included Alt-text that describes a decorative border, for example.
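
To make the limitation concrete, here’s a minimal sketch, in TypeScript purely for illustration, of the kind of presence-only check an automated tool performs. The function is my own and doesn’t come from any specific checker.

```typescript
// Minimal sketch of a presence-only Alt-text check (illustrative only).
function checkAltPresence(doc: Document): void {
  doc.querySelectorAll("img").forEach((img) => {
    if (!img.hasAttribute("alt")) {
      // This is the kind of error an automated checker catches.
      console.warn("Missing alt attribute:", img.src);
    }
    // But "IMG_4032.jpg", "Banana" on a photo of an apple, or a description
    // of a decorative border would all pass this test. Only a person can
    // judge whether the Alt-text conveys the image's meaning.
  });
}
```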

Alt-text has been the focus of a number of website accessibility lawsuits.

2. Poor Page Titles, Headings and Hyperlink Text

Like with Alt-text, an automated checker can tell if a page title or hyperlink has content. But it cannot tell if a page title or hyperlink uses appropriate or meaningful text. The same goes for headings.

As any user navigates a website and its individual pages, whether they’re sighted (with or without a disability) or not sighted at all, the accuracy and descriptiveness of page titles, headings and hyperlink text matter.
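
As a small illustration, here’s a TypeScript helper I sketched (it’s not a feature of any checker) that dumps every link’s text and destination so a person can scan for vague labels like “click here” or “read more,” which pass an automated check but tell the reader nothing.

```typescript
// Sketch of a manual-review aid: list each link's text and destination so a
// person can judge whether the text is meaningful on its own.
function logLinkText(doc: Document): void {
  doc.querySelectorAll("a[href]").forEach((link) => {
    const text = link.textContent?.trim() || "(no visible text)";
    console.log(`"${text}" -> ${link.getAttribute("href")}`);
  });
}
```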

3. Improper Use of Tags

An automated checker cannot tell if the proper HTML tag was used for an element.

For instance, maybe paragraph tags were used for what should be a list of items.

Maybe what should be body text is a heading tag instead. Maybe what should be a heading is instead tagged as body text. Maybe a table was used for layout purposes.

All of these require review by a person.
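
For example, both fragments in this sketch (TypeScript DOM calls, used here only for illustration) produce something that looks like a list, but only the second exposes list semantics to a screen reader, and no automated rule can know which one the author actually meant.

```typescript
// Paragraphs styled to look like a list: a screen reader announces three
// unrelated paragraphs and gives no "list, 3 items" context.
const fakeList = document.createElement("div");
fakeList.innerHTML = "<p>- Apples</p><p>- Bananas</p><p>- Cherries</p>";

// A real list: assistive technology announces it as a list of three items
// and lets the user jump between them.
const realList = document.createElement("ul");
["Apples", "Bananas", "Cherries"].forEach((item) => {
  const li = document.createElement("li");
  li.textContent = item;
  realList.appendChild(li);
});

document.body.append(fakeList, realList);
```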

4. Incorrect Reading Order

Correct reading order is another issue an automated checker cannot detect.

The order in which a sighted user reads something on a page is not necessarily how a user of assistive technology experiences it.

I’ve done website audits where the order for a screen reader user went from the H1 at the top of the page to supplementary text in the right-hand column, then to the main text in the left-hand column and then to a sidebar.

The way this read did not make sense to someone who cannot see the page or where things appear on it.

I’ve also seen cases where a page was read in columns, as you would expect, but this still didn’t make for a good experience for a screen reader user: they were getting important content too late on the page.
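
One low-tech way to review this by hand, sketched below in TypeScript (the helper is my own, not part of any tool), is to list headings and landmarks in DOM order, which is roughly the order a screen reader will encounter them, and then judge whether that order actually makes sense.

```typescript
// Print headings and landmark regions in DOM order. CSS (floats, flexbox
// `order`, grid placement) can make the visual order differ from this, and
// that mismatch is exactly what a person has to evaluate.
function logReadingOrder(doc: Document): void {
  const selector = "h1, h2, h3, h4, h5, h6, main, nav, aside, footer";
  doc.querySelectorAll(selector).forEach((el, index) => {
    const preview = el.textContent?.trim().slice(0, 40) ?? "";
    console.log(`${index + 1}. <${el.tagName.toLowerCase()}> ${preview}`);
  });
}
```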

5. Improperly Hidden Content

Automated checkers cannot make determinations about hidden content: what should or shouldn’t be hidden from all users, and what should or shouldn’t be hidden from assistive technology only. Usually, this is done with ARIA or CSS, and it’s often done incorrectly.
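
Here’s one pattern worth reviewing by hand, sketched in TypeScript: content removed from assistive technology with aria-hidden="true" that keyboard users can still reach. Some checkers flag pieces of this, but none of them can tell you whether the content should have been hidden in the first place.

```typescript
// Find elements hidden from assistive technology that are still focusable.
// A sighted keyboard user can tab into them; a screen reader user hears
// nothing, which is confusing either way.
function findFocusableAriaHidden(doc: Document): Element[] {
  const focusable = "a[href], button, input, select, textarea, [tabindex]";
  return Array.from(doc.querySelectorAll('[aria-hidden="true"]')).filter(
    (el) => el.matches(focusable) || el.querySelector(focusable) !== null
  );
}
```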

6. Unreadable Content

This has several meanings. The content may not be at the proper reading level for the audience.

There’s also the issue of content that becomes unreadable when the sighted user changes the settings for their text size or text spacing, for example.
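
One hands-on way to test the spacing part of this, sketched below, is to inject the override values from WCAG 2.1 success criterion 1.4.12 (Text Spacing) and then look for text that gets cut off or overlaps. The values come from the success criterion; the helper itself is just my own illustration.

```typescript
// Apply the WCAG 1.4.12 text spacing overrides so a person can check that
// nothing is clipped, overlapped or hidden.
function applyTextSpacingTest(doc: Document): void {
  const style = doc.createElement("style");
  style.textContent = `
    * {
      line-height: 1.5 !important;
      letter-spacing: 0.12em !important;
      word-spacing: 0.16em !important;
    }
    p { margin-bottom: 2em !important; }
  `;
  doc.head.appendChild(style);
}
```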

There’s also the contrast of meaningful information that appears within images, which automated checkers can’t evaluate.

7. Lack of or Poor Keyboard Accessibility

Keyboard accessibility is another issue. This is huge!

Sometimes developers add code that prevents keyboard users from getting around a site or reaching certain elements on a web page. Or elements might receive keyboard focus in the wrong order.
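
A common example, sketched here in TypeScript for illustration (submitForm is just a placeholder handler), is a clickable <div> standing in for a button: a mouse user can operate it, but a keyboard user can never reach or activate it.

```typescript
// A div "button": not focusable, not announced as a button, and the click
// handler never fires from the keyboard.
const fakeButton = document.createElement("div");
fakeButton.textContent = "Submit";
fakeButton.addEventListener("click", submitForm);

// A real button: focusable by default and activated with Enter and Space.
const realButton = document.createElement("button");
realButton.textContent = "Submit";
realButton.addEventListener("click", submitForm);

// Placeholder handler for this sketch.
function submitForm(): void {
  console.log("Form submitted");
}
```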

8. Poor Form Labels and Instructions

When it comes to forms, automated accessibility checkers can tell if a form field has labels such as “First name,” “Last name,” “Email” and so forth. But they cannot tell if those labels are clear and helpful to the person filling out the form.

They also cannot tell if clear instructions have been provided to assist the user in filling out the form.
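
Here’s a sketch of that distinction (TypeScript DOM calls; the IDs and wording are made up for the example): a checker can confirm that the label and the hint are programmatically associated with the field, but only a person can say whether “Name” and the hint text are actually clear and helpful.

```typescript
// A labelled field with an associated instruction. Automated tools can verify
// that the associations exist; they can't judge whether the wording helps.
const label = document.createElement("label");
label.htmlFor = "billing-name";
label.textContent = "Name";

const hint = document.createElement("p");
hint.id = "billing-name-hint";
hint.textContent = "Enter your name exactly as it appears on your card.";

const input = document.createElement("input");
input.id = "billing-name";
input.setAttribute("aria-describedby", "billing-name-hint");

document.body.append(label, input, hint);
```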

9. Poor Form Error Messages

Likewise, they cannot tell if form error messages are clear and easy to understand. Oftentimes, they are vague as to what the user needs to do to correct the error.

Sometimes, it’s not clear to any user — sighted users or users of screen readers — which field even has an error.
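
For what it’s worth, here’s one common way to wire up an error message so that every user can tell which field failed and why. The function and wording are my own sketch, and whether the message text is actually understandable is still a human call.

```typescript
// Attach an error message to a field, expose it to assistive technology and
// move focus to the problem. The wording in `message` still needs a human eye.
function showFieldError(input: HTMLInputElement, message: string): void {
  const errorId = `${input.id}-error`;
  let error = document.getElementById(errorId);
  if (!error) {
    error = document.createElement("p");
    error.id = errorId;
    input.insertAdjacentElement("afterend", error);
  }
  error.textContent = message; // e.g. "Enter the date as DD/MM/YYYY"
  input.setAttribute("aria-invalid", "true");
  input.setAttribute("aria-describedby", errorId);
  input.focus();
}
```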

10. Lack of and Poor Quality of Captions and Transcripts

When it comes to audio, some automated tools might be able to detect if the word “transcript” is on the page but not if a transcript is really present. And then the question is: is the transcript accurate?

With video, automated tools cannot detect whether captions, audio description and/or a transcript are needed. It depends on the content of the video file and other content that might be on the page. Again, though, if those are present, are they accurate?
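
As a rough sketch (the caption file name is hypothetical), captions are attached to HTML video with a track element. A tool can see that the element exists; it can’t tell whether the captions are accurate or whether audio description is also needed.

```typescript
// Attach an English caption track to the first video on the page. Presence is
// machine-checkable; accuracy and completeness are not.
const video = document.querySelector("video");
if (video) {
  const track = document.createElement("track");
  track.kind = "captions";
  track.srclang = "en";
  track.label = "English";
  track.src = "captions.vtt"; // hypothetical file for this example
  video.appendChild(track);
}
```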

11. Insufficient Accessibility of Downloadable Files

Automated checkers do not check if linked files are accessible or not. By linked files, I mean PDFs, Word documents, PowerPoint files and so forth.

For PDFs, Acrobat’s built-in accessibility checker can detect some errors. Even so, that checker can only detect about 25% to 30% of potential issues.

12. Poor Usability

It’s important to understand that the purpose of accessibility isn’t to appease a checker. It’s to ensure a good user experience.

Automated accessibility testers don’t test for usability. That must be done by a human.

Why Using Automated Accessibility Tools to Test Web Accessibility Isn’t Enough

Relying on automated checkers provides a false sense of security. They are meant to assist, not replace, the knowledge you need to have in accessibility.

That knowledge also helps you determine whether a checker is right or wrong about the issues it does find.

Not only can automated accessibility checkers not check for every potential accessibility issue, but you could also hypothetically check off a list of WCAG guidelines and still not end up with an accessible site.

Manual checks and tests are a must! If you do otherwise, you’re not even half-assing it. You’re quarter- or third-assing it.

Effective accessibility testing involves a combination of:

  • using automated checkers by someone who understands accessibility and knows what to address and what to ignore,
  • manual checking by someone who understands accessibility,
  • testing with assistive technology, and
  • user testing, whenever possible.

Find out other mistakes you might be making when building accessible websites. Get my free guide, 5 Mistakes Developers Make When Building Accessible Websites.

Originally published as a podcast and transcript at https://creative-boost.com/automated-web-accessibility-checkers/


Colleen Gratzer
Design Domination

Host of the Design Domination podcast, mentor to designers and accessibility teacher at Creative Boost