When your team first begins to work on the accessibility of your site, you might start with automated accessibility checkers like axe, the Firefox accessibility inspector, Lighthouse, SiteImprove, or WAVE. These tools can be solid starting points for finding quick fixes and identifying problems if you’ve never looked at improving accessibility on your site, or if it’s been a long time since you last audited it.
However, accessibility checkers have a significant gap: they focus on code and strict technical compliance rather than on usability, design, and content. They can’t mimic how real people actually understand and interact with your website.
Here are some areas where automated tools will let you down.
Even for strictly technical items, these tools aren’t enough to guarantee compliance. Many WCAG success criteria can only be evaluated manually, or are left to the judgement of an expert. Whether focus states work as intended, or whether a carousel is accessible, is difficult or impossible for an automated tool to evaluate because of the complexity involved.
Although you can meet many accessibility objectives with automated tools, a full manual review will take you much further. Through a manual review, you can build a site that can be enjoyed by a much wider audience than just those whose needs are covered by compliance standards.
They can be inaccurate
Some of the tests run by automated accessibility tools can be inaccurate or produce false positives, which may make you think your site is more accessible than it really is. Our own site scored a perfect 100 on Google Lighthouse well before it was genuinely accessible, especially for people using screen readers.
They can’t make choices
Many accessibility decisions are, even in WCAG’s official documentation, left to the discretion of the designer or developer. There are often several valid ways to do the same thing, such as navigating a menu with a keyboard.
On our site, the tab key moves between each of the top-level menu items, and the arrow keys access the sub-menus. However, WCAG also provides examples where the tab key reaches only the first menu item, and the arrow keys move left and right between the others.
Both approaches are technically accessible, but one or the other might make your site or specific design more or less usable, and an automated checker can’t help you choose between them.
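As a minimal sketch of the first pattern described above (tab moves between top-level items, arrow keys open and enter the sub-menu), here is one hypothetical way it could be marked up and wired together. The structure, class-free markup, and menu names are illustrative, not taken from the original site:

```html
<nav aria-label="Main">
  <ul>
    <li>
      <!-- Each top-level button stays in the natural tab order. -->
      <button aria-expanded="false" aria-controls="products-menu">Products</button>
      <ul id="products-menu" hidden>
        <li><a href="/widgets">Widgets</a></li>
        <li><a href="/gadgets">Gadgets</a></li>
      </ul>
    </li>
  </ul>
</nav>
<script>
  // Sketch: ArrowDown on a top-level item opens its sub-menu and moves
  // focus to the first link inside it.
  document.querySelectorAll('nav [aria-expanded]').forEach((btn) => {
    btn.addEventListener('keydown', (e) => {
      if (e.key === 'ArrowDown') {
        btn.setAttribute('aria-expanded', 'true');
        const menu = document.getElementById(btn.getAttribute('aria-controls'));
        menu.hidden = false;
        menu.querySelector('a').focus();
        e.preventDefault();
      }
    });
  });
</script>
```

A real implementation would also handle ArrowUp, Escape to close, and closing the menu when focus leaves it; the point is that an automated checker can’t tell you whether this pattern or the alternative is the better fit for your design.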
Another example is adding labels that supplement what screen readers announce, such as giving a banner landmark a descriptive page header label so it’s read out more clearly.
Again, an automated checker can’t make this judgement call for you. You’ll need real users to test which option is more intuitive, or an expert to evaluate your design.
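For instance, labelling a banner landmark might look like the following. This is a hypothetical fragment; the label text is an assumption, and whether a label like this helps or adds noise is exactly the kind of call an automated tool can’t make:

```html
<!-- Without a label, most screen readers announce this region only as
     "banner". An aria-label makes the announcement more descriptive. -->
<header aria-label="Say Yeah! site header">
  ...
</header>
```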
They can’t see
Another area where accessibility checkers often fall short is understanding the images and graphics on your site. If you run automated checkers often, you might notice that warnings about text in images or unclear alt text never disappear, no matter what you do.
This gap is because automated checkers can’t see your images and designs to determine whether you’ve met these requirements or not. In essence, they can’t know the context and content to make this judgement call.
So, instead, a manual review of images is necessary to confirm that none of your images contain text, and that each image’s alt text describes its content in a way that provides proper context.
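To illustrate the gap, both of the hypothetical images below would pass a typical automated “has alt text” check, but only the second gives screen reader users useful context (the file name and description here are invented for the example):

```html
<img src="chart.png" alt="chart">
<!-- Passes the automated check, but tells the user almost nothing. -->

<img src="chart.png" alt="Bar chart showing monthly site visits rising steadily through 2020.">
<!-- Also passes, and actually conveys the image's content. -->
```

Only a human can tell which of the two is doing its job.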
Additionally, an accessibility checker can’t tell you how screen readers are reading out your content in real life beyond a simple “is it reading something” check.
For instance, if an inline SVG isn’t set up for screen readers with a role="img" attribute, the screen reader may try to announce the markup inside it, reading out “image” repetitively. That’s a terrible experience for people who use screen readers, and it’s unlikely an automated checker would tell you it’s happening.
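A minimal sketch of the fix described above, with an invented graphic and label, might look like this:

```html
<!-- role="img" tells screen readers to treat the SVG as a single image
     rather than walking its internal elements; the aria-label supplies
     what to announce. -->
<svg role="img" aria-label="Company logo" viewBox="0 0 24 24">
  <circle cx="12" cy="12" r="10" />
</svg>

<!-- Purely decorative SVGs can instead be hidden from screen readers. -->
<svg aria-hidden="true" focusable="false" viewBox="0 0 24 24">
  <circle cx="12" cy="12" r="10" />
</svg>
```

Whether a given SVG is meaningful or decorative is, again, a judgement only a person can make.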
They can’t read
As far as the written content on your site, this is where automated accessibility checkers are the most limited.
Automated accessibility tools don’t evaluate your content through an inclusive lens, which can leave your site alienating potential users or inadvertently excluding people. No automated tool can determine how your site makes users feel, or whether your content or tone is pushing them away.
The words and images on your website affect how users feel, and an accessibility checker can’t analyze them. Together they carry context that can leave users anywhere on the spectrum from delighted to offended, and a purely automated tool can never understand that.
A more suitable process
So, if automated accessibility testing tools won’t get your site to where it needs to be, what should you do?
The best process is a comprehensive design, code, and content process, where you and your team work to ensure your site is as usable, accessible, and inclusive as possible. Then plan to review, improve, and test these items continually (internally or, preferably, with users). Using this process will help you holistically improve not just your site’s accessibility, but the general usability and performance of your website.
Most importantly, moving beyond automated code checks, and embracing design, interaction, and usability considerations alongside inclusive content, will open up your site to more potential users and help convert more of the users who are already visiting.
Go beyond automated accessibility checkers with the Essential Website Audit
Make your website more usable, accessible, and inclusive for all with the Essential Website Audit. Don’t miss out on expanding your reach and engagement!
Originally published at https://sayyeah.com on May 6, 2020.