Can we use web analytics to improve accessibility?

How do we know that our web site actually works the way we want? We use web analytics. Web analytics tells us not just how many users we have, but also how they behave: whether they use the site the way we expect, whether any pages seem hard to use, and whether the changes we make actually make things better. I wanted to use web analytics to see if the site also works for people who use assistive technologies. That turned out to be harder than expected. Why?

This article is about the possibilities of, and concerns about, using web analytics to detect accessibility flaws on the web. Why is it so difficult, what are we missing because of this, and is it at all possible? Finally, I will present the findings from some experiments I have run.

Why is it difficult?

The web log gives us a lot of technical information about the users, such as the device they are using, which browser and version, and so on. But it doesn’t tell us whether the user is using a screen reader. We know how the site works for someone who is still using old versions of Internet Explorer, but we don’t know if it works just as well for people who can’t see the content as for people who can.

This might seem pretty strange. Why shouldn’t people with disabilities have a say when we try to improve our site? There are numerous reasons for this, but I will present what I find to be the two main ones.

They don’t want to be treated differently

Some websites have a separate set of accessible pages in addition to the cool and fancy one everyone else is using, and by detecting assistive technologies they can direct those users there automatically. But these users don’t want to be treated as second-rate users. They want to be where everybody else is!

I think that’s a bad idea. That’s twice as many pages to maintain, and my bet is that one of the versions will fall behind when new features are added. We all know which version that will be.

They worry about their privacy

…just like most of us these days. Information about users has become business, and that makes people worried. People with disabilities have even more reason to be cautious about sharing this kind of information. Because if you have a disability and apply for a job online, and the employer is able to see that, how would that information affect your chances of being called in for an interview?

If you somehow try to single out these users, use the data only to improve their user experience. And never use it in a way that can be turned against them.

The lost benefits

But why do we want to do this? What are the lost benefits? If you seriously want to include all users, you shouldn’t just look at the majority; you should separate them into groups. You can easily split them into desktop users and smartphone users, because they are likely to use the site differently. But it could also be interesting to have separate groups of users with disabilities. If you deploy an update on your site, measure the response, and the majority looks super happy, it’s easy to miss that this small group is not. They simply drown in the crowd. If we could isolate them as one group, it would be easier to see how the site works for them as well.

Are there other ways?

So we know that assistive technologies (AT) don’t leave any traces in the web log. Are there other ways we can tell whether AT is used? I will present two possible ways of doing this. However, I’ve only tested one of them.

I have also rated each of these methods on ethics, accuracy, workload and volume, using a smiley, a sceptical face and a sad face.

Getting the users’ consent

The first option is simply to ask them. This could be part of the users’ privacy settings, where we nicely ask if they have a disability that affects their web experience, and if it’s OK to use their traffic data to improve it. Then we can ask what kind of disability they have and their age. But we should also ask if they would be willing to visit us for a user test, because these people are hard to find, and we need a wider spectrum of participants for our user tests.
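As a sketch, such an opt-in could gate all accessibility-related segmentation in one place. The field names below (`consentToAccessibilityAnalytics` and so on) are hypothetical, not taken from any real framework:

```javascript
// Sketch: only segment users who have explicitly opted in via their
// privacy settings. All property names here are hypothetical.
function accessibilitySegment(settings) {
  // No consent: treat the user like everyone else and record nothing extra.
  if (!settings.consentToAccessibilityAnalytics) {
    return null;
  }
  return {
    disability: settings.disabilityType || 'unspecified',
    age: settings.age || null,
    openToUserTesting: Boolean(settings.openToUserTesting),
  };
}
```

An analytics call would then attach this segment only when it is non-null, so users who never opted in stay in the general pool.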

So how do I score the different considerations for this approach? The ethics should be a pure smiley: we only watch those who actively approve our use of their data. Accuracy should also be good: they tell us what their disabilities are, so we won’t have to guess. The work effort might be a bit high, but if this can be an extension of the existing GDPR framework, it should be bearable. For volume, however, I’m more pessimistic. This is a small group to begin with, and people usually don’t swarm to sign up for this kind of thing. If we want sufficient volume, we need to actively recruit people, not just put a checkbox on a page.

But this is a method I haven’t tested, so I may be wrong. The exception is age: that information actually was available.

Recognizing AT users by behaviour

The other approach is to try to recognize them by looking for typical behaviour. This is quite a difficult and not very accurate method, so it won’t give us exact data, but perhaps data good enough to spot differences between the user groups. Such behaviour could be scaling a web page, or, the one I’ve tested, focus events on skip links.

Our skip links are only visible when they have focus, which means they are never visible to users with a mouse or a touch device, but pop up for users who use a screen reader or navigate with the keyboard. They can use these links to skip headings and navigation, and go straight to the main content.

Logging this event is quite simple. You add an event listener to the link’s focus and click events, and send that event to your web analytics tool. Then you go to your web analytics reports and see whether people who use this link, for example, have lower click rates than other users.
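A minimal sketch of that wiring could look like this. Here `trackEvent` is a stand-in for whatever your analytics framework provides (a wrapper around e.g. Google Analytics or Matomo), and `a.skip-link` is an assumed selector for the skip links:

```javascript
// Stand-in for a real analytics call; replace with your own framework's
// event-tracking function.
function trackEvent(category, action) {
  console.log(`analytics: ${category} / ${action}`);
}

// Attach focus and click tracking to every skip link passed in.
function instrumentSkipLinks(links, track) {
  for (const link of links) {
    link.addEventListener('focus', () => track('skip-link', 'focus'));
    link.addEventListener('click', () => track('skip-link', 'click'));
  }
}

// Wire up in a browser; guarded so the sketch also loads outside the DOM.
if (typeof document !== 'undefined') {
  instrumentSkipLinks(document.querySelectorAll('a.skip-link'), trackEvent);
}
```

In the reports, the focus events then become the marker that lets you split visits into the two groups.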

So are there any problems with this approach? The ethics are a bit dubious. In web analytics all users are anonymous, but we are talking about small groups, and they do not necessarily know that we are watching them, even though that should be mentioned in the privacy declaration, which they don’t read anyway. Accuracy is also limited. We know that they somehow touch the skip links, but we don’t know why. The work effort, however, is quite low, provided that you already have a web analytics framework that allows event tracking. I also put a smiley on volume, even though the group is really small: we reach the people we want to reach.

Those were the two methods I wanted to mention, but there is actually one more. If you don’t want to create this yourself, you can purchase a solution. A company called Level Access has developed a tool named Access Analytics.

Findings

Let’s have a look at what this script is actually logging. This report shows key data for skip link users and for people aged between 80 and 90, compared to everyone else. It tells us that visits with skip link focus are just a tiny fraction of the total: only 0.15%. Still, the absolute number is quite high: 129,000. But the number of people who click them is just half a percent of that again. That number is too low to provide any reliable reports, so I won’t pay much attention to clicks, only focus events.

A table showing that 0.15% of the visits show a skip link, and that 0.5% of those click them.

This graph shows how many page views the different groups have per visit. We can see that older users differ very little from the majority, between 12 and 16 pages per visit. But the skip link users have far fewer page views, just 6.

A graph showing the page views per visit for different groups of users.

If we look at the time spent per visit, we see the same trend. We also see that older users spend a little more time than the rest, which may be expected.

Here are some buttons for different features for editing your own ads, which turned out not to be accessible by keyboard. Instead of fixing the bug right away, I set up an A/B test to measure whether the fixed version got more clicks than the buggy one.

Screenshot from a web page with 4 buttons. 3 of them are marked with a cross because they are inaccessible.
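The underlying bug here is a common one: a clickable element that never receives keyboard focus and doesn’t react to key presses. A sketch of the kind of fix involved, assuming the buttons were custom elements rather than native `<button>`s:

```javascript
// Hypothetical fix sketch: make a click-only "button" (e.g. a styled <div>)
// reachable and operable by keyboard.
function makeKeyboardAccessible(el) {
  el.tabIndex = 0;                    // put it in the tab order
  el.setAttribute('role', 'button');  // announce it as a button to AT
  el.addEventListener('keydown', (event) => {
    // Native buttons activate on Enter and Space; mirror that behaviour.
    if (event.key === 'Enter' || event.key === ' ') {
      event.preventDefault();
      el.click();
    }
  });
}
```

Using a native `<button>` element gives you all of this for free, which is usually the better fix.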

But I couldn’t see any improvement in the number of clicks in the fixed version. Could it be that the users who couldn’t use the buttons simply drowned in the masses?

A table showing that there is little difference in click ratio between inaccessible and accessible buttons.

Then I ran the same report with skip link users only, and here we can see a remarkable change. Don’t look at the one at the top; that one was accessible to begin with. But look at the second one: here the click rate per visit increased from 19 to 29 percent, and the gaps for the other buttons are even bigger.

A table that shows that the difference in click ratio is bigger when you only look at users who see the skip links.

This shows how web analytics can help us make our web site accessible, if we are able to find these users in the statistics. If we had been measuring these buttons from the start, with some kind of alert for when the gap between user groups becomes too big, we would have discovered this bug right away.

Whether or not you are able to study this group in isolation, you should still use web analytics for what it’s worth. The next time you find yourself in a debate about whether a feature should be accessible or not, implement both designs and run an A/B test. And bust the myth once and for all that universal design doesn’t appeal to all users.

That’s all I had. I hope I have shed some light on the dilemma of using web analytics on people with disabilities. I know there are different opinions on whether we should do this at all, and I won’t tell you what to think. This isn’t rocket science, so anyone can do it. But if you do, do it responsibly.
