Gambling Commission Accessibility

What we’ve learnt from a recent accessibility audit of our new website

Read about how we’re incorporating accessibility into our website from the outset, our recent accessibility audit and how we’re learning from this.

--

By Andy Williams-Jones
Service and Interaction Design Lead at the Gambling Commission
@aw_jones

Since the start of 2020, 6 January to be exact, a small team of people have been working away to design, build and populate a new website for the Gambling Commission. We’ve been documenting progress in our project updates.

It’s currently in private beta and being tested with small groups of users and members of the public.

However, I wanted to use this post to talk about how we have worked to create an accessible website from the start, and the outcome of the accessibility audit carried out for us by the Digital Accessibility Centre (DAC) in June.

Line drawings of people with different abilities
Image courtesy of the Inclusive design team at Microsoft Design

Our new website

We have built the new website entirely in-house and from scratch. It’s been built using Node.js and makes use of the GOV Frontend framework.

On top of this framework, we have designed and built custom styles, templates and components. This lets us design pages and features that work for us and our users and still have the core GOV Frontend at our disposal.

We have avoided making any significant changes to the core GOV Frontend framework as this will help us upgrade easily and take advantage of future changes in the framework without significant impact on our website.
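
As a rough sketch of what that separation can look like in practice, a Node.js app can load the framework’s templates alongside a separate folder of custom components. The post only says the site uses Node.js and the framework, so Express, Nunjucks, the paths, port and template name below are assumptions for illustration, not our actual configuration:

```javascript
// app.js - minimal sketch of serving GOV.UK Frontend templates from Node.js.
// Express, Nunjucks and all paths here are illustrative assumptions.
const express = require('express');
const nunjucks = require('nunjucks');

const app = express();

// Make both the framework's templates and our own custom components
// available to the template engine. Custom components live in their own
// folder and extend, rather than modify, the framework.
nunjucks.configure(
  ['node_modules/govuk-frontend', 'views', 'views/components'],
  { autoescape: true, express: app }
);

// Serve the framework's compiled assets (CSS, JS, fonts) unchanged so that
// upgrading the framework stays a simple dependency bump.
app.use('/assets', express.static('node_modules/govuk-frontend/govuk/assets'));

app.get('/', (req, res) => {
  res.render('home.njk', { pageTitle: 'Home' });
});

app.listen(3000);
```

Because the framework’s files are served unchanged, upgrading it is largely a dependency update rather than a merge of modified framework code.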

Where we needed features or user interface (UI) elements that weren’t in the framework but that we wanted to provide to users, we created our own components.

For example: here is our vertical navigation component which helps people to navigate our content in a guide.

Screenshot of a webpage showing our vertical navigation option and selected page content

The design process

When we identify that a custom component is required, I follow a process for creating that component and a set of criteria for testing it:

  • what is the need for a component?
  • what problem does it solve?
  • is there a way to meet the need with an existing component?
  • have we over-complicated the content/feature?
  • has this been done somewhere else?
  • can we use a component built by someone else?

If we establish that we need a custom component or we identify something built by another team, then we also ensure it meets technical requirements.

Technical requirements

Components must:

  • be tested with a representative range of users, including those with disabilities OR
  • be able to show that the component is clearly based on relevant user research from other organisations and best practice
  • conform to WCAG 2.1 AA as a minimum
  • not depend on colour alone to communicate information
  • handle where a user changes their default system colour or contrast
  • be possible to focus on the controls using the tab key
  • change in appearance when keyboard focus moves to it
  • be possible to activate the component using a keyboard
  • indicate when the mouse is hovered over it
  • be large enough to tap accurately with one finger
  • have a text label associated with them
  • have a minimum contrast ratio of 4.5:1 between foreground and its background
  • use valid, semantic HTML, not rely on third-party JavaScript frameworks when JS is enabled, and work with JS disabled (progressive enhancement)
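
To illustrate that last requirement, here’s a minimal sketch of the progressive enhancement pattern we aim for. The data-module name and class names are illustrative, not a real component of ours: the HTML works on its own, and JavaScript only layers extra behaviour on top when it’s available.

```javascript
// Progressive enhancement sketch: enhance markup only if JavaScript runs.
// 'data-module="app-disclosure"' is an illustrative hook, not a real component name.
document.querySelectorAll('[data-module="app-disclosure"]').forEach((container) => {
  const button = container.querySelector('button');
  const panel = container.querySelector('.app-disclosure__panel');
  if (!button || !panel) return;

  // Without JS the panel is simply visible; with JS we collapse it and
  // expose the state to assistive technology via aria-expanded.
  panel.hidden = true;
  button.setAttribute('aria-expanded', 'false');

  button.addEventListener('click', () => {
    const expanded = button.getAttribute('aria-expanded') === 'true';
    button.setAttribute('aria-expanded', String(!expanded));
    panel.hidden = expanded;
  });
});
```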

Testing

We have several types of testing processes when working on new services and our website. These include:

  • manual testing
  • automated testing
  • user testing
  • external audit

Manual testing

Throughout the development of a component, a lot of manual testing is done using the standards discussed above. This includes testing on different devices too: Windows devices, Macs, tablets and phones.

While we design services and websites to work across all devices and our frameworks are generally orientated to mobile-first, we use data to determine whether testing needs to focus on a specific device. For example, we know some of our services are used exclusively on desktops (though this may change in future), so testing might focus on desktops more than mobiles.

In the example of our website, traffic has a 60/40 split in favour of mobile devices, so while we develop on laptops we focus on testing designs on mobile devices.

As most of the development of this website has been done on a Mac, the accessibility tools readily available on Macs have been fundamental in identifying some issues early on.

Using tools on the Mac to test various colourblind options.

Colourblindness is an issue that’s important to me as I’m colourblind: I have mild to moderate deuteranomaly and protanomaly.

For me this means I have trouble with some shades of colour, specifically greens and blues.

So, to give users with normal colour vision an example of what this means, here is a section of our homepage filtered to simulate deuteranopia, a significant red-green colour blindness.

Example of a section of our homepage with a deuteranopia colour blindness

This isn’t a scientific test; some people will see the difference as it’s represented here, while others, like me, see the beta banner as pinker and the blue of the header as more purple. However, this shows why colour alone should never be used to indicate status.

You can try this yourself at https://www.toptal.com/designers/colorfilter/. Just type in a URL and select the type of colour blindness, and it will render the site as a user with that condition might see it.

Our Beta banner uses a red/orange colour; for someone with colour blindness it might look green, brown, or a shade of reddy pink (in my case). This could be very confusing if colour alone were used to convey status.

Having tools to test contrast ratios and simulate colour blindness is important when designing components with colour.
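
To make the 4.5:1 contrast requirement concrete, the WCAG contrast ratio can be calculated directly from two RGB colours. Here’s a small sketch of how such a check works (this is illustrative, not code from our site):

```javascript
// Sketch of the WCAG 2.1 contrast ratio calculation for two RGB colours.
// Channels are linearised before weighting, per the WCAG definition of relative luminance.
function relativeLuminance([r, g, b]) {
  const [R, G, B] = [r, g, b].map((channel) => {
    const c = channel / 255;
    return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
  });
  return 0.2126 * R + 0.7152 * G + 0.0722 * B;
}

function contrastRatio(foreground, background) {
  const l1 = relativeLuminance(foreground);
  const l2 = relativeLuminance(background);
  const [lighter, darker] = l1 >= l2 ? [l1, l2] : [l2, l1];
  return (lighter + 0.05) / (darker + 0.05);
}

// Example: near-black text (#0b0c0c) on white prints a ratio well above the 4.5:1 minimum.
console.log(contrastRatio([11, 12, 12], [255, 255, 255]).toFixed(2));
```

Anything below 4.5:1 for body text (or 3:1 for large text) fails the WCAG 2.1 AA requirement.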

Find out more about the different types of colourblindness at the National Eye Institute.

Automated testing

We use a tool called Pa11y to incorporate a level of automated testing into our website design and development process. It runs daily, and also when triggered manually, to identify failures across pages and track changes over time.

This tool isn’t a silver bullet: it can produce false positives and can be misinterpreted by people unfamiliar with the tool and what it’s reporting, so use dashboards with caution and ensure people understand what they are looking at.
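
For context, Pa11y can be run from the command line or programmatically from Node.js. A minimal sketch of the latter (the URLs are placeholders and this isn’t our actual test script) looks like this:

```javascript
// Minimal sketch of running Pa11y from Node.js against a couple of pages.
// The URLs are placeholders, not our real endpoints.
const pa11y = require('pa11y');

const pages = [
  'http://localhost:3000/',
  'http://localhost:3000/news',
];

(async () => {
  for (const url of pages) {
    // Test against WCAG 2.1 AA; each issue has a code, message, selector and type.
    const results = await pa11y(url, { standard: 'WCAG2AA' });

    const errors = results.issues.filter((issue) => issue.type === 'error');
    console.log(`${url}: ${errors.length} error(s)`);

    for (const issue of errors) {
      console.log(`- ${issue.code}: ${issue.message} (${issue.selector})`);
    }
  }
})();
```

Pa11y reports errors by default; warnings and notices can be included via its includeWarnings and includeNotices options, which is where the manual checks described below come in.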

The report here shows that we have three errors, all triggered by Google reCAPTCHA implementation for forms. We’ve been reassured by testing that although errors are reported by this tool, they aren’t errors which cause problems for screen readers or other assistive tech users.

Warnings and notices refer to links and content on the page which should be manually checked. For example: are links descriptive enough (check that text like “click here” isn’t being used for link text), do images have appropriate alt text, and so on.

User testing

We have a user researcher on the project who ensures that we are testing our website with a wide range of users. We’re a few rounds of testing in now.

We’ve tested with people from as far away as Sydney, using a wide range of devices, and with a range of abilities, skills, knowledge and understanding of the gambling industry. This includes people working in the industry, our own staff and members of the public.

This has helped inform some design decisions and identified issues to fix or features we can improve, and in some cases even remove.

User testing is a critical step in the process to ensure that what we are designing and building works for our users. This is the first point where we’re talking to users and putting things in front of them to get feedback, helping us iterate and make informed design decisions.

We have learnt a lot in the past about accessibility and placement of content, and this has fed into design decisions made in this project. Just because we learnt something while testing a transactional service doesn’t mean it can’t apply to our website.

External audit

Over the last three years we’ve worked with DAC to test our digital services with people who have a range of disabilities and use assistive technology in their day-to-day lives.

DAC aren’t a tick-box exercise for us; they aren’t there as a final sign-off, and they aren’t our confirmation that our website is 100% compliant with required standards.

DAC are there as another route for us to get good coverage of accessibility issues we may have missed through other testing methods.

Their thorough testing, using a wide range of devices and tools, and the fact that the majority of the people working there have disabilities themselves, make them the best people to test our website from a technical perspective and to help us understand how not to repeat issues in future work.

Feedback and issues from previous services tested by DAC have fed through to this project; we’ve significantly improved our understanding and knowledge, and this led to our manual testing requirements.

Digital Accessibility Centre Audit

DAC carried out the audit in mid-June and tested all the main parts of the website against the WCAG 2.1 guidelines. They test to level AA and also extend some testing to AAA if required.

We had 18 issues, roughly split between technical issues and content-related issues. DAC were complimentary about the fairly minor nature of the issues found and were impressed that their initial surface checks had not found any significant problems.

Unsurprisingly to me, many of the issues were related to screen reader use. This is an area where, I will admit, we need to improve our testing.

Summary chart showing issues and severity by type of ability and tool which is detailed in the following text

Cognitive, Low Vision, Dyslexia and Mobility (keyboard only)
Scored 2 — Completed independently but with minor issues.

Mobility (Voice activation)
Scored 3 — Completed independently, no issues.

Screen readers (JAWS, NVDA and iOS)
Scored 1 — Completed independently but with major issues.

Issues identified

The following issues make up all the issues found across A, AA and some AAA (as observations):

  • Videos not having an associated transcript or description
  • Video iframe missing a title to describe the video
  • Image in news list linked to the story along with the title
  • 1 link opens a file (link doesn’t indicate this)
  • Logo in header links to homepage but alt text didn’t indicate this
Comments from the tester about the alt text on the header logo
Example of how to resolve the issue
  • Decorative images using alt text
  • Typo on a radio input meant it wasn’t properly labelled and also meant an error message didn’t display properly
  • Jump in heading structure from H1 to H3 – the news story title was an H1 and a template header for Notes to editors was marked up as an H3, resulting in issues where a story doesn’t contain its own headings (a check for this kind of jump is sketched after this list)
  • One page contained a title that didn’t match the page heading
  • Contrast ratio on focus state on menu where the menu had been visited previously (visited focus state)
  • Focus state on buttons on hero headers not visible
  • Autocomplete turned off (we turned this off in a show and tell to demo the website and didn’t turn it back on)
  • Non descriptive label in newsletter sign up
Example of how labels on inputs may not be descriptive enough
Recommendation for resolving the problem
  • One link didn’t indicate it opened in a new tab
  • Reference to ABSG (another website of ours) not labelled as an abbreviation
  • Typo on a page heading
  • Use of chevrons to indicate paging can confuse users “< Previous page” and “Next page >”
This issue is one which we would likely have identified if we’d tested with a screen reader during the design stage
  • Cookie policy control had several issues.
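
As an example of the kind of automated guard that would catch the heading jump mentioned in the list above, a small Node.js check could flag skipped heading levels in rendered pages. This is a sketch only: it assumes the cheerio library, and the sample HTML is illustrative.

```javascript
// Sketch: detect skipped heading levels (e.g. H1 followed by H3) in rendered HTML.
// Uses the cheerio library; the sample HTML is illustrative only.
const cheerio = require('cheerio');

function findHeadingJumps(html) {
  const $ = cheerio.load(html);
  const jumps = [];
  let previousLevel = 0;

  $('h1, h2, h3, h4, h5, h6').each((_, element) => {
    const level = Number(element.name.charAt(1)); // 'h3' -> 3
    if (previousLevel && level > previousLevel + 1) {
      jumps.push(`Jump from h${previousLevel} to h${level}: "${$(element).text().trim()}"`);
    }
    previousLevel = level;
  });

  return jumps;
}

const sample = '<h1>News story title</h1><p>Body copy</p><h3>Notes to editors</h3>';
console.log(findHeadingJumps(sample));
// [ 'Jump from h1 to h3: "Notes to editors"' ]
```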

Fixing the issues

As half the problems were content-related, the content team and I got together to go through all the findings and understand what we needed to do to fix them and prevent them from happening again.

Screenshot of our Trello cards for the tasks to complete

We created a Trello list of all the things to resolve and assigned the tasks to the person best placed to resolve each issue.

It took just one afternoon for four of us to fix all the issues — with the exception of the cookie policy control issue which is a bigger issue to resolve.

You’ll see from the list of issues found that many are not major problems and are fairly easy to fix.

I hold my hands up and I’m annoyed with myself for letting some of them slip through.

Things like typos, turning autocomplete off to do a show and tell, and titles not matching headings (which I’m usually a stickler for checking) had been missed and triggered issues which could have been avoided.

Cookie policy

This issue is a bit more complex to resolve.

I had done some research on this area and opted for a common cookie control service which is used on GOV sites and also a number of other public sector websites. However, despite assurances on the provider’s website that it is compliant, it has a number of issues.

After talking with DAC and the Cabinet Office, we’re opting to implement a cookie policy control which has recently been developed by the Cabinet Office and tested by DAC, achieving AAA compliance.

We’re now in the process of implementing this new cookie control component into the website.

Implementing a new cookie control policy
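
For illustration, a progressively enhanced cookie banner of this kind typically works along the following lines. This is a generic sketch, not the Cabinet Office component; the class names, data attributes and cookie name are all assumptions.

```javascript
// Generic sketch of an accessible, progressively enhanced cookie banner.
// Illustration of the pattern only; all names here are assumptions.
const banner = document.querySelector('[data-module="cookie-banner"]');

if (banner && !document.cookie.includes('cookies_policy=')) {
  // Without JavaScript the banner stays hidden and no optional cookies are set.
  banner.hidden = false;

  banner.addEventListener('click', (event) => {
    const choice = event.target.getAttribute('data-accept-cookies');
    if (choice === null) return; // click wasn't on an accept/reject button

    const policy = { analytics: choice === 'true' };
    document.cookie =
      'cookies_policy=' + encodeURIComponent(JSON.stringify(policy)) +
      '; max-age=31536000; path=/';

    // Replace the question with a confirmation message and move focus to it
    // so screen reader users hear that their choice was saved.
    banner.querySelector('.cookie-banner__question').hidden = true;
    const confirmation = banner.querySelector('.cookie-banner__confirmation');
    confirmation.hidden = false;
    confirmation.setAttribute('tabindex', '-1');
    confirmation.focus();
  });
}
```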

What we’ve learned

Be open with your team

Don’t hide issues from your team and the wider organisation.

It’s by doing testing and having these kinds of audits that you improve understanding, awareness and knowledge, and take this forward into future projects. It also helps spread awareness around your organisation when talking to people unfamiliar with accessibility issues.

The first thing I did was bring the core team together to go through the findings and share the report with them. This meant that everyone knew what the problems were, and we discussed how to prevent them happening again.

The report was also distributed to the wider team and our leadership team, including our Executive Director for Data and Digital, and we discussed it at the regular project board the following day.

We can’t be complacent

We need to make sure testing with screen reader tools is the norm in the design stages and when doing manual testing on pages.

We also need to ensure that when we use components built by third parties, even other government teams, we test them against our own criteria. We can’t just assume they are compliant.

New technical requirement:

Components must:

  • be tested with screen reader and voiceover tools on desktop and mobile devices

We need to pause and pay more attention when working on custom pages and when making temporary changes, and recognise that we have to put settings back and test the pages we work on more carefully.

I don’t want to resort to checklists; it should be the norm to check these things, and they don’t take long to do.

What we’d do differently

As I’ve said, some issues were simply things I’d done as a result of making temporary changes, typos or not taking enough care.

I’ve put in place a temporary process (a post-it note for now) so that when I’m doing show and tells, demoing something or disabling something (like autocomplete), I reverse the change afterwards so it doesn’t get left in place incorrectly. I’ll probably use Trello for managing this in future.

Our content designers are really good at ensuring content is descriptive so it works well in search results, both on our website and in external search engines; we’ll just stop and take some time to check the non-template pages are correct.

Before the audit, we took on board recent GDS advice on links and tabs and have been improving link content across the new site, so we’re pleased to see very few issues with regards to external linking.

PDFs and videos

One link identified as linking to a file pointed to a PDF; the PDF itself was just two pages and wasn’t accessible. Rather than following the suggestion to simply update the link, we converted the PDF content to HTML instead, so we’ve improved the content and fixed the identified issue at the same time.

We have over 1,500 PDFs on the existing website. We can’t do this with all of them now, but over time we will migrate content where we can, or implement the required fix where we do link to PDFs.

Screenshot of the audit report highlighting an issue where we’re linking to a PDF directly.
Details of how to ensure users know what they are clicking on is taking them to a document.

Video

We know video needs to be described or have a transcript, and two videos we were using (one from an external organisation) did not have these. We have now provided transcripts for both videos and will ensure these are provided for all future videos used on the website.

Talk to us and get involved

Comment on this story or ask a question.

Read our project weeknotes, where we discuss project updates and progress; you can also watch previous show and tells we’ve done.

Email us with any questions or feedback, we’re happy to share experiences and things we’ve learned.

Sign up to our user research programme and take part in user research activity on websites and services.

Digital Accessibility Centre

If you want to find out more about DAC, who they are and what they do, visit their website at https://digitalaccessibilitycentre.org
