Don’t fence me in: Unblocking accessibility issues with customer feedback

Learn how OpenShift and PatternFly use customer communication in their mission to support UX in which all users can roam free.

Joe Caiani
PatternFly
8 min read · Dec 21, 2020


Cowritten by Joe Caiani and Jessie Huff

Photo by Jessica Da Rosa on Unsplash
PatternFly’s branded divider, our logo centered between two lighter lines.

“Oh, give me land, lots of land under starry skies above

Don’t fence me in…”

— Cole Porter

Roadblocks in our daily workflow? We’ll pass.

Unplanned delays, context-switch whiplash, crossed fingers, and looming frustration are all unwelcome symptoms of encountering unforeseen blocks in our day-to-day. Putting a blocked task aside to tackle new ones leaves us with a trail of lingering follow-ups and hopes that, at some point, we’ll return to find it unblocked. This process is a surefire ticket for mental exhaustion — and that doesn’t even get into the consequences reaped from leaving essential tasks incomplete.

For Red Hat OpenShift users, blocked tasks mean blocked software deployments, potentially fencing in a user’s entire workflow.

So when we heard a user was being blocked from task completion in OpenShift, it certainly caught our attention. How did this accessibility issue slip through our usability testing?

We’re Joe Caiani and Jessie Huff, Principal UX Developer and Front End Developer on Red Hat’s User Experience Design team, and this is our component comeback story.

PatternFly’s branded divider, our logo centered between two lighter lines.

Part 1: Joe investigates an incoming bug describing blocking behavior in OpenShift’s context switcher

In the software world, a blocking bug will halt a release’s entire deployment, meaning stakes were high for a particular user who reported blocking behavior when trying to switch between Administrator and Developer contexts in OpenShift. The bug details revealed that it impacted a non-sighted user working with a screen reader. Because I’m a UX developer on Red Hat’s UXD team, I’d been exposed to screen readers and was aware that our components undergo some testing for them. Since OpenShift uses a mix of in-product and PatternFly components, I investigated the component in question to verify its identity and origin. I found that the block occurred within a PatternFly dropdown component.

Our PatternFly team is passionate about their mission to create components that are accessible for all users, so I knew they’d be eager to solve this accessibility issue. In order to do so, we needed more information — I followed up with our customer to learn his screen reader type, version, and browser: JAWS 2019 on Google Chrome. Then, I looped in UXD accessibility experts so that we could investigate, discuss, and tackle the blocking issue together. That’s where Jessie comes in.

Part 2: Jessie leads a PatternFly investigation into dropdown component functionality and solutions

When Joe first mentioned that a customer was having difficulty using the dropdown with a screen reader, I was a bit surprised. We’d recently examined the console with an accessibility lens, and I recalled that it worked great with VoiceOver (VO) and NVDA. To determine next steps, we needed more information from our customer: Which screen reader were they using, and with which browser? It’s easy to assume that all screen readers will interact the same way, but the variability between them makes it crucial to test each one.

JAWS 2019 on Chrome: ah, that likely explained it. In my anecdotal experience, testing in JAWS can be cumbersome, so it isn’t tested as frequently as other readers like VO or NVDA. The JAWS-testing odds were stacked against us: Our development team works on Macs, which are incompatible with JAWS, and JAWS itself isn’t open source, so it requires a license. Without a license, tests can only occur in 40-minute windows, and the computer must be restarted between windows, meaning JAWS is much more difficult to test in practice. My test flow involves two screens: I display my work on my work computer while running tests on my personal one. To begin a new JAWS testing session, I need to push my work to a preview platform like Surge from my work machine, then run over to my personal one, restart it, and launch the JAWS application, all before I can even look at the test’s corresponding pull request (PR) review. Restarting the computer to ration each 40-minute testing period, with no option to set content up in advance, is a frustrating mechanism that makes accessibility testing, well… fairly inaccessible.

Despite these hurdles, I was able to set up a JAWS testing environment to reproduce our customer’s blocking issue. I opened the OpenShift cluster with my JAWS testing software on Chrome, and voilà, I experienced the console exactly how the customer did. He wanted to switch from the Administrator view to the Developer view to complete the rest of his tasks. When JAWS navigated to the dropdown, it announced that the dropdown was present, open, and interactive with the up and down arrow keys, but pressing those keys didn’t shift focus, and the menu items weren’t announced. Like our customer, I was effectively blocked. Definitely an accessibility bug!

A video recording of our JAWS dropdown testing, in which we recreated our customer’s block.

As I mentioned before, variability between screen readers and browsers makes testing them and talking to customers even more important. For JAWS users, their selected browser heavily influences how markup is interpreted.

Freedom Scientific, the creator of JAWS, underscores this pivotal relationship between browser type and JAWS ARIA support and interpretation:

“Most browsers support some type of accessibility API (Application Programming Interface), and assistive technologies use the API to get information about what is presented on the screen. So, ARIA markup is transformed by the browser into information that fits the accessibility API it supports. Then, the information provided by the API is processed by the assistive technology. That means JAWS support of ARIA depends heavily on the browser being used.”

We observed this browser influence in action. When we tested our dropdown component in other browsers like Firefox, it functioned as expected, which pinpointed the dysfunction to an interaction between JAWS and Chrome specifically.

Part 3: Jessie and PatternFly recode and retest potential accessibility solutions

With the origin of our customer’s issue found, it was time to test potential fixes. Our team tested a number of different ideas, including:

  • Altering the added ARIA with aria-haspopup and aria-controls to communicate functionality to JAWS
  • Following different example structures
  • Adding delays to wait for JAWS to register the dropdown menu
  • Using aria-live in hopes it would tell JAWS the menu was getting rendered

All without success. Our dropdown component uses custom keyboard interaction, which made hunting for a solution all the more challenging. We exhausted a wide array of avenues, from standard guidelines to more unconventional approaches. Brainstorming played a key role in our search for a viable way to fix this accessibility snag.
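For flavor, here’s a minimal sketch of the kind of ARIA wiring we experimented with. This isn’t PatternFly’s actual source; the element IDs and structure are illustrative, and the real component manages far more state.

```typescript
// Illustrative only: a stripped-down dropdown toggle showing the ARIA
// attributes and live-region announcement we experimented with.
const toggle = document.getElementById('context-toggle') as HTMLButtonElement;
const menu = document.getElementById('context-menu') as HTMLUListElement;
const status = document.getElementById('menu-status') as HTMLDivElement;

// Attempt: declare the popup relationship so JAWS knows a menu exists
// and which element it controls.
toggle.setAttribute('aria-haspopup', 'true');
toggle.setAttribute('aria-controls', 'context-menu');
status.setAttribute('aria-live', 'polite');

toggle.addEventListener('click', () => {
  const wasOpen = toggle.getAttribute('aria-expanded') === 'true';
  toggle.setAttribute('aria-expanded', String(!wasOpen));
  menu.hidden = wasOpen;

  // Attempt: announce the render through a polite live region, hoping
  // JAWS would register that the menu had appeared.
  status.textContent = wasOpen ? 'Menu closed' : 'Menu expanded';
});
```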

One particular brainstorming session inspired the solution that carried us home: adding an event listener after click, which lets JAWS interact with the dropdown once it’s opened while still allowing users to switch between their mouse and keyboard.
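To make the shape of that fix concrete, here’s a simplified sketch, assuming a plain DOM dropdown. It isn’t PatternFly’s exact implementation; the IDs and listener wiring are illustrative.

```typescript
// Illustrative sketch of the fix: once the toggle is clicked and the menu is
// open, a keydown listener moves real DOM focus into the menu, which is what
// prompts JAWS to announce the items. Mouse interaction keeps working as-is.
const toggle = document.getElementById('context-toggle') as HTMLButtonElement;
const menu = document.getElementById('context-menu') as HTMLUListElement;

function onToggleKeyDown(event: KeyboardEvent): void {
  if (event.key === 'ArrowDown' || event.key === 'ArrowUp') {
    event.preventDefault();
    const items = menu.querySelectorAll<HTMLElement>('[role="menuitem"]');
    const target =
      event.key === 'ArrowDown' ? items[0] : items[items.length - 1];
    target?.focus();
  }
}

toggle.addEventListener('click', () => {
  const wasOpen = toggle.getAttribute('aria-expanded') === 'true';
  toggle.setAttribute('aria-expanded', String(!wasOpen));
  menu.hidden = wasOpen;

  // The key change: attach the keyboard listener only after the click opens
  // the menu, and detach it on close, so users can freely switch between
  // mouse and keyboard.
  if (wasOpen) {
    toggle.removeEventListener('keydown', onToggleKeyDown);
  } else {
    toggle.addEventListener('keydown', onToggleKeyDown);
  }
});
```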

A video demonstrating PatternFly’s fix to the previously blocked dropdown component.

I nearly jumped out of my chair when I finally heard the oh-so-charming JAWS voice announce my dropdown items. I was excited and relieved to finally have a successful fix.

Part 4: Jessie, Joe, and the PatternFly team reconnect with the user

Next, we needed to verify our fix with our OpenShift customer to ensure it addressed all of his concerns. I reached back out to Joe to set up a meeting with the customer and asked one of my coworkers, Jenn, to try it in JAWS as well so we could get another pair of eyes on our solution.

Our customer was incredibly helpful and agreed to meet with us to evaluate our work in an OpenShift context.

In our meeting, we walked through his thoughts and expectations about the dropdown component, and confirmed that we had in fact fixed his issue. (Hold on for a minute while I get my pom-poms and start cheering.)

Snoopy dances and cheers, holding two cheerleading pom-poms. Our accessibility-solution happy dance!

Along with confirming the success of our solution, this meeting also surfaced another fix we needed to tackle to strengthen dropdown functionality for JAWS users.

During our solution walkthrough, we noticed that JAWS wasn’t announcing the disabled items in the dropdown as the user navigated through it.

When we asked him about the disabled items, he confirmed that JAWS indeed wasn’t announcing them and that he expected them to be announced as disabled or unavailable. This discovery inspired yet another PR, this one allowing the disabled items to receive focus and therefore be announced by JAWS.
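As a rough sketch of that second fix, the idea is to keep disabled items focusable and mark them with aria-disabled rather than removing them from the focus order entirely. The selectors below are illustrative, not PatternFly’s actual markup.

```typescript
// Illustrative sketch: disabled menu items stay focusable so a screen reader
// can reach them, are marked aria-disabled so JAWS announces them as
// unavailable, and have their activation suppressed.
const disabledItems = document.querySelectorAll<HTMLElement>(
  '#context-menu [role="menuitem"][data-disabled]'
);

disabledItems.forEach((item) => {
  item.tabIndex = -1; // reachable via the menu's focus management
  item.setAttribute('aria-disabled', 'true'); // announced as "unavailable"
  item.addEventListener('click', (event) => {
    // Focusable and announced, but not actionable.
    event.preventDefault();
    event.stopPropagation();
  });
});
```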

The best part about all of these accessibility fixes is their reach — they don’t just impact and improve OpenShift; they improve the accessibility of all products that use the PatternFly dropdown component, too!

This customer’s experience inspired even more PatternFly improvements as we continue to test with JAWS, following a similar pattern with the options menu and application launcher components. One short conversation with our customer extended into accessibility improvements across the board.

Part 5: The PatternFly and OpenShift teams unblock all instances of the dropdown issue

After the two PRs were merged in PatternFly, we updated our dependencies in OpenShift to pick up the accessibility code. In a few simple clicks, all instances of the PatternFly dropdown in OpenShift were cleared of any JAWS-based roadblocks! PatternFly already provides a lot of accessibility research and testing; we’ll use this experience to build on that base, using customer-centered interactions to detect, decode, and solve similar issues.

Part 6: PatternFly and OpenShift will repeat this process in the future, for everyone’s benefit!

Our experience with OpenShift’s dropdown is just one example of accessibility issues that can sneak past initial accessibility testing. We’ve boiled our experience down into six key steps you can use to reach beyond usability testing to build more accessible and inclusive UX. We like to call them the 6 R’s:

1. Research

Inspect your user’s problem, bug details, and work environment. Ask questions to dig deeper into your user’s specific experiences.

2. Recreate

Simulate your user’s workflow and task goals in the same environment to uncover the source of their block.

3. Recode

Experiment with ways to solve the accessibility issue and improve similar UX scenarios across your product.

4. Retest

Run your solutions through in-depth testing to confirm their success.

5. Reconnect

Touch base with your user to verify your solution addresses their needs.

6. Repeat

Unblock all instances of the issue by updating code dependencies, guidelines, and more. Keep channels of communication open so that customer feedback can guide similar accessibility fixes in the future.

We hope our OpenShift and PatternFly story inspires you to follow a similar customer-focused approach to build and support more inclusive product experiences for users of all backgrounds.

When it comes to building accessible product experiences, no one deserves to be fenced in.

“Let me ride through the wide-open country that I love

Don’t fence me in…”

For more information and testing guidance for accessibility issues, see PatternFly’s accessibility guide.

PatternFly’s branded divider, our logo centered between two lighter lines.

Have a story of your own? Write with us! Our community thrives on diverse voices — let’s hear yours.
