Screen reader testing on Lexis Advance: JAWS and NVDA. Illustration: LexisNexis Content UX Design Team

Using Video Simulations to Identify Pain Points
Three reasons why I use video simulations to help blind customers

Min Xiong
LexisNexis Design
8 min read · Nov 11, 2018


Twelve months ago, I was appointed to lead an accessibility improvement initiative with a mandate to get quality feedback from customers who need accessibility assistance, then leverage those opportunities to directly impact product quality and engagement. I have always been passionate about inclusive design and accessibility, so I was delighted to take on this challenge and set out on my journey.

As the project proceeded, we worked on customer enquiries for VPATs (Voluntary Product Accessibility Templates) and accessible books, colour contrast remediation, internal auditing reports, policy updates and staff training, as well as helping screen reader users directly. Among these key activities, identifying screen reader issues has been the most challenging but also the most rewarding.

The best approach I have found to improve product quality and help customers directly is to use video simulations to identify customer pain points. Why? Three reasons:

The concept of video simulation to identify customer pain points. Illustration: LexisNexis Content UX Design Team

1. It provides simple and clear direction

After conversations with several Agile teams, I found that commitment to accessibility and usability is an essential aspect of the current product development cycle. We test each release and assemble VPATs for major releases to continuously monitor and identify targets for improvement. The intent is to remediate any accessibility defect that affects the quality of our products as soon as it is found. I was very pleased with these findings, as it means most people are aware of accessibility guidelines when designing, building and deploying the product. The consensus is that accessibility improvement should be part of the Agile process.

However, during the research, I also realised that not everyone understands how screen readers work, nor how serious an issue can be for non-sighted customers. Quite often, it takes time to communicate a defect to all stakeholders. Each one requires extra explanation or guidance: why the defect is a problem for customers, why we need a correct tabbing order, why a tab and a button are totally different. It is painful to see a long list of accessibility defects sitting in the queue and moving slowly towards resolution.
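To show why a "correct tabbing order" is worth explaining, the rules browsers follow for keyboard focus can be sketched as a toy model (illustrative only; the element names are invented and this is not any browser's implementation): elements with a positive tabindex come first, in ascending order, then tabindex=0 elements in document order, while tabindex=-1 elements are skipped by the Tab key entirely.

```python
# Toy model of sequential focus order, per the HTML tabindex rules:
# positive tabindex values first (ascending), then tabindex=0 in
# document order; negative tabindex is unreachable via the Tab key.
def tab_order(elements):
    positives = sorted(
        (e for e in elements if e["tabindex"] > 0),
        key=lambda e: e["tabindex"],
    )
    zeros = [e for e in elements if e["tabindex"] == 0]
    return [e["name"] for e in positives + zeros]

# Hypothetical page: one stray positive tabindex derails the whole flow.
page = [
    {"name": "Search box", "tabindex": 0},
    {"name": "Promo banner", "tabindex": 3},     # jumps ahead of everything
    {"name": "Submit button", "tabindex": 0},
    {"name": "Decorative icon", "tabindex": -1},  # never reached by Tab
]

print(tab_order(page))
# ['Promo banner', 'Search box', 'Submit button']
```

A sighted user never notices the banner's tabindex, but a keyboard or screen reader user is thrown to it before the search box, which is exactly the kind of invisible defect a short video makes obvious.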

At this point, I told myself we needed something much more powerful and much quicker to digest and understand. After brainstorming with my team, we came up with the idea of using video simulation and started producing short, concise videos to demo customer pain points to development teams. For example, we advised a US Agile team on their search filter panel section last month. Instead of waiting for QA to test the feature, we produced a one-minute video which demoed how a visually impaired customer would use the section. The video delivered a simple and clear direction to the stakeholders. It showed how the screen reader would process the information and what problems customers were facing, and required no additional time from the team to install a screen reader or learn how to use it.

Lexis Advance Search Filter Panel testing: Date Picker. Illustration: LexisNexis Content UX Design Team

2. It creates a shared understanding of issues

I had never worked with this video simulation approach before, so it was entirely trial and error for my team and myself. After applying the methodology for three months, I was surprised to quickly identify another big benefit of using it. It helped with troubleshooting a communication issue, or, shall I say, the “telephone” problem. Here is what happened.

A customer raised an issue: the new version of the JAWS screen reader was making our products completely unusable. The information was initially passed to Account Management, then to Product and Legal. After a short impasse, it was passed down to the QA team for re-testing. The new report from QA simply stated that no issues could be recreated, which meant we were stuck with the enquiry, and it became a puzzle for everyone in the conversation.

Since we are the UX team that provides accessibility advice to stakeholders, I got involved and thought carefully about what to do. As a first step, I instructed my team to re-create the scenarios using various browsers and screen readers, and to record them on video. The video comparison showed that the results were more complicated than we anticipated, and browsers behaved inconsistently with different JAWS versions. At this point, we were still not sure whether the problems were related to JAWS versioning or to our product, but I was glad to have videos proving that the issue required further investigation.

After gathering all the data, we thought the best approach would be to interview the customer and record her user journey directly, so we contacted her and arranged a video call. During the session, we asked her to work through a few scenarios and found that the actual issues were far more complicated than a screen reader update. They covered a product feature she didn’t like (a feature most sighted users love), a new feature which should work for her but had not been assigned to her account (a decision from the A/B testing team), a feature she couldn’t use (an oversight from the development team), and several defects in an old feature which we plan to replace next year.

To summarise, the video simulation first helped confirm that there were divergent perspectives on what the issues with JAWS and our product could be. The end-user interview then really helped us clarify the information, as the issue the customer raised couldn’t be understood or demystified with a simple phone call; it required work from the UX team to get to the bottom of it. After the interview, we re-tested everything she had raised and confirmed her concerns. Now we have all the evidence and data needed to talk to the internal stakeholders as well as the external customer. Everybody involved in this “telephone” problem has the video comparison results, which makes it easier to plan next steps in line with the needs and interests of our customers.

Keep calm and collaborate — the concept of baking accessibility into products on a global scale. Illustration: LexisNexis Content UX Design Team

3. It drives stakeholder empathy with non-sighted users

As I mentioned previously, our QA team provides VPATs to aid the quality control of our product. The document allows our company to provide a comprehensive analysis of our conformance to the accessibility standards set by Section 508 of the Rehabilitation Act. I truly believe this is a fantastic approach we should continue to use. On top of that, I also think having stakeholders go through the same experience as non-sighted customers would be very interesting and would have a much more powerful impact on them.

To give you an example, we have a section called Client Menu, which allows users to enter a list of client names. A lawyer will most likely charge clients on an hourly basis, so it is critical to know how long they spend on each client and how much they charge. I studied our QA reports; this section had never been flagged as a defect. However, a visually impaired customer raised an issue and claimed that the feature was unusable.

OK, so what was the problem? First, we quickly tested the code using browser-based accessibility testing tools, and no errors were found. Later we checked with QA again; the report still indicated the screen reader was reading all the information. This really baffled us, so we thought the best approach would be to act as a user and test the workflow. As a result, we discovered that the screen reader did indeed read all the fields, but it announced every client name with the same text. For the software testers, the screen reader read information as expected, so they ticked the box as passed, as they were not aware of what content it was supposed to read and what would make sense to the customer.
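This failure mode, fields that pass automated checks yet all announce the same thing, can be sketched with a toy accessible-name routine (purely illustrative; the field names, values and fix are assumptions, not our product's actual code, and real accessible-name computation is far richer):

```python
# Toy sketch of how a screen reader derives the announced name for a field.
# Automated tools only check that *some* name exists; they cannot know
# whether the names are meaningful or distinct for the user.
def accessible_name(field):
    """Return the text a screen reader would announce for a form field."""
    # Prefer an explicit override label; fall back to the visible label.
    return field.get("aria_label") or field.get("label") or "edit text"

# Hypothetical Client Menu rows: every row reuses the same static label,
# so each field *has* a name (automated checks pass) ...
fields = [
    {"label": "Client name", "value": "Acme Corp"},
    {"label": "Client name", "value": "Globex LLC"},
    {"label": "Client name", "value": "Initech"},
]

announced = [accessible_name(f) for f in fields]
print(announced)            # every row announces the same text
print(len(set(announced)))  # 1 -> rows indistinguishable to a non-sighted user

# Fix: give each row a distinct accessible name, e.g. include its value.
fixed = [dict(f, aria_label=f'{f["label"]}: {f["value"]}') for f in fields]
print(len({accessible_name(f) for f in fixed}))  # 3 -> rows now distinct
```

The tooling and QA were both technically right, which is exactly why only walking the workflow as a user, and showing it on video, surfaced the defect.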

To make the issue clearer, we re-created the scenario and recorded it in a video along with our own analysis, then asked all stakeholders to watch it and go through the same workflow as the customer. Everybody is typically busy and there are many competing priorities; by using video simulation, they were able to watch the file with little need to ask for additional information. It led them to experience the same level of frustration as the non-sighted users had. The impact was significant: the stakeholders fully understood the implications and became more willing to prioritise the remediation effort. As we say, the video speaks for itself.

Think like a customer to improve your product. Illustration: LexisNexis Content UX Design Team

There are many forms of customer interviews. No doubt, meeting customers face to face is the best way to observe their user journey and learn valuable new insights. Unfortunately, this is not always feasible for a global business. The sample cases above were long-distance video calls with customers who rely on screen readers to use our products.

If you are a UX designer or researcher who is keen on improving accessibility for your products, you will find using videos very beneficial, as my team and I did. If you find this post useful, please remember to give us some claps! I have also listed some tips below if you need to conduct a video interview to understand your non-sighted customers better.

Tips and Tricks — discover ways to get the most out of video simulation. Illustration: LexisNexis Content UX Design Team
1. Choose a tool that works for you

Choose a tool based on your budget, what is available and what is accessible. Below are the tools we used to capture and record our screens during usability testing.

  • Snagit (good for short but high-quality recordings)
  • Skype for Business (good for long sessions or ad-hoc scenario analysis)
  • Zoom (my favourite, as it records screen reader audio from the computer as well as from the user’s microphone, and has an option to automatically transcribe the audio of a session)
  • WebEx (a well-known tool with a large user base)

2. Understand your users

Interviewing a non-sighted user is different from working with a sighted user over the internet. A screen reader is a form of assistive technology that conveys to non-sighted users what sighted users can see on a display. Since reading out buttons, icons and keyboard input takes time, you need to be patient with your users and allow sufficient time to complete the session.

3. Re-purpose the video to achieve your goal

If you need to conduct a full analysis of the session, keep the entire video as a reference. If you need to discuss it with stakeholders, it is best to cut the file into small sections and focus on specific topics. You can use the clips to prompt discussion, stimulate recall or provide a basis for reflection. You can even narrate a sequence of video clips, or select a sample for detailed discussion and ask others to comment.

4. Protect your customer privacy

Please remember you must obtain customer consent before starting any video recording. Also, since the customers are non-sighted, they may not be aware of how much information they are sharing on their desktop. It is important to blur personal data or sensitive information if you need to share the files with external stakeholders.


Min Xiong
LexisNexis Design

Global Head of Content UX at LexisNexis; enjoys traveling and reading, and is passionate about inclusive design, innovation and accessibility