I am a Senior Accessibility Engineer with VMware. I learned what I know about accessibility by testing for, fixing, or trying to fix web accessibility problems. Luckily, I have had some great mentors and a supportive accessibility community along the way. I’ve tested consumer and enterprise applications and have seen the challenges that come from design concepts, inaccessible implementations and other issues. My goal is to ensure our software is born accessible by addressing these challenges.
7:00 AM. Getting ready to go to work
Although I can often work from home, today I am going to commute. While getting ready I’m thinking random thoughts, many about the Slack channels I monitor, the defects I track, or ARIA and accessible web development.
ARIA stands for Accessible Rich Internet Applications. It is a set of attributes that can be added to HTML to provide name, state and role information in implementations that do not convey this information programmatically. ARIA also provides guidance on keyboard accessibility for commonly found widgets such as disclosures, accordions and grids.
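As a minimal sketch of what that looks like in practice, a disclosure widget uses an ARIA attribute to expose its expanded/collapsed state, which the markup alone would not convey:

```html
<!-- Disclosure sketch: the native <button> supplies the role and name;
     aria-expanded supplies the state that would otherwise be missing. -->
<button aria-expanded="false" aria-controls="details-panel">
  Show details
</button>
<div id="details-panel" hidden>
  Panel content goes here.
</div>
```

A script would toggle `aria-expanded` and the `hidden` attribute together, so screen reader users hear "collapsed" or "expanded" along with the button's name.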
Before I leave home I think about complicated accessibility issues, frequently about ARIA, and how to best solve usability problems for people with disabilities without contravening recommended ARIA usage guidelines.
Looking at the Clarity Design site, I noticed my name under the Contributor’s heading and felt proud. I woke up my wife early to show her. I wrote the Accessibility Conformance Report (ACR) for Clarity 2.2 after a round of gap analysis and implementation of my proposed solutions. An ACR is a document that describes a product’s conformance with accessibility standards and guidelines. I am proud of the progress we have made. But we still have more work to do and more ACRs to write.
Ok, time to get in the car and start commuting. Stuck in traffic, I notice that the van in front of me is covered in icons that represent the different services they provide as a company. The icons remind me of many web pages and apps that use icons to represent features or menu items. However, when icons don’t have visible textual equivalents, users with disabilities, and even users without a disability, probably won’t understand what the icons mean. Luckily, the van in front of me did have textual equivalents below each icon to help people understand what each one means.
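The same principle applies to icon controls on the web. A hedged sketch: visible text next to the icon is best, and when visible text truly can't fit, the control still needs an accessible name:

```html
<!-- Preferred: icon plus visible text. The SVG is decorative,
     so it is hidden from assistive technology. -->
<button>
  <svg aria-hidden="true" focusable="false"><!-- gear icon --></svg>
  Settings
</button>

<!-- Fallback when visible text can't fit: aria-label gives the
     control an accessible name, but sighted users still have to
     guess what the icon means. -->
<button aria-label="Settings">
  <svg aria-hidden="true" focusable="false"><!-- gear icon --></svg>
</button>
```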
9:00 AM. Providing Accessibility Feedback
I answer questions from product designers, developers and other stakeholders via Slack, email and in person. The questions are usually about how to make a component or interaction more accessible, and context-specific responses are most valuable. When applicable, I use the question and answer to populate the VMware Confluence Accessibility recommendations page so we can answer future variations of the question faster and also allow users to find the answer themselves.
Answering these questions can involve research and prototyping. For example, a Clarity developer recently questioned the usage of and support for an ARIA property. So I set up a demo and tested the property to determine whether we could use it in a Clarity component, and found that it is supported in all the browser and screen reader combinations we support except Microsoft Edge with JAWS 2019. These findings made a great addition to the VMware Confluence Accessibility recommendations page.
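The property under test isn't named above, but these demo pages tend to be tiny. As a purely hypothetical illustration, a support check for a property like `aria-roledescription` (chosen here only as an example) could be a single element exercised with each browser and screen reader pair:

```html
<!-- Hypothetical support-check page. Load it in each browser/screen
     reader combination and listen to whether the custom role
     description ("toggle") is announced in place of "button". -->
<button aria-roledescription="toggle" aria-pressed="false">
  Dark mode
</button>
```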
12:00 PM. Lunch and hanging out
The VMware campus cafeterias have amazing food. I love the Mexican food here and grab a yummy burrito to eat with one of the designers and a developer I’ve made friends with. It’s fun to eat, talk about cats, music and other non-work stuff for a bit before my next meeting.
1:00 PM. Meeting about ephemeral notifications
Next, I’m in a meeting about ephemeral notifications with a product designer. I’m presented with an interaction where content and controls are added to the UI and then disappear on their own. One observation I’ve made of enterprise applications is that something new is always popping up without being initiated by the user. If the newly added content is text only, it can be implemented with a polite live region or the status role. But when it contains controls, such as a link to an action that needs to be taken quickly, a modal implementation would be my first choice: the focus management and content hiding a modal offers help make the reading experience more logical for screen reader users. I’m required to offer a non-modal solution, though. Fortunately, we agreed to move these notifications somewhere else on the page after they appear and let screen reader users know where to find them, rather than letting them simply disappear.
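For the text-only case, a sketch of the polite live region approach looks like this. One detail that trips people up: the live region container must already exist in the DOM before text is injected into it, or many screen readers won't announce the change.

```html
<!-- Empty status region, present in the DOM from page load.
     role="status" implies aria-live="polite": announcements wait
     for the screen reader to finish what it is currently reading. -->
<div role="status" id="notifications"></div>

<script>
  // Later, when the ephemeral event fires:
  document.getElementById('notifications').textContent =
    'Snapshot completed successfully.';
</script>
```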
2:00 PM. Testing
Now onto some testing. We have one design system, Clarity, and nearly 100 products at different levels of adoption of Clarity. I’m often tasked with testing one of these products and writing a VPAT (Voluntary Product Accessibility Template) for them. I enjoy testing, and take pride in writing defects with both steps to reproduce and remediation advice. I’m happy when the developer understands the issue.
3:00 PM. Clarity Office Hours
Testing is now interrupted for Clarity office hours, where I am available to take all questions in person. A developer demonstrates a geographical map-based filtering feature. My accessibility-centric perspective makes me suspect that many such features are just eye candy. After seeing this feature and determining that it is basically a filter, I recommend placing an ARIA tree next to the map and hiding the map itself from screen reader and keyboard-only users, and one of the product designers agrees. I really enjoyed this collaboration because, up until this point in my accessibility career, I’ve been working with developers only. It’s great to get to discuss accessibility issues before a design is implemented.
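A rough sketch of that recommendation, under the assumption that the map is purely visual and every filter choice is available through the tree:

```html
<!-- The map is eye candy for this use case: hide it from assistive
     technology. Note that aria-hidden does not remove elements from
     the tab order, so any focusable descendants also need
     tabindex="-1" (or to be made unfocusable). -->
<div aria-hidden="true">
  <!-- canvas/SVG map rendering -->
</div>

<!-- The equivalent, accessible filter as an ARIA tree. -->
<ul role="tree" aria-label="Filter by region">
  <li role="treeitem" aria-expanded="false" tabindex="0">
    North America
    <!-- a nested role="group" list of treeitems would go here -->
  </li>
</ul>
```

A real tree also needs the arrow-key navigation and roving tabindex behavior described in ARIA authoring guidance; the markup alone is only half the work.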
4:00 PM. Preparing to give a talk about ARIA
After office hours and testing, I also have to prepare for an upcoming internal VMware talk I’m giving on ARIA and accessible web development. These topics can’t be fully covered in a 90-minute talk, so I’ve structured the talk around a handful of topics that will serve as a starting point. I’m also working on an accessible drag-and-drop solution for cases where blind screen reader users need to find the drop zone on their own. I’ve tried a coordinate-based approach with a live region (an ARIA technique to interrupt a screen reader user’s reading experience with status updates), which is too verbose to be usable. It’s a fun effort that I hope will help clarify the accessibility challenges related to drag and drop to designers and developers at my company.
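As a hypothetical sketch of that coordinate-based idea (the function name and cell units are mine, not from the original), the experiment amounts to announcing the dragged item's position relative to the drop zone through an assertive live region:

```html
<!-- Visually hidden assertive live region: updates interrupt the
     screen reader's current speech, which is exactly what makes
     frequent coordinate announcements so verbose in practice. -->
<div aria-live="assertive" id="drag-status" class="visually-hidden"></div>

<script>
  // Hypothetical helper: dx/dy are offsets from the drop zone,
  // measured in approximate grid cells.
  function announceDragPosition(dx, dy) {
    document.getElementById('drag-status').textContent =
      'Drop zone is ' + Math.abs(dx) +
      (dx < 0 ? ' cells left' : ' cells right') +
      ' and ' + Math.abs(dy) +
      (dy < 0 ? ' cells up' : ' cells down') + '.';
  }
</script>
```

Firing this on every movement is what makes the experience too chatty; throttling the announcements or announcing only direction changes are obvious next things to try.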
One can’t ask for a more fulfilling work day than this. I’m glad I am fortunate enough to earn a living making VMware’s applications more accessible to users with disabilities.
The VMware Design team is looking for talented designers, engineers, and leaders to help us continue transforming enterprise design with accessibility in mind. Check out our open positions!