Toward a More Accessible Metaverse

Accessibility research for the metaverse has come a long way over the past few years, but it’s still a field full of possibility — and yet-to-be-answered questions. Here’s what we’ve learned from the toughest challenges we’ve faced so far.

Yao Ding
Meta Research
6 min read · Oct 25, 2022

By Sheila Meldrum and Yao Ding

Conducting accessibility research for the metaverse is an exciting new venture, but it’s one we’ve been working on since the launch of the first Meta Quest headset in 2019. In the past few years, we’ve learned that helping to make the metaverse a place for everyone is an undertaking just as complex and challenging as it is important.

The unique challenges of this work have fallen into three general categories: working on accessibility in hardware development settings (as opposed to software alone), designing and conducting virtual or augmented reality (VR/AR) research with participants who have disabilities, and doing research with an emerging technology. In all three cases, our most significant progress has come from confronting these difficulties head-on.

Where we started

When we first began to address accessibility for VR (which preceded our AR products), we drew upon previous research by our central accessibility team, as well as general accessibility research in digital gaming (the most common use case at the time). This research gave us an idea of the rough percentage of people with disabilities who would be using our technology, along with some baseline accessibility features that people had come to expect. This was only a starting point, however, because we wanted to expand beyond gaming and enable people to use VR to have meaningful social, educational, work, and other experiences.

Our first studies were aimed at understanding how we could make the Meta Quest more accessible via the software. We focused on identifying accessibility blockers that were affecting large segments of people with disabilities and talking with them to understand how we might address those issues.

For example, we conducted a study with people who have mobility disabilities to explore solutions to a problem we'd found with setting up the Guardian boundary in VR, a safety feature that lets people define the boundaries of their play area. This virtual boundary helps users stay within a cleared play area by appearing in VR when they're near or touching its edge. We found that people who were stationary or seated while using VR couldn't reset the floor height of their Guardian boundary without assistance, because doing so required them to physically touch the floor with their controllers.

Through our studies, we tested multiple design solutions and incorporated feedback from participants using wheelchairs, ultimately leading to our Adjust Height feature, which enables people to adjust their viewpoint while seated, expanding the range of VR experiences they can participate in.
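
While the production implementation isn't public, the core idea behind a feature like Adjust Height can be sketched: apply a user-controlled vertical offset to the tracked head pose before rendering, so a seated person sees the scene from roughly standing eye height. Here is a minimal TypeScript sketch of that idea; the `Pose` type, clamp range, and function names are illustrative assumptions of ours, not part of any Meta SDK.

```typescript
// Hypothetical sketch: these types and names are not from a real Meta/Quest SDK.
interface Vec3 { x: number; y: number; z: number; }
interface Pose { position: Vec3; /* orientation omitted for brevity */ }

// A user-configurable vertical offset, e.g. set from an accessibility menu.
let heightOffsetMeters = 0;

function setHeightOffset(meters: number): void {
  // Clamp to a plausible range so the viewpoint can't be shifted absurdly far.
  heightOffsetMeters = Math.min(Math.max(meters, -0.5), 1.0);
}

// Called every frame: lift the tracked head pose before rendering so a
// seated user sees the scene from (roughly) standing eye height.
function adjustedViewPose(trackedHead: Pose): Pose {
  return {
    position: {
      x: trackedHead.position.x,
      y: trackedHead.position.y + heightOffsetMeters,
      z: trackedHead.position.z,
    },
  };
}

// Example: a seated user raises their viewpoint by 40 cm.
setHeightOffset(0.4);
```

The real feature presumably also has to keep the floor plane and Guardian boundary consistent with the shifted viewpoint, which is where much of the design and testing work described above comes in.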

Adjusting to the hardware development process

The product development process for hardware differs from the software design process in many ways, including when design aspects such as accessibility must be locked into the final product. Once a VR or AR hardware product gets past the beginning stages of development (prototypes) and moves into manufacturing, even the smallest design change can become prohibitively expensive or impossible to make.

In our first round of accessibility evaluations after the launch of our Meta Quest 2 headset, for example, we discovered that its external buttons were not accessible to some users with visual disabilities. The buttons didn't differ in color or have raised tactile features that would sufficiently distinguish them from the body of the headset. At that point in the launch process, the design of the buttons could no longer be changed, so the issue was added to the list of changes for future models instead. Those future headsets, we're glad to report, will benefit from earlier accessibility research that allows us to identify issues in time to make changes to the hardware.

Redesigning research

VR and AR research with participants who have disabilities requires us to design and conduct the research in more accommodating and flexible ways. One example is usability testing with Deaf participants who need sign language interpretation. When testing mobile or web products, Deaf participants can easily shift their line of sight to watch either the interpreter or the task at hand. When testing with a VR headset, however, a participant would have to repeatedly take off the headset to see the interpreter, then put it back on to continue the task. We had to think outside the box: either casting video of the interpreter into VR, or mirroring a window into VR where we could type instructions and questions as text. This way, participants are able to complete the testing in situ without switching contexts.
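
As a rough illustration of the second approach, here is a minimal TypeScript sketch of mirroring moderator-typed text into a panel rendered inside the headset. The WebSocket endpoint and the `CaptionPanel` type are hypothetical stand-ins, not our actual study tooling.

```typescript
// Hypothetical sketch of mirroring a moderator's typed text into VR.
// The endpoint and panel interface are illustrative, not real tooling.
interface CaptionPanel {
  setText(text: string): void; // renders text on a panel inside the headset
}

function connectModeratorChannel(panel: CaptionPanel): WebSocket {
  const socket = new WebSocket("wss://example.test/study-session");
  socket.onmessage = (event: MessageEvent<string>) => {
    // Each message is an instruction or question typed by the moderator.
    // It appears inside the headset, so a Deaf participant never has to
    // remove the headset mid-task to receive it.
    panel.setText(event.data);
  };
  return socket;
}
```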

As many AR and VR studies are carried out in our labs, we are committed to making our testing spaces inclusive and welcoming to participants with disabilities. The value of all these efforts extends beyond just getting the research done, however. The process of identifying and addressing various needs in accessibility research has been a key part of understanding people's actual product needs. Directly engaging with people with disabilities has also planted the seeds for creative solutions like the Adjust Height feature.

Bringing established learnings into new territory

Another challenge stems from the newness of the AR/VR product space, which is still in its infancy compared to other types of technology. That means we have to rely on some established findings that we know won't apply perfectly — or, in some cases, at all — to the metaverse. Learnings from 2D interfaces often have to be rethought for 3D spaces.

Even well-established accessibility features like closed captions raise fundamental questions in VR and AR. For example: Where should captions be displayed in a 3D space? Should they be stationary, or follow the user’s head movement? When people are in social apps or situations with bystanders, should they be able to select who gets captions? Or should there be a simple radius limitation around them to protect the privacy of other users who aren’t interacting with them?
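
To make two of these options concrete, here is a minimal TypeScript sketch of a head-locked caption anchor and a radius check that captions only nearby speakers. The types, distances, and function names are illustrative assumptions, not a shipped design.

```typescript
// Illustrative only: these types are not from any real VR SDK.
interface Vec3 { x: number; y: number; z: number; }
interface HeadPose { position: Vec3; forward: Vec3; } // forward: unit vector

// Head-locked placement: keep the caption 1.5 m in front of the viewer,
// slightly below eye level, so it follows head movement.
function headLockedCaptionPosition(head: HeadPose): Vec3 {
  const distance = 1.5; // meters in front of the viewer
  return {
    x: head.position.x + head.forward.x * distance,
    y: head.position.y + head.forward.y * distance - 0.3,
    z: head.position.z + head.forward.z * distance,
  };
}

// Radius limitation: only caption speakers within captionRadius meters,
// so bystanders who aren't interacting with the user aren't transcribed.
function shouldCaption(listener: Vec3, speaker: Vec3, captionRadius = 3): boolean {
  const dx = speaker.x - listener.x;
  const dy = speaker.y - listener.y;
  const dz = speaker.z - listener.z;
  return Math.hypot(dx, dy, dz) <= captionRadius;
}
```

A world-locked alternative would instead pin each caption near its speaker, trading easy readability for a clearer sense of who is talking.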

We also want to make sure our features translate well to a 3D environment, such as giving people time to read and process information or move through space. For example, text captions in large social situations can easily become overwhelming when multiple people are talking at the same time in all directions and everyone has a virtual "caption bubble." How might we design VR captions that don't create visual clutter and cognitive overload? It's a new consideration in 3D design.
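
One plausible mitigation, offered here as an assumption rather than a shipped design, is to cap how many caption bubbles are visible at once and prioritize the closest or most recently active speakers. A small TypeScript sketch:

```typescript
// Hypothetical clutter control: show at most a few caption bubbles at once.
interface ActiveCaption {
  speakerId: string;
  text: string;
  lastSpokeAtMs: number;   // timestamp of the speaker's latest utterance
  distanceMeters: number;  // distance from the listener
}

function visibleCaptions(all: ActiveCaption[], maxVisible = 3): ActiveCaption[] {
  return [...all] // copy so the input list isn't mutated
    // Prefer nearby speakers; break ties by who spoke most recently.
    .sort((a, b) =>
      a.distanceMeters !== b.distanceMeters
        ? a.distanceMeters - b.distanceMeters
        : b.lastSpokeAtMs - a.lastSpokeAtMs)
    .slice(0, maxVisible);
}
```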

While conducting research on emerging technology like AR/VR presents its own unique challenges, it is also some of the most gratifying work we do. We're excited to be setting accessibility standards for VR and AR that will be replicated and built on by others in the future, which brings us to the current state and goals of our accessibility research for the metaverse.

What’s next

The past few years have yielded valuable insights not only on accessibility in the metaverse, but also on our own processes and how they can better serve this important work. We're just scratching the surface of how our products can enhance people's everyday lives. In addition to becoming more proactive in building a better UX for people with disabilities, we're starting to identify unique benefits that VR and AR can offer. We're doing research to help us build hardware and software features that allow people to experience things they otherwise couldn't.

Earlier this year, we hosted a design workshop to innovate on accessible experiences in the future metaverse. With our 5–10 year vision for the metaverse in mind, we examined questions like: How will people with disabilities experience communication, gaming, and self-care in the metaverse? How might we design the metaverse so that they’ll fully enjoy and benefit from it? It’s important to us to answer these questions so the metaverse is accessible for everyone.

Authors: Sheila Meldrum (primary author), UX Researcher at Meta;
Yao Ding, Accessibility Researcher at Meta

Contributor: Carolyn Wei, UX Researcher at Meta

Illustrations: Drew Bardana
