Creating Virtual and Augmented Reality Experiences with Machine Learning + UX (MLUX)

Bob Stark
Machine Learning and UX
5 min read · Feb 11, 2020
Photo from MLUX x ARVR Academy event with Liv Erickson and Katie Hughes.

Virtual and augmented reality are young fields with very exciting futures. Not long ago, virtual worlds mostly meant Second Life and first-person shooter video games; today there are virtual reality headsets like the Oculus Rift and mobile augmented reality games like Pokémon Go. To help us navigate this new landscape, Liv Erickson and Katie Hughes from ARVR Academy shared some of their expertise and insights with us!

Empathize with your users

Because the hardware and environments will be used by a global audience, it is particularly important to consider the needs of each individual user. There are few established best practices at this point, but here are some places to start:

Don’t make your users uncomfortable

Stereoscopic 3D is known to cause nausea, so be mindful that this is a real risk for some of your users. Also, realize that many of them have likely never used a VR device before, so help them with the use and maintenance (e.g., cleaning) of the device.

Remember users come in all shapes and sizes

Not everybody in VR is a white, straight, 5’9”, middle-aged man. Different people have different desires in how to represent themselves in a virtual space and different needs for inputs and outputs.

Users may sometimes want their avatar to be a visual representation of what they look like in real life, but at other times may not, for a variety of reasons. Likewise, if voice communication is available instead of or in addition to text, they may or may not want to use their own voice.

“In social VR, no one knows you’re actually a carrot” (taken in Mozilla Hubs)
“What we thought the user was seeing” vs “what the user was actually seeing” (from @beastpets)

Users may also have difficulties with, or different preferences for, interacting with the environment. For example, people of different sizes and heights will experience objects in a virtual space differently because of the device’s real-time calibration.

Furthermore, users with disabilities will have different needs for input into VR and AR (e.g., if they cannot use their hands) and for output (e.g., if they cannot see or hear well, or at all). Therefore, build accessibility in from the start and test with as many and as diverse a set of people as possible, early and often.

Draw inspiration from real life

Virtual environments are intended to mimic some or all aspects of real life. Therefore, when applicable, consider modalities beyond the purely visual, such as haptics and audio. These communicate information more naturally and deepen immersion in the environment. For example, grounding a virtual environment in the haptics and sounds of a real one lets the user feel the heft of a rock or hear the animals in a forest.

The VR game Moss is a good example of using haptic and audio feedback to create a realistic, immersive forest environment.
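To make this concrete, here is a minimal sketch of how a WebXR app might pair a short haptic pulse with a spatialized pickup sound when the user grabs an object. This was not shown at the panel; the `onGrab` handler and `pickupSound` buffer are illustrative names only, WebXR type definitions are assumed to be available, and haptic actuator support still varies by browser and headset.

```typescript
// Sketch: pair haptic and audio feedback with a grab interaction in WebXR.
// Assumes a running WebXR session; names here are illustrative, not a standard API surface.
const audioCtx = new AudioContext();

function onGrab(inputSource: XRInputSource, pickupSound: AudioBuffer): void {
  // Short haptic pulse on the controller that performed the grab
  // (GamepadHapticActuator.pulse is experimental; support varies by device).
  const actuator = inputSource.gamepad?.hapticActuators?.[0];
  actuator?.pulse(0.6, 100); // 60% intensity for 100 ms

  // Play a spatialized "pickup" sound so the feedback is heard, not just felt.
  const source = audioCtx.createBufferSource();
  source.buffer = pickupSound;
  const panner = audioCtx.createPanner();
  panner.panningModel = "HRTF";
  source.connect(panner).connect(audioCtx.destination);
  source.start();
}
```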

Just get started!

Thanks to a growing variety of tools and resources, creating VR environments and models is becoming as simple as editing a photo. You don’t need to know how to code, or even how to build 3D models in a modeling toolkit. In fact, many skills from other fields transfer directly to VR and AR. For example, user experience methodology in VR is much the same as in other fields, even though the hardware and the 360-degree environment are new.

To get started, try out as much VR as possible; there are many different technologies and experiences you may want to contribute to. Also, use resources both online and offline to get acquainted with how those experiences are created. Online, there are many tutorials on sites like YouTube; offline, there are hackathons, classes, and workshops held around the world. Whichever resources you end up using, they will also connect you with community members and supporters who can aid you on your journey.

Create the future of VR and AR!

With your newfound skills and connections, you now have the power to enhance social interactions and bring novel experiences to people around the world! Social interactions can be enhanced by providing a shared space or shared reality for social groups, and by letting people communicate personal information about themselves safely. Novel experiences can be created with new hardware and machine learning technologies: brain-computer interfaces that react to users’ moods, gaze tracking for navigating an environment, hand tracking for manipulating objects in the virtual environment, and voice input for conversing with intelligent agents.
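To give a flavor of one of those inputs, here is a minimal sketch of detecting a pinch gesture with the WebXR Hand Input API: it compares the distance between the thumb-tip and index-finger-tip joints each frame. The session is assumed to have the hand-tracking feature enabled, and the 2 cm threshold is purely an illustrative value, not a recommendation.

```typescript
// Sketch: detect a "pinch" by measuring thumb-tip to index-finger-tip distance.
// Assumes a WebXR session requested with the "hand-tracking" feature.
const PINCH_THRESHOLD_M = 0.02; // ~2 cm between fingertips (illustrative value)

function isPinching(frame: XRFrame, hand: XRHand, refSpace: XRReferenceSpace): boolean {
  const thumb = frame.getJointPose(hand.get("thumb-tip")!, refSpace);
  const index = frame.getJointPose(hand.get("index-finger-tip")!, refSpace);
  if (!thumb || !index) return false; // joints may not be tracked this frame

  const dx = thumb.transform.position.x - index.transform.position.x;
  const dy = thumb.transform.position.y - index.transform.position.y;
  const dz = thumb.transform.position.z - index.transform.position.z;
  return Math.hypot(dx, dy, dz) < PINCH_THRESHOLD_M;
}
```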

Big thank you to our panelists and sponsors for sharing their expertise with us!

Liv Erickson: Senior Product Manager, Mozilla Mixed Reality and co-founder of ARVR Academy

Katie Hughes: Product Designer at Facebook and alum of ARVR Academy

Thank you to Microsoft and ARVR Academy for co-sponsoring the event and hosting it in the Microsoft Reactor space!

About the Machine Learning and User Experience (“MLUX”) Meetup

We’re excited about creating a future of human-centered smart products, and we believe the first step is bringing UX and data science/machine learning folks together to learn from each other at regular meetups, tech talks, panels, and events. We also have a resource list of our favorite MLUX examples on Medium!

Interested in learning more? Join our meetup, be the first to know about our events by joining our mailing list (February 2020 newsletter), watch past events on our YouTube channel, and follow us on Twitter (@mluxmeetup) and LinkedIn.
