Designing for Mixed Reality on the Hololens

The future of work?

This week, over 350 participants from around the world gathered at the MIT Media Lab for the Reality, Virtually Hackathon.

Organized into teams of 3–5 people, we created games, professional tools, storytelling experiences, and medical technology. From teaching refugee children to speak English and improving the memory of people with autism, to playing frisbee golf on an asteroid belt in outer space, mixed reality has the potential to open new avenues across so many fields.

Initial Reactions to Mixed Reality

Eugene Chung, of the VR-focused Penrose Studios, predicts that VR and AR will converge as an industry, with AR eventually taking the lead. After spending the weekend immersed in augmented reality, though, I was struck by how many design considerations differ between the two today.

Bodhi rocking the Hololens

The initial sensation of putting on the Hololens is beautiful. It’s lighter and less jarring than a VR headset. Since you see the room around you, interacting with a hologram in actual space feels almost natural (unlike putting on a Rift or a Vive).

Everyday applications of the Hololens are compelling — but its limitations are evident.

  1. This thing is designed for the indoors. Infrared light doesn’t play well outside, and that’s setting aside the social implications of walking around with a Hololens on your face.
  2. Battery life is limited. An experience needs to account for a 2–3 hour battery life if it requires the user to be untethered.
  3. Arm fatigue is real. I envisioned throwing on a headset and waving my arms around like a conductor for hours. In practice, you’re pretty tired after 10 minutes and glad to lean on voice commands (more on that later).
  4. The field of view — whoa. I also envisioned putting on a Hololens and swimming in holographic magic. In reality, you put on the headset and look through a 16:9 window into a field of holographic magic. Rumor has it this will expand soon, but for now, it’s one of the biggest design considerations to keep in mind.
From the Moments team on Medium

One of the most exciting things about working in augmented and mixed reality is the opportunity to influence best practices and design patterns. That said, if you’re considering a mixed reality application, mentors from UCLA, Microsoft, and Harvard shared critical insights that will save you time in design and development.

The team that I joined set out to explore how mixed reality could be a tool for self-improvement. We worked on a focus application that guided you through a task to improve your productivity.

In the design field, I feel that with every new technology we get better at learning and formalizing good design faster — I’m very hopeful to see that happen here. Here are a few things I learned about designing in a world of holograms.

Awareness of Real Space — The Room

Spatial awareness is a fascinating frontier in digital design! Before putting pen to paper, ask the following:

  1. Is the user sitting or standing?
  2. What’s their proximity to the task at hand?
  3. What does the room look like? What’s the user’s mobility? How much room do they have to navigate?
  4. Am I going to intercept windows or glossy surfaces? The Hololens can’t map those.

This was so much fun to work with when designing for AR. You aren’t just playing with simulation — you’re interacting with virtual elements in the physical world and have both in your tool kit.

Consider using physical devices like QR-code-equipped blocks as a metaphor for anything from a controller to moving walls in a video game. With spatial mapping in the Hololens, you can pin objects to a surface and drop them to the lowest plane (i.e. set them on the floor or a table).

This is especially useful when you realize the limitations of gesture control.
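To make the “drop to the lowest plane” idea concrete, here is a minimal sketch in plain illustrative Python (made-up names, not the actual Unity/HoloLens spatial-mapping API) of snapping a hologram onto the highest mapped surface beneath it:

```python
# Illustrative sketch only: snap a hologram down to the nearest mapped
# horizontal surface below it, e.g. a table or the floor.

def drop_to_lowest_plane(object_y, surface_heights):
    """Return the height of the highest mapped surface at or below the object.

    object_y        -- current height of the hologram, in meters
    surface_heights -- heights of horizontal planes found by spatial mapping
    """
    planes_below = [h for h in surface_heights if h <= object_y]
    # If nothing was mapped below the object (say, because the sensors missed
    # a glossy surface), leave it where it is.
    return max(planes_below) if planes_below else object_y

# A hologram floating at 1.2 m settles onto a table mapped at 0.75 m.
print(drop_to_lowest_plane(1.2, [0.0, 0.75]))  # -> 0.75
```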


Spatial Awareness: Holographic Space

The fact that everything exists in real space means you can walk behind things you place around the room.

I was super weirded out that floating application screens were blue dead space when I walked behind them, which was a good reminder to consider all dimensions in the design phase.

The field of view may tempt designers to generate ‘screen’ experiences — but despite the small window, some of the more magical moments were things like looking up at the head of a brontosaurus towering over you. Design for the whole room!

Placing applications in Hololens.

That said, you can lose holograms. After ‘placing’ them around the room, you can turn away and lose track. It’s like having too many tabs open on a browser! Consider how a user will clear an activity.

Diana Ford of the UCLA game lab suggested ‘body locking’ content in an application rather than ‘display locking’ it. It took me a while to figure out what that meant: it’s the difference between opening a recipe over the stove while cooking and having it follow you to the dinner table at the distance you spawned it (body locked), versus leaving it hanging over the stove when you walk away (display locked). We used display locking, and I can see why she recommended body locking: you don’t want to leave a floating recipe in the kitchen and forget about it.
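If it helps to see the distinction, here is a minimal sketch of the idea in plain illustrative Python (hypothetical names, not Unity C# or any real HoloLens API): a body-locked hologram has its position recomputed relative to the user every frame, while the alternative simply keeps the coordinates it was given when it was spawned.

```python
# Illustrative sketch only: 'body locking' keeps a hologram a fixed distance
# in front of the user, recomputed every frame from the headset's pose.

def body_locked_position(camera_pos, camera_forward, spawn_distance):
    """Keep the hologram spawn_distance meters ahead of wherever the user is."""
    x, y, z = camera_pos
    fx, fy, fz = camera_forward  # assumed to be a unit vector
    return (x + fx * spawn_distance,
            y + fy * spawn_distance,
            z + fz * spawn_distance)

# The recipe spawned 1.5 m ahead follows you to the dinner table...
print(body_locked_position((4.0, 1.6, 2.0), (0.0, 0.0, 1.0), 1.5))  # -> (4.0, 1.6, 3.5)

# ...whereas the locking we used simply keeps the position it was given at the stove.
recipe_left_at_stove = (0.0, 1.6, 1.5)
```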

The ratios you’re working with in code are real-world coordinates: a distance of 1 in Unity equals 1 meter from the user.

The best place to engage a user is 1–5 meters in front of their face (and therefore the camera in the headset). The fact that the field of view is limited means a few things:

  • Discovery is important. Spawn things in the optimal zone so a user can assess the object before extending it elsewhere (see the sketch after this list).
  • A user can lose things in space by turning around.
  • Objects can get cut off by the field of view. The scale of an object can be confusing if it’s cut off on more than two sides.
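To put those numbers together, here is a minimal sketch (illustrative Python with hypothetical names; a Unity project would do this in C#) of spawning an object along the user’s gaze while clamping it into the 1–5 meter comfort zone:

```python
# Illustrative sketch only: spawn a hologram along the gaze direction,
# clamped into the comfortable 1-5 m zone. In Unity a distance of 1 is
# 1 meter, so these numbers translate directly.

COMFORT_NEAR, COMFORT_FAR = 1.0, 5.0

def spawn_along_gaze(camera_pos, gaze_dir, requested_distance):
    """Clamp the requested spawn distance into the comfort zone and return a position."""
    d = min(max(requested_distance, COMFORT_NEAR), COMFORT_FAR)
    return tuple(c + g * d for c, g in zip(camera_pos, gaze_dir))

# Asking for 8 m still spawns the object 5 m out, where the user can assess it first.
print(spawn_along_gaze((0.0, 1.6, 0.0), (0.0, 0.0, 1.0), 8.0))  # -> (0.0, 1.6, 5.0)
```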

Finally, I would never have thought about spatial sound — but it’s critical. If you walk away from an audio source, the volume should get quieter. How cool is that?!
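As a toy illustration (a simplified rolloff model, not the actual HoloLens audio engine), spatial sound largely comes down to attenuating a source’s volume with the listener’s distance:

```python
# Illustrative sketch only: volume falls off with distance from the source,
# so walking away from a hologram makes it quieter.

def attenuated_volume(base_volume, distance, rolloff_start=1.0):
    """Simple inverse-distance rolloff beyond rolloff_start meters."""
    if distance <= rolloff_start:
        return base_volume
    return base_volume * (rolloff_start / distance)

print(attenuated_volume(1.0, 1.0))  # next to the source  -> 1.0
print(attenuated_volume(1.0, 4.0))  # four meters away    -> 0.25
```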

Gaze, Gesture and Embracing Voice Command

Moving objects in mixed reality is a fascinating bit of interaction design ripe for innovation.

Your primary superpower is your gaze — a cursor at the center of your line of sight is akin to a cursor on a screen. Grab something with a tap of your finger, and you can swing it around the room with a twist of the head.

The ready state for hand recognition

Manipulating objects is much harder in AR than in VR. Moving an object forward? You can only go as far as your arm can extend past the object’s current point.

Everything you gain with a controller while using a virtual reality headset is gone. Sizing objects in Hololens, for example, requires manipulating a bounding box with taps and head movements.

The Hololens only recognizes your hand in three initial states:

  1. Ready state (one finger up).
  2. Pressed state.
  3. Bloom, a motion that opens the control menu.

After selecting an object, you can pinch to hold — then either navigate through space or navigate on an axis prescribed by the application (e.g. to move a slider).
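Here is a rough sketch of what “navigate on an axis” means in practice (illustrative Python with made-up names, not the real gesture API): the application projects the hand’s displacement onto a prescribed axis and maps the result onto a value such as a slider position.

```python
# Illustrative sketch only: constrain a pinch-and-drag gesture to one axis
# by projecting the hand's movement onto it.

def project_onto_axis(hand_delta, axis):
    """Dot product of the hand's displacement with a unit axis vector."""
    return sum(d * a for d, a in zip(hand_delta, axis))

def update_slider(value, hand_delta, axis=(1.0, 0.0, 0.0), sensitivity=2.0):
    """Move the slider by the hand's travel along the prescribed axis, clamped to 0-1."""
    value += project_onto_axis(hand_delta, axis) * sensitivity
    return min(max(value, 0.0), 1.0)

# A 10 cm hand movement to the right nudges the slider from 0.0 to 0.2;
# the slight vertical drift is ignored.
print(update_slider(0.0, (0.10, 0.03, 0.0)))  # -> 0.2
```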

We found ourselves pining for a Leap Motion the whole time, though I don’t think it plays well with the Hololens — yet. I found that, at the moment, the interface revolves too much around the practice of ‘clicking buttons’ in what should be a whole new paradigm of interaction.

Which brings me to voice command.

Voice, and the ability to hold a discussion with our technological tools, is a coming tsunami of interaction. It was exciting to put it to use in virtual space. We got tight with Cortana this weekend, and it turns out it’s quite easy to turn a user’s voice into a text command the Hololens can react to.
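For a flavor of how simple the pattern is, here is a conceptual sketch in plain Python (not the actual HoloLens or Cortana speech API, and the phrases are invented for a hypothetical focus app): voice control largely comes down to mapping recognized phrases onto application actions.

```python
# Illustrative sketch only: dispatch phrases that have already been
# transcribed to text onto handlers in the application.

voice_commands = {
    "start focus session": lambda: print("Timer started"),
    "next step":           lambda: print("Advancing to the next step"),
    "place here":          lambda: print("Hologram pinned to the current surface"),
}

def handle_utterance(transcribed_text):
    """Run the action registered for the recognized phrase, if any."""
    action = voice_commands.get(transcribed_text.strip().lower())
    if action:
        action()

handle_utterance("Next step")  # -> Advancing to the next step
```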

Embrace voice commands. Gesture control is hard to develop, and voice input is an effortless way for a user to perform a task with the headset on.

Overall, augmented reality is promising — and I can’t wait to design for it. Held at the Media Lab, one of the world’s best playgrounds for new ideas, the Reality, Virtually Hackathon was the kind of exhausting, non-stop creative whirlwind that leaves you buzzing.

The organizers did a wonderful job of, above all, gathering people from all walks of life, industries and experience levels, and I can’t wait to see more from them and the tools we used.

Team IdealsAR, powered by Soylent®
