Designing for Mixed Reality on the HoloLens

The future of work?

I spent the last 4 days immersed in the future potential and awe-inspiring present of mixed reality at the Reality, Virtually Hackathon at the MIT Media Lab.

It was huge — over 350 participants, organized into teams of 3–5, created games, professional tools, storytelling experiences, and medical technology to do everything from teaching refugee children English and improving memory in people with autism to playing frisbee golf on an asteroid belt in outer space.

Cause why not!

Initial Reactions to Mixed Reality

Since I’d never had my hands on a HoloLens, I decided to focus on uncovering what I didn’t know about the design, prototyping and development of an experience in mixed reality, as opposed to a completely virtual environment.

Eugene Chung, of the VR-focused Penrose Studios, predicts that VR + AR will converge as an industry, with AR eventually taking the lead. After spending the weekend immersed in augmented reality though, I was struck by how many design considerations differed between the two.

Bodhi rocking the HoloLens

The initial sensation of putting on the HoloLens was incredible. It’s lighter, it’s less jarring than a VR headset since you see the room around you, and interacting with a hologram in actual space is unreal. Unlike putting on a Rift or a Vive, it felt normal within minutes.

Everyday applications of the Hololens are pretty clear — and so are its current limitations. The Microsoft team pointed out some useful guidelines that highlighted the current state of the hardware:

  1. This thing is designed for the indoors. Infrared light doesn’t play well outside, quite apart from the social implications of walking around with a HoloLens on your face.
  2. Battery life is limited. An experience needs to account for a 2–3 hour battery life if it requires the user to be untethered.
  3. Arm fatigue is real. I envisioned throwing on a headset and waving my arms around like a conductor for hours. In practice, you’re pretty tired after 10 minutes and glad to lean on voice command (more on that later).
  4. The field of view — whoa. I also envisioned putting on a HoloLens and swimming in holographic magic. In reality, you put on the headset and look through a 16:9 window into a field of holographic magic. Rumor has it this will expand soon, but for now, it’s one of the biggest design considerations to keep in mind.
From the Moments team on Medium

By far the most exciting thing about working in augmented and mixed reality is the lack of best practices. It’s a book waiting to be written.

That said, mentors from UCLA, Microsoft, Harvard, and elsewhere had critical insights that saved us a bunch of time over the weekend. With every new technology, I feel the design field is better prepared to learn and formalize good design quickly, and I’m very hopeful to see that happen here.

Our team set out to explore how mixed reality could be a tool for self-improvement. We worked on a focus application that guided you through a task, improving your productivity.

Here are a few things I learned about designing in a world of holograms.


Awareness of Real Space — The Room

Spatial awareness is a fascinating frontier in digital design! Before putting pen to paper, ask the following:

  1. Is the user sitting or standing?
  2. What’s their proximity to the task at hand?
  3. What does the room look like? How mobile is the user? How much room do they have to navigate?
  4. Am I going to run into windows or glossy surfaces? The HoloLens can’t map those.

Real space was so much fun to work with when designing for AR. You aren’t just playing with a simulation — you’re interacting with virtual elements in the physical world, and you have both in your toolkit.

Consider using physical devices like QR-code-equipped blocks as a metaphor for anything from a controller to moving walls in a video game. With spatial mapping in the HoloLens, you can pin objects to a surface and drop them to the lowest plane (i.e., set them on the floor or a table).
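For the curious, here’s roughly what that kind of placement looks like in a Unity script. It’s a minimal sketch, and it assumes the spatial mapping mesh colliders sit on a layer named "SpatialMapping" (that layer name is my own choice, not something built in):

```csharp
using UnityEngine;

// Sketch: attach to a hologram to snap it down onto the nearest mapped
// surface below it (a tabletop or the floor). Assumes the spatial mapping
// mesh colliders live on a layer named "SpatialMapping".
public class SnapToSurface : MonoBehaviour
{
    void Start()
    {
        int mappingLayer = LayerMask.GetMask("SpatialMapping");
        RaycastHit hit;

        // Cast straight down from the hologram and rest it on whatever we hit.
        if (Physics.Raycast(transform.position, Vector3.down, out hit, 10f, mappingLayer))
        {
            transform.position = hit.point;
        }
    }
}
```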

This is especially useful when you realize the limitations of gesture control.

Spatial Awareness: Holographic Space

The fact that everything exists in real space means you can walk behind things you place around the room.

I was super weirded out that floating application screens were blue dead space when I walked behind them, a good reminder to consider all dimensions in the design phase.

The field of view may tempt designers to generate ‘screen’ experiences — but despite the small window, some of the more magical moments were things like looking up at the head of a brontosaurus towering over you. Design for the whole room!

Placing applications in HoloLens.

That said, you can lose holograms. After ‘placing’ them around the room, you can turn away and lose track. It’s like having too many tabs open on a browser! Consider how a user will clear an activity.

Diana Ford of the UCLA Game Lab suggested ‘body locking’ content instead of ‘display locking’ holograms in an application. It took me a while to figure out what that meant. With display locking, you open a recipe over the stove while cooking and it stays there when you go to the dinner table; with body locking, the recipe follows you to the table at the same distance you spawned it. She recommended body locking. We used display locking, and I can see how you’d want to avoid leaving a floating recipe in the kitchen and forgetting about it.
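In Unity terms, the difference is simply whether you keep repositioning the hologram relative to the camera every frame. Here’s a rough body-locking sketch; the 2-meter follow distance and the smoothing value are placeholder numbers, not anything official:

```csharp
using UnityEngine;

// Sketch: body-locked content that trails the user at a fixed distance.
// A display-locked (world-locked) hologram would skip this update entirely
// and simply stay wherever it was placed.
public class BodyLock : MonoBehaviour
{
    public float followDistance = 2f;   // in meters, since 1 Unity unit = 1 meter
    public float smoothing = 3f;        // higher values make the follow snappier

    void Update()
    {
        Transform head = Camera.main.transform;
        Vector3 target = head.position + head.forward * followDistance;

        // Ease toward the target so the hologram doesn't feel glued to the head.
        transform.position = Vector3.Lerp(transform.position, target, smoothing * Time.deltaTime);
    }
}
```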

The units you work with in code are real-world coordinates, i.e., a distance of 1 in Unity is 1 meter away from the user.

The best place to engage a user is between 1 and 5 meters in front of their face (and therefore the camera in the headset). The fact that the field of view is limited means a few things:

  • Discovery is important. Spawn things in the optimal zone so a user can assess the object before extending it elsewhere (there’s a sketch of this right after the list).
  • A user can lose things in space by turning around.
  • Objects can get cut off by the field of view. The scale of an object can be confusing if it’s cut off on more than 2 sides.
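Putting the units and the optimal zone together, spawning a hologram where the user will actually discover it might look like the sketch below. The prefab reference and the 2-meter distance are assumptions for illustration:

```csharp
using UnityEngine;

// Sketch: spawn a hologram in the comfortable 1-5 meter zone, directly along
// the user's current gaze, so it's in view the moment it appears.
public class SpawnInView : MonoBehaviour
{
    public GameObject hologramPrefab;   // assumed prefab reference, assigned in the editor
    public float spawnDistance = 2f;    // meters in front of the camera

    public void Spawn()
    {
        Transform head = Camera.main.transform;
        Vector3 position = head.position + head.forward * spawnDistance;

        // Face the hologram back toward the user when it appears.
        Quaternion rotation = Quaternion.LookRotation(head.forward);
        Instantiate(hologramPrefab, position, rotation);
    }
}
```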

Finally, I would never have thought about spatial sound — but it’s critical. If you walk away from an audio source, the volume should get quieter. How cool is that?!
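Unity handles that falloff for you once an audio source is made fully 3D; here’s a minimal setup sketch, assuming a spatializer plugin (like the Microsoft HRTF spatializer) is selected in the project’s audio settings:

```csharp
using UnityEngine;

// Sketch: make a hologram's audio source fully 3D so the volume falls off
// as the user walks away from it in the room.
[RequireComponent(typeof(AudioSource))]
public class SpatialAudioSetup : MonoBehaviour
{
    void Start()
    {
        AudioSource source = GetComponent<AudioSource>();
        source.spatialBlend = 1f;                          // 0 = flat 2D, 1 = fully 3D
        source.rolloffMode = AudioRolloffMode.Logarithmic; // quieter with distance
        source.spatialize = true;                          // hand off to the spatializer plugin, if one is configured
    }
}
```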

Gaze, Gesture and Embracing Voice Command

Moving objects in mixed reality is a fascinating bit of interaction design ripe for innovation.

Your primary superpower is your gaze — a cursor at the center of your line of sight is akin to a cursor on a screen. Grab something with a tap of your finger, and you can swing it around the room with a twist of the head.
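Under the hood, that cursor is essentially a raycast from the head pose every frame. Here’s a rough sketch of the idea, where cursorVisual is whatever small ring or dot you want to show (my naming, nothing built in):

```csharp
using UnityEngine;

// Sketch: a gaze "cursor" that sits wherever the user's line of sight hits a
// hologram, and hides when nothing is in the way. Assumes the cursor visual
// itself has no collider, so the ray can't hit it.
public class GazeCursor : MonoBehaviour
{
    public GameObject cursorVisual;   // a small quad or ring, assigned in the editor

    void Update()
    {
        Transform head = Camera.main.transform;
        RaycastHit hit;

        if (Physics.Raycast(head.position, head.forward, out hit, 10f))
        {
            cursorVisual.SetActive(true);
            cursorVisual.transform.position = hit.point;
            cursorVisual.transform.rotation = Quaternion.LookRotation(hit.normal); // hug the surface
        }
        else
        {
            cursorVisual.SetActive(false);
        }
    }
}
```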

The ready state for hand recognition

Manipulating objects is much harder in AR than VR. Moving an object forward? You can only go as far as your arm can extend past the object’s current point.

Everything you gain with a controller while using a virtual reality headset is gone. Sizing objects in HoloLens, for example, requires manipulating a bounding box with taps and head movements.

The HoloLens only recognizes your hand in three basic states:

  1. Ready state (one finger up).
  2. Pressed state.
  3. Bloom, a motion that opens the Start menu.

After selecting an object, you can pinch to hold — then either navigate through space, or navigate on an axis prescribed by the application (e.g., to move a slider).
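For what it’s worth, wiring these gestures up in Unity looked roughly like the sketch below. The namespace shown is the HoloLens-era one (UnityEngine.VR.WSA.Input); later Unity versions moved it to UnityEngine.XR.WSA.Input. The handler bodies are placeholders:

```csharp
using UnityEngine;
using UnityEngine.VR.WSA.Input;  // UnityEngine.XR.WSA.Input in newer Unity versions

// Sketch: wire up air-tap and single-axis navigation gestures,
// e.g. to drive a slider hologram. Handler bodies are placeholders.
public class GestureInput : MonoBehaviour
{
    GestureRecognizer recognizer;

    void Start()
    {
        recognizer = new GestureRecognizer();
        recognizer.SetRecognizableGestures(GestureSettings.Tap | GestureSettings.NavigationX);

        // Air tap: select whatever the gaze cursor is currently on.
        recognizer.TappedEvent += (source, tapCount, headRay) =>
        {
            Debug.Log("Tapped");
        };

        // Pinch and drag along X: offset.x runs from -1 to 1 relative to where
        // the pinch started, which maps naturally onto a slider.
        recognizer.NavigationUpdatedEvent += (source, offset, headRay) =>
        {
            Debug.Log("Slider value: " + offset.x);
        };

        recognizer.StartCapturingGestures();
    }

    void OnDestroy()
    {
        recognizer.StopCapturingGestures();
        recognizer.Dispose();
    }
}
```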

We found ourselves pining for a Leap Motion the whole time, which I don’t think plays well with HoloLens — yet. I found the interface currently revolves too much around the practice of ‘clicking buttons’ in what should be a whole new paradigm of interaction.

Which brings me to voice command.

Voice, and the ability to hold a conversation with our technological tools, is a coming tsunami of interaction. It was exciting to put it to use in virtual space. We got tight with Cortana this weekend, and it turns out it’s quite easy to turn a user’s voice into a text command the HoloLens can react to.
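Here’s a minimal sketch of what that looks like with Unity’s KeywordRecognizer; the phrases themselves are made up for illustration:

```csharp
using UnityEngine;
using UnityEngine.Windows.Speech;

// Sketch: react to a couple of spoken commands via Unity's KeywordRecognizer.
// The specific phrases are invented for this example.
public class VoiceCommands : MonoBehaviour
{
    KeywordRecognizer recognizer;

    void Start()
    {
        string[] keywords = { "start focus", "take a break" };
        recognizer = new KeywordRecognizer(keywords);
        recognizer.OnPhraseRecognized += args =>
        {
            Debug.Log("Heard: " + args.text);
            // ...switch on args.text here and trigger the matching action.
        };
        recognizer.Start();
    }

    void OnDestroy()
    {
        recognizer.Stop();
        recognizer.Dispose();
    }
}
```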

Embrace voice commands. Gesture control is hard to develop, and voice input is an effortless way for a user to perform a task with the headset on.


Overall, augmented reality is a blast — and I can’t wait to design for it. The Reality, Virtually Hackathon, held at the Media Lab, one of the world’s best playgrounds for new ideas, was the kind of exhausting, non-stop creative whirlwind that leaves you buzzing.

The organizers did a wonderful job of, above all, gathering people from all walks of life, industries and experience levels, and I can’t wait to see more from them and the tools we used.

Team IdealsAR, powered by Soylent®