Mixed Reality Design Toolkit (Part 1 of 5)

developed by Adelle Lin, Sara Camnasio, Sharon Lee, and Shuling Chen

HYPER-REALITY by Keiichi Matsuda.

In the past few years, technologies have emerged that allow users to experience virtual environments in increasingly immersive and realistic ways: “Augmented Reality,” where users see virtual elements superimposed on the camera image of their smartphones, and “Mixed Reality,” where users interact with holograms superimposed onto the real world around them.

Augmenting the physical world offers greater depth than standard screen-based interactions. As technology continues to improve the accessibility and ubiquity of these platforms, we are blurring the line between virtual and real.

Through this short five-part series, we share the insights we gathered from designing and developing an application for Mixed Reality (MR), specifically for the Microsoft HoloLens. Through our design principles and the introduction of a Mixed Reality Design Assessment Tool, we hope to help designers determine the minimum user-experience complexity their project requires. We tested our principles and frameworks through a specific case study: designing an app for art curators.


While developing our product, we realized the strength of Mixed Reality over Virtual Reality: Mixed Reality remains grounded in the real world and can incorporate the physical environment into the virtual experience. How could we strengthen this relationship between virtual and real? We explored three territories.


Metaphors are used often in technology: skeuomorphisms, for example, create intuitiveness and familiarity. Because MR consists of both virtual and real elements, skeuomorphisms and metaphors are reinvented there. This new medium allows us to use our embodied memory to understand metaphors that would otherwise have to be more literal in a completely virtual environment. A few examples:

  • The “trashcan” symbol for deleting assets and files can be replaced by anything that mimics a similar gesture or movement (throwing something away, discarding it).
  • The “Desktop” as a start screen and waypoint is familiar to all computer users. In Mixed Reality, a starter scene serves a similar purpose: the user can navigate to other scenes and features with the ease and familiarity of a home base.
Our app’s representation for artist information menu items.

We can create metaphors that take full advantage of the user’s ability to interact not only with flat 2D icons and graphics, but also with 3D objects and 3D space via voice, gesture, or body movement. In MR, we can use metaphors in more abstract ways, and skeuomorphic design is redefined.


Our app’s concept of the body activation zone interaction

Exploring the possibilities and opportunities found in MR, we built a concept into our own app called Body Activation. We believe it can be of use in any Mixed Reality project, as well as in Virtual Reality. Instead of depending only on a user’s gestures or voice as input, Body Activation is an input type that considers the user’s movement, placement, and gaze in the context of the environment and hologram layout: physically located zones trigger unique events or functions when the user steps into them or gazes upon them.
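The two triggers described above can be sketched in a few lines. This is a minimal illustration, not the app's actual code: the `ActivationZone` class and its spherical-zone geometry are hypothetical assumptions, standing in for whatever collider and gaze APIs a real HoloLens project would use.

```python
import math
from dataclasses import dataclass

@dataclass
class Vec3:
    x: float
    y: float
    z: float

@dataclass
class ActivationZone:
    """Hypothetical zone that fires when a user enters it or gazes at it."""
    name: str
    center: Vec3
    radius: float

    def contains(self, position: Vec3) -> bool:
        # Step-in activation: the user's position falls inside the zone's sphere.
        dx = position.x - self.center.x
        dy = position.y - self.center.y
        dz = position.z - self.center.z
        return math.sqrt(dx * dx + dy * dy + dz * dz) <= self.radius

    def is_gazed_at(self, head: Vec3, gaze: Vec3, max_angle_deg: float = 10.0) -> bool:
        # Gaze activation: the angle between the gaze ray and the direction
        # from the user's head to the zone center is under a threshold.
        to_zone = Vec3(self.center.x - head.x,
                       self.center.y - head.y,
                       self.center.z - head.z)
        dot = gaze.x * to_zone.x + gaze.y * to_zone.y + gaze.z * to_zone.z
        mag = (math.sqrt(gaze.x ** 2 + gaze.y ** 2 + gaze.z ** 2)
               * math.sqrt(to_zone.x ** 2 + to_zone.y ** 2 + to_zone.z ** 2))
        if mag == 0:
            return False
        angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / mag))))
        return angle <= max_angle_deg

zone = ActivationZone("artist-bio", center=Vec3(2.0, 0.0, 2.0), radius=1.0)
print(zone.contains(Vec3(2.5, 0.0, 2.0)))                    # user standing in the zone
print(zone.is_gazed_at(Vec3(0, 0, 0), Vec3(1.0, 0.0, 1.0)))  # user looking toward it
```

In a real scene, each frame the system would check the user's head position and gaze direction against every zone and fire the zone's event on a transition from inactive to active.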

When introducing Body Activation, we have to consider the full context of the user: How do they normally interact with the space? How can they use their bodies? Who are they with?

To activate the space, we then considered the environment the user would be interacting with: What is the user’s attention being drawn to? What surprise element could delight the user? What information is being revealed? How long would they want to spend in the space?

With Body Activation, we encourage users to move around the designed space. We want users to explore, anticipating feedback based on their own movement. We can also use Body Activation to guide users along specific routes when we want to constrain free-roam exploration.

Our app’s visualization of a guided experience through body activation. Note user input involves air tap, gaze, and motion, as well as viewing angle and context.


Waypoints allow users to check in with the system and navigate it. They act as a reference point and a safe place. A good example of a Waypoint in a complex navigation system is the departure and arrival board at an airport. Similarly, in our product, if users feel lost or confused during the experience, they can use a Waypoint to rediscover their position or navigate to a different feature. Essentially, it is a map the user can consult for guidance whenever they feel lost: a reassuring “You are here.” For our Waypoints, we again considered how we could make the digital environment more akin to the physical by using skeuomorphic design. The Waypoints became physical landmarks, giving developers an opportunity to provide breadcrumbs and system navigation.
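The “You are here” idea reduces to a nearest-landmark lookup. The sketch below is illustrative only: the `Waypoint` type and floor-plane coordinates are assumptions for the example, not the product's implementation.

```python
import math
from dataclasses import dataclass

@dataclass
class Waypoint:
    """Hypothetical waypoint: a named landmark the user can return to."""
    name: str
    x: float
    z: float  # floor-plane coordinates; height is ignored for navigation

def nearest_waypoint(waypoints, user_x, user_z):
    """Return the closest landmark -- the system's 'You are here' anchor."""
    return min(waypoints, key=lambda w: math.hypot(w.x - user_x, w.z - user_z))

gallery = [
    Waypoint("entrance", 0.0, 0.0),
    Waypoint("main-hall", 5.0, 3.0),
    Waypoint("artist-corner", 9.0, 1.0),
]
print(nearest_waypoint(gallery, 4.0, 2.5).name)  # prints "main-hall"
```

A breadcrumb trail falls out of the same structure: record the waypoints the user has activated, in order, and offer the reversed list as a route back to the starter scene.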

With these concepts in mind, we then explored how their implementation could shape the direction of our product’s user experience. In assessing user-experience needs, we focused specifically on the design of context, environment, and system in Mixed Reality. In the next article, we will discuss some of these design principles in detail through the MR Design Assessment Tool.