Designing Human Interactions for Mixed Reality

How might you interact with computing over the next 1–3 years?

This post shares some early observations drawn from my experience and what I learned in Coursera’s Interaction Design Specialization with design wizard Scott Klemmer.

Why Design & Spatial Computing?

Like many wonderful things, our story begins with Elon Musk and his 2013 SpaceX video on the “future of design,” which I stumbled across late one night in 2016 on YouTube.

“I believe we’re on the verge of a major breakthrough in design and manufacturing. To be able to take the concept of something from your mind, translate that into a 3D object really intuitively on the computer, and then take that virtual 3D object and be able to make it real just by printing it… it’s going to revolutionize design and manufacturing in the 21st century.”
— Elon Musk

Wow! Could we really be headed towards a new era of design, manufacturing and computing?

Google’s Tango, Apple’s AR project, Microsoft’s Hololens, Magic Leap, and other augmented and mixed reality platforms like DAQRI, all lead me to one hypothesis…

Yes — we are heading to a new world of spatial computing…

Of course, I didn’t have the faintest clue what this really meant for me or for design. All I knew was that the wonderful worlds of Tony Stark and Star Wars, amazingly, seemed somehow within reach. And I wanted in.

So I decided to sign up for Scott’s Interaction Design Specialization course. It was fun. And soon enough, I found myself thinking “What should I do for my Capstone Project?”

How My Capstone Idea Evolved Over Time

In a nutshell, it went something like this:

  • Inspiration: “Mixed Reality looks awesome… how do you build a droid?!”
  • The Brief: “Hmm Glance could work. What would a new dashboard for MR look like? How might that work?”
  • Prototype_v1: “I suppose I’d want to be able to launch projects, content/information and assistants into my world to help me get stuff done… Perhaps it’ll be similar to how we currently engage with computing on screens…”
Wireframes for initial Glance / Dashboard MR application on iPhone using Camera.
  • Feedback on V1: “What is it?”, “What am I supposed to do?” Hmm, we are awfully used to screens. I was trying to simulate a future design interaction using tools from the old world…
  • Prototype_v2: “Maybe we can wing it… let’s add a background and tell them the task is to launch a new experiment wizard in mixed reality”
V2: added a background to give an AR/MR feeling + the task of launching a new experiment.
  • Feedback on V2: “Ok, so I need to create a new Experiment… I guess that should be in Launch?” said one of the users. The key lesson? The design of the dashboard was confusing. An app that launches a new experiment shouldn’t live under “My Projects”; a more appropriate path would be “Launch” > “New Experiment,” which launches a new science droid into your world that helps you do great science.
  • V3: “Ok… the naming was confusing and the static / response tool of MarvelApp does not effectively simulate a true MR interaction. So let’s go back to basics with a super simple interface that might make more sense in a MR world…”
Redesigned MR interface built around the concept of a ‘Launcher’ (Spotlight for MR)
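To make the Launcher concept concrete, here is a minimal sketch of how such a “Spotlight for MR” might match what the user types (or says) against launchable actions. This is purely illustrative; the command names and function are my own assumptions, not part of the actual prototype:

```python
# Hypothetical command list for an MR launcher; names are illustrative only.
COMMANDS = [
    "New Experiment",
    "My Projects",
    "Weather",
    "Science Droid",
]

def match_commands(query, commands=COMMANDS):
    """Return commands where any word starts with the query, case-insensitively."""
    q = query.strip().lower()
    if not q:
        return []
    return [c for c in commands
            if any(word.startswith(q) for word in c.lower().split())]
```

Typing “exp” would surface “New Experiment” directly, sidestepping the V2 problem of hunting for the right dashboard category.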

Key Lessons

  • Most application design is really “hurry up and show me something (the weather, Game of Thrones, etc.), then wait for the human to do something.”
  • Mixed reality or spatial/ambient computing is different because it’s centered around continuous interactive loops.
  • The tools we use to prototype iPhone apps and existing web apps, aren’t geared to Wizard of Oz prototyping in mixed reality. For this new era, we will need a new generation of design and creation tools.
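The contrast in the first two lessons can be sketched as a continuous update loop, the pattern game engines and MR runtimes use instead of waiting for discrete user events. This is a simplified illustration under my own assumptions, not any specific platform’s API:

```python
class DemoWorld:
    """Stand-in for an MR scene; sensor data here is a placeholder."""
    def __init__(self):
        self.frame_count = 0

    def sense(self):
        return {"head_pose": (0.0, 1.6, 0.0)}  # e.g. head pose, surfaces, hands

    def update(self, sensed, dt):
        self.frame_count += 1  # react continuously, not on click

    def render(self):
        pass  # draw the scene for this frame

def run_mr_loop(world, frames=3, dt=1 / 60):
    """Continuous interactive loop: every frame, sense the environment,
    update world state, and render -- even with no explicit user input."""
    for _ in range(frames):
        sensed = world.sense()
        world.update(sensed, dt)
        world.render()
    return world.frame_count
```

The key difference from a screen app: the loop runs every frame regardless of input, so the system can respond to the world itself, not just to taps.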