How to Become an XR Illusionist

Emilie Joly
Jun 4, 2018 · 9 min read

Breaking Down the Magic of Interaction Design and User Experience in XR using SpatialStories.

This is a writeup of my keynote at AWE 2018

I’m the CEO of apelab by day and have been an Interaction Designer & Technologist for 8 years now. My focus is to open the XR creation process to anyone interested in building interactive applications and to connect enterprises & institutions with XR creators worldwide through our XR Platform. Today we will take a quick look at SpatialStories, its core features, and dive deeper into XR interaction design principles. SpatialStories consists of two parts:

  • A series of Open Tools dedicated to XR which include a Unity Plugin dedicated to Interaction Design, an API for developers, and a Standalone Application called Studio that we’ve created in collaboration with HTC as part of their Vive X Program.
  • A Work Platform built on the blockchain which will allow XR creators to easily access jobs and challenges directly within a headset, inside the Unity game engine and on the Web 3.0.

Here are some images of our prototype for the XR Work Platform, launching later this year and running on the Ethereum network. Curated enterprises & institutions can submit challenges and jobs on the platform. Creators can start building seamlessly within a game engine or a headset, and winning XR Makers get rewarded with our Story Tokens. Story Tokens can then be exchanged to request new features on our Github, Virtual Trophies, Hidden Goodies, New Assets and Components, or simply used as payment for the great XR work done.

Home page featuring a series of XR Challenges and their associated prizes.
View of the XR Challenges directly accessible via the Unity game engine, while smart contracts take care of the rest.

Our platform will be available by the end of the year, featuring best-in-class challenges and gigs from top institutions & companies around the world.

Introduction to XR User Experience

Let’s now move on to the topic of user experience, so you are prepared to become the best XR Creator. To start off, I’d like to make a simple distinction between two types of applications:

1/ Creation Apps, VR/AR tools

These types of applications rely on the user’s ability to build something, to create an experience from scratch with their own knowledge.

2/ Experiences, Games, Consumer Utilities

These types of applications require little or no creation from the end user, who simply plays the game or uses the service provided.

During this talk, we’re going to focus on the first category.

It’s amazing to see XR spreading across so many industries: education, training, entertainment, retail, healthcare, etc. This really changes who our users are going to be; so when we talk about user experience, design, and interactivity, we really need to take into account who our end users are.

If we take a quick look at the types of users who are going to become both makers & consumers of XR content, we’re talking about teachers, students, scientists, doctors, researchers, marketing leaders, brand managers, storytellers, employees, etc.

We’re not necessarily talking about software developers or video gamers. So there are some key elements we need to think about when designing what their experience is going to be like.

The body truly becomes the interface. You use the user’s eyes, hands, position in space, and in some cases their brain or facial expressions. These are tools you can play with as an interaction designer. But at the same time the result should not be overwhelming or disorienting, as you are literally responsible for someone’s body.

Feedback in XR is linked to your senses. It is key to any experience, as it helps your users understand what is going on and whether they are doing something right or wrong. Feedback needs to be simple and relevant, always linked to an action your user is performing, whether that is looking at something, grabbing an object, pointing at it, or moving towards it. Usually a combination of discreet visuals, 3D audio, and haptics makes for a good experience. Be careful though: it’s always possible for users to miss that feedback if, for instance, something is happening on their hands while they are looking elsewhere, and they will end up confused. Be very selective about why a given piece of feedback is happening, and make sure the user can see or feel it clearly.
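To make that idea concrete, here is a small sketch of the “don’t rely on visual feedback the user can’t see” rule. This is illustrative Python, not Unity C# and not any SpatialStories API; the function names and channel labels are my own:

```python
import math

def angle_between(gaze, to_target):
    """Angle in degrees between two 3D direction vectors."""
    dot = sum(g * t for g, t in zip(gaze, to_target))
    norm = (math.sqrt(sum(g * g for g in gaze)) *
            math.sqrt(sum(t * t for t in to_target)))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def choose_feedback(gaze_dir, head_pos, target_pos, fov_deg=90.0):
    """Pick feedback channels for an event at target_pos.
    Haptics and 3D audio always reach the user, but visual feedback
    only helps if the target is inside the field of view."""
    to_target = [t - h for t, h in zip(target_pos, head_pos)]
    channels = ["haptic", "audio3d"]
    if angle_between(gaze_dir, to_target) <= fov_deg / 2:
        channels.append("visual")
    return channels
```

For example, an event on the user’s hand while they look straight ahead at something else would come back without the `"visual"` channel, telling you to lean on audio and haptics instead.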

Instructions differ from feedback in that they are there to guide your user on their journey. I see instructions being under-designed in most immersive applications, resulting in strong frustration for the end user. One might think that because a virtual spatialized environment looks like reality, it will be instantaneously natural. Believe me, it is not. On the contrary, we have to take into account why it is still a bit unnatural, or not so obvious, for users unaccustomed to navigating these new media.

Oculus Touch and Vive Controllers

Let’s take a look at those controllers. Normal people won’t understand them right away. They’ll have a hard time learning and remembering the controls, and the devices won’t feel like a natural extension of their hands. These controllers don’t feel natural to newcomers. But there are some tricks you can use to make your users more comfortable and help them forget the technology in their hands, by using what I call Suspension of Disbelief.

Suspension of disbelief is the ability of an individual to believe or feel that something is real when they know it really is not. It’s a common term used to describe our state of mind when watching stage magicians perform. And we can find parallels between this notion and XR interaction design. As an example, let’s take a quick look at the Wii controllers:

Nintendo has always been great at mastering suspension of disbelief. If you remember the old Wii: the Wii-motes simply had an accelerometer, and yet they would make you believe you were playing tennis or baseball, or even dancing. In reality, most of the time, all you had to do was sit on your couch and wave your controller loosely to win. But no one cared; everyone just played along, because it was more fun and natural.

And this is a notion you can use throughout your XR design process.

Interaction Doodles

Whatever you are creating, there is always a smart and simple way to make something believable. These doodles represent some of the brainstorms we do to think about simple interactions that are super effective and trick the body. The elevator, for example, is amazing: depending on how close the walls are, the effect is uncanny. It’s extremely simple but very effective. Another good example is the work done by a group of students at a workshop last week. Their idea: a political dictator speech simulator on the HTC Vive. On the right they had a prompter with the speech, on the left a series of gestures the user must perform at the right time, and in the middle a virtual microphone. And that simple idea of a microphone, easy as it was, was so effective that it made the whole experience feel real as you started speaking and heard your voice come out of the speakers with a very large echo.

Keep this in mind as you think about your designs. You can play with a lot of things that will help your user feel immersed and less confused.

Standard User Rig

We’re now going to take a quick look at some core concepts of how SpatialStories works. In the Unity plugin we try to give you access to all of the inputs available for you to play with: the user’s proximity, the hands, the controller inputs, the head, sound, and the possibility of using the Vive Trackers. Facial expressions are supported on the iPhone X only, and at some point we will be able to add Brain Computer Interfaces.
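As a rough mental model, the inputs listed above can be pictured as one bundle of per-frame state attached to the user. This Python sketch is purely illustrative; the field names are hypothetical and do not match the plugin’s actual components:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class HandInput:
    position: tuple                     # world-space hand position
    trigger_pressed: bool = False
    grip_pressed: bool = False

@dataclass
class UserRig:
    """One bundle of the user's inputs. Names are illustrative only."""
    head_position: tuple = (0.0, 0.0, 0.0)
    gaze_direction: tuple = (0.0, 0.0, 1.0)
    left_hand: HandInput = field(default_factory=lambda: HandInput((0, 0, 0)))
    right_hand: HandInput = field(default_factory=lambda: HandInput((0, 0, 0)))
    audio_level: float = 0.0            # microphone input
    trackers: dict = field(default_factory=dict)   # e.g. Vive Tracker poses
    facial_expression: Optional[str] = None        # iPhone X face tracking
```

Thinking of the rig this way makes it clear why proximity, gaze, and hand input can all drive the same interaction system: they are just different fields of the same user state.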

Simply right-click on the camera in Unity and transform it into an interactive camera, and you are good to go.

We also have the notion of interactive objects. Any 3D model can become interactive. If it’s an animated model you can launch its animations easily to build out your experience.

An interactive object also has its own structure: a proximity zone, snap points if you grab the object, etc. An interaction is built out of conditions and actions. Conditions are linked to what the user is doing, and actions are linked to the interactive object itself. For example: when I look at the bird and press the right trigger, the bird flies. We also have a dependency system allowing you to link interactions with one another.
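The condition/action/dependency model described above can be sketched in a few lines. This is illustrative Python, not the plugin’s C# API; the class, field, and state names are hypothetical:

```python
class Interaction:
    """Fires its actions once, when all conditions hold.
    A dependency on another Interaction must have fired first."""
    def __init__(self, conditions, actions, depends_on=None):
        self.conditions = conditions    # predicates over user/world state
        self.actions = actions          # effects on the interactive object
        self.depends_on = depends_on    # optional prerequisite Interaction
        self.fired = False

    def update(self, state):
        if self.fired:
            return
        if self.depends_on is not None and not self.depends_on.fired:
            return
        if all(cond(state) for cond in self.conditions):
            for action in self.actions:
                action(state)
            self.fired = True

# The bird example from the text: gaze at the bird + right trigger -> it flies.
bird_flies = Interaction(
    conditions=[lambda s: s["gazed_object"] == "bird",
                lambda s: s["right_trigger_pressed"]],
    actions=[lambda s: s.setdefault("events", []).append("bird_flies")],
)
```

Calling `update` every frame with the current state means the interaction stays dormant until both conditions are true at once, and a chain of `depends_on` links gives you the dependency system mentioned above.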

Here is a short list of all the actions and conditions that are available; you can also add the ones you need using custom action and condition scripts.

We’ve been working with HTC Vive on a standalone VR creation tool over the last few months, which will also be translated to Mixed Reality when the hardware is a bit more advanced. Below are three short videos showing some of the key features and interfaces of Studio.

Here you can see how we handle both controllers: the right hand is for changing modes using bubbles, and the left hand is for the menu. There are play and edit modes you can switch between instantaneously. Then, to move around, you kind of crawl as if you were in a swimming pool.

Studio Controller Interface

Here you can see someone working on a walkthrough of an apartment; check out how it’s possible to quickly edit, move objects, and teleport around the space, changing the lighting and seeing the result directly.

In this last video you’ll see how we can add interactions to any static 3D object. Through a partnership with Sketchfab and their new API, these assets, created by amazing artists, were taken directly from the Sketchfab library and imported into Studio. Some of them even have animations built in. In this case, I am building a very short adventure with a robot, a keycard, and a control panel. My little game: the user must place the keycard on the panel to wake up the robot; then, if they throw a juice box at it, the robot falls over. Take a look at the video below to see the process:

Interaction Workflow

Here are also some of the design workflows built by the team, so you can see how complex it is to build an interface for two controllers and a multitude of menus. Months of interface design and iteration went into this first prototype, and many changes are probably still to come.

Another interesting tool we’ve created, to help interface designers build the different menus easily, is a simple XML system that lets them populate a full interface with built-in functionality: adding things like buttons, object selections, etc. All of this is going to be made available on our open platform.
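The talk doesn’t show the actual schema, but a menu definition in such a system might look something like this; every element and attribute name below is hypothetical, not the real SpatialStories format:

```xml
<!-- Hypothetical menu definition: names are illustrative only. -->
<menu id="object-menu">
  <button label="Add Interaction" action="add_interaction"/>
  <button label="Duplicate" action="duplicate_object"/>
  <objectSelection id="target-picker" filter="interactive"/>
</menu>
```

The point of such a system is that designers can iterate on menu layout in plain data files, while the built-in functionality behind each element stays in code.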

This wraps it up for me; I hope this was informative and useful. If you want to get access to SpatialStories, just register on our website and join our exciting community of XR Makers!

If you want to try a really cool game made with SpatialStories, head to the Oculus Store and check out Break a Leg.

Thanks, and enjoy the rest of the conference!

