Immersive Experiences: Working with XR Technologies
For the last three weeks, the students at CIID have been busy learning about designing immersive experiences, in a course taught by Joshua Walton and James Tichenor. Josh and James work at the intersection of the physical and digital. This year, they focused the class on the evolving field of Emerging Realities (or *Mixed Reality*, the more popular term). Mixed Reality, as opposed to Virtual Reality, merges digital and physical space to create new spaces, objects and artefacts.
Simply being exposed to the breadth and depth of XR, and to its various limitations (ethical, technological and otherwise), was perhaps the most exciting aspect of the class for us. As interaction designers, we're more often than not limited to designing for screens; XR requires an entirely different kind of thinking.
This article expands on our experience of designing for XR, and discusses our thoughts about the medium.
Another Leap Forward
Though people have been developing VR/XR/AR technologies for quite some time, only in the past few years have developers at large had access to tools to build and explore with them. Both Google and Apple are investing in this technology, and Apple recently released ARKit 2 for developers. Now that the technology is in place, how do we use it? How can designers make meaningful experiences? What makes an experience immersive, and how do we go about designing one?
Exploration and Ideation
During the ideation process of our three-week course, we quickly realized that most of our ideas and interests focused on moments of "enhanced interactions." These were interactions where the physical space communicated more than what was physically present. A bit confusing, right? Here's an example: a user stands in a location, let's say Times Square in New York City, absorbing the sights: people walking, talking, crying and laughing. Towering digital billboards and massive storefronts for Levi's and Disney surround her. For a few minutes she listens intently to these present sounds, but then she is transported. She still sees all the physical things that exist in 2018, but the soundscape is of 2010, then 2000, 1970, 1945, 1917. What does it mean to have a retro time-lapse in sound while the physical present never changes? How would this change her understanding of 2018 in NYC?
In the vein of designing mixed realities for soundscapes, isn’t it interesting how the conversations around XR almost exclusively revolve around enhancing visual experiences? What about designing for other senses? For example, writing on a digital screen through a stylus is far less satisfying than writing on a piece of paper. Can one design an XR experience that simulates the touch and feel of paper or writing accoutrements?
Shared realities was another space we were drawn to. What happens when the physical and digital worlds merge to create a new reality? We wanted to explore an interaction where a digital object affects a physical object, which then affects the digital object, so on and so forth. Shared realities ended up being the space we wanted to explore the most, and we spent two weeks developing a concept and prototype.
Concept and Scenario
For our prototype, we built a simple bowling game that started in the digital domain but ultimately translated into the physical domain. We wanted to understand people’s reactions to such an experience, and were quite excited to build it!
The concept was simple: use an iPad to flick a bowling ball that rolls toward and finally (hopefully) knocks over digital bowling pins. The user would then see the physical pins knocked over. Our surprise-and-delight feature was a physical object that could be placed in the direct path of the physical bowling pins; when the user flicked the digital bowling ball, it would be blocked from knocking over any pins. With our concept finalized, we were excited to build this experience with the new tools presented to us.
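The blocking mechanic can be sketched as a simple hit test: given where the ball is rolling and where the obstacle sits, decide whether any pins fall. A minimal Python sketch under assumed simplifications; the real game used Unity's physics engine, and the discrete "lanes" and the `resolve_roll` function here are hypothetical:

```python
def resolve_roll(ball_lane, pin_lanes, obstacle_lane=None):
    """Return the set of pin lanes knocked over by a ball rolled
    down `ball_lane`. If a physical obstacle occupies the same lane,
    the ball is blocked and no pins fall. (Illustrative only: the
    actual prototype used continuous Unity physics, not lanes.)"""
    if obstacle_lane is not None and obstacle_lane == ball_lane:
        return set()  # blocked by the physical object: nothing falls
    return {lane for lane in pin_lanes if lane == ball_lane}

# A three-lane setup with a pin in every lane:
pins = {0, 1, 2}
print(resolve_roll(1, pins))                   # {1}: the lane-1 pin falls
print(resolve_roll(1, pins, obstacle_lane=1))  # set(): the ball is blocked
```

The same check, run twice with and without the obstacle, is what produces the surprise moment described above.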
Building an Immersive Experience
One of the first things we learned was that building an XR experience requires a lot of XR tools, none of which we were familiar with. We used Unity3D to build our digital artefacts; Unity lets you import image trackers to simulate an XR experience on a digital screen. Though the program is powerful, the learning curve is steep: there's lots of new vocabulary, and basic knowledge of C# is required. That said, Unity's documentation is excellent, and if it's your first time using it, we're sure you'll be a knowledgeable user in no time.
When the virtual bowling ball hits the virtual bowling pins, a Raspberry Pi receives these events and actuates solenoids. We constructed a stage and made small bowling pins that sit on top of the solenoids; when a solenoid actuates, the physical pin above it falls over.
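On the Pi side, the core job is mapping a "pin knocked over" event to a solenoid pulse. Here is a minimal sketch with the GPIO write abstracted behind an injected callable, so the logic can run without hardware; the GPIO pin numbers and the `fire` callback are assumptions for illustration, not our actual wiring (on a real Pi, `fire` would wrap something like `RPi.GPIO.output`):

```python
import time

# Hypothetical wiring: one GPIO pin per solenoid/bowling pin.
SOLENOID_GPIO = {0: 17, 1: 27, 2: 22}

def knock_over(pin_ids, fire, pulse_s=0.1):
    """Pulse the solenoid under each knocked-over pin.
    `fire(gpio, state)` performs the actual output write; it is
    injected here so the mapping logic runs anywhere."""
    fired = []
    for pin_id in pin_ids:
        gpio = SOLENOID_GPIO[pin_id]
        fire(gpio, True)   # extend the solenoid, toppling the pin
        time.sleep(pulse_s)
        fire(gpio, False)  # retract
        fired.append(gpio)
    return fired

# Dry run with a recording stub instead of real hardware:
log = []
knock_over([0, 2], fire=lambda g, s: log.append((g, s)), pulse_s=0)
print(log)  # [(17, True), (17, False), (22, True), (22, False)]
```

Injecting `fire` keeps the event-to-solenoid mapping testable on any machine, which is handy when the stage hardware isn't on your desk.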
Connecting the Virtual with the Physical
The Raspberry Pi and Unity connect through a platform called Spacebrew, which lets two entities communicate with each other. With it, Unity could send information to the Raspberry Pi even though the two code bases are in completely different languages (the Unity code is in C#, the Raspberry Pi code in Python).
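On the receiving end, all the Pi script has to do is parse an incoming Spacebrew value into a list of hit pins. A sketch of such a handler, assuming Unity publishes a JSON string like `{"pins": [0, 2]}`; this payload shape is our own hypothetical convention for the sketch, not something defined by Spacebrew:

```python
import json

def on_spacebrew_value(value):
    """Handle an incoming Spacebrew string value.
    Assumes the publisher sends a JSON string such as
    '{"pins": [0, 2]}' naming which pins the virtual ball hit.
    (This payload format is hypothetical, chosen for this sketch.)"""
    payload = json.loads(value)
    pins = payload.get("pins", [])
    # In the real setup, this is where the solenoids would be pulsed;
    # here we just return the parsed pin list.
    return pins

print(on_spacebrew_value('{"pins": [0, 2]}'))  # [0, 2]
```

Keeping the parsing in its own small function also makes the bridge easy to exercise without a running Spacebrew server.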
We demoed our first prototype and received insightful feedback. Some of our most important questions were answered: was the concept too simple? Too boring? Is the game idea exciting, and if so, why?
Our classmates definitely wanted more from the experience: sounds for a gutter ball, sounds when pins were knocked over, and in general a more traditional bowling experience. The feedback brought us right back to our ideation phase, where we had been so excited about designing for sensory experiences beyond the visual.
Iteration and Final Project
Based on what we learned from the demo, we iterated once more, adding a few more bowling pins and a physical cube which, when placed in the path of the ball, could block it from knocking over pins.
The course invited us to ask some basic questions about XR. For example, how does XR change the idea, or even the category, of everyday objects? What happens when an object is both physical and digital? Is there a scenario in which any physical object can be affected by its digital version? And what is the context of use: everyday experiences, like learning to drive, or scenarios with far greater impact, like training soldiers for combat?
Finally, what would be the ethics of a mixed reality world, where the digital world affects the physical world?
This course made us feel like we were working on something genuinely new, with few precedents to draw on. The uncertainty is fun, exciting and full of potential. As designers, we hope to use these tools to create new experiences and paradigms that augment and enhance people's lives.