Design Research in Mixed Reality

Bhavna Patel
TEAGUE Labs
Jul 7, 2017 · 4 min read

At TEAGUE, we like biking and not getting hit by cars. A while back, we found a $5 Doppler radar sensor and developed Reardar, a prototype device that warns cyclists of oncoming traffic. Sensing cars from a bike is one thing, but our testing surfaced a more critical question: what is the best way to alert a cyclist to cars coming up from behind? Or, going even further:

What is the best way to alert and inform a cyclist on the move?

First, we built a couple of physical prototypes to test which notifications work for cyclists in a real-life environment with bustling city noise, road vibration/potholes, dense city traffic, bright sun and dark shadows. For each test, the participant rode along the busiest streets of downtown Seattle while the mechanism vibrated, beeped, spoke, or flashed colors. As we were still working out the kinks of the radar technology, we manually triggered notifications.

Physical prototypes worn around different parts of the body
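For a sense of what "manually triggered" looked like in practice, here is a minimal sketch of the kind of firmware such a prototype could run, assuming an Arduino-compatible board with a vibration motor, piezo buzzer, and LED. The pin choices, cue patterns, and single-character command scheme are illustrative assumptions, not our actual wiring.

```cpp
// Hedged sketch: manually triggering notification cues from a laptop over USB serial.
// Pin choices and cue patterns are illustrative assumptions, not the actual prototype wiring.
const int VIBE_PIN = 5;    // vibration motor driven through a transistor (PWM)
const int BUZZ_PIN = 6;    // piezo buzzer
const int LED_PIN  = 9;    // indicator LED (a real build might use an RGB strip)

void setup() {
  pinMode(VIBE_PIN, OUTPUT);
  pinMode(BUZZ_PIN, OUTPUT);
  pinMode(LED_PIN, OUTPUT);
  Serial.begin(9600);      // researcher types commands into a serial monitor
}

void loop() {
  if (Serial.available() > 0) {
    switch (Serial.read()) {
      case 'v':                        // haptic cue: short vibration burst
        analogWrite(VIBE_PIN, 200);
        delay(400);
        analogWrite(VIBE_PIN, 0);
        break;
      case 'b':                        // audio cue: brief beep
        tone(BUZZ_PIN, 2000, 300);
        break;
      case 'f':                        // visual cue: quick flash
        digitalWrite(LED_PIN, HIGH);
        delay(300);
        digitalWrite(LED_PIN, LOW);
        break;
    }
  }
}
```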

We cast a wide net with these experiments, aiming to narrow the list down to the mediums that are most effective across a wide range of circumstances. By the end of these early tests, we knew which types of audio, haptic, and visual mechanisms could effectively communicate information to a cyclist on the move:

Selected mediums of notification

We had worked out which notifications worked on the move, but we needed a way to convey information more complex than a simple alert. We also took this opportunity to move to a safer environment. Since we have experience with Unity and custom interfaces, we decided to build a VR (Virtual Reality) / IRL (In Real Life) test platform, with Unity driving the virtual world, to build on the insights from our real-world testing.

Micro-controller connected to different sensors and Unity.

Not only did this give us a safer test environment, it also let us leverage VR environments, assets, and scenarios we had already built. A researcher could be present and observe the participant throughout the entire (simulated) experience.

We assumed a near-future setting, which allowed us some practical hypotheses without getting bogged down in particulars such as vehicle-to-vehicle communication and connectivity. We defined 5 scenarios/alerts to be conveyed through our prototype using 3 different mediums (Audio, Haptic & Visual).

The VR prototype setup

· Unsafe to change lanes (safety alert)

· Obstruction ahead (safety alert)

· Danger: behind (safety alert)

· Text message (phone alert)

· Turn left (navigation)

The notifications were triggered by both the cyclist’s decisions and traffic in the VR world.
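As an illustration of how more complex information can be packed into a simple medium, here is a rough sketch of encoding the five alerts as distinct haptic patterns on the micro-controller. The alert codes, pulse patterns, and one-byte trigger protocol are assumptions for the sake of example, not our actual implementation.

```cpp
// Hedged sketch: encoding the five test alerts as distinct haptic patterns.
// Alert IDs, pulse patterns, and the single-byte trigger protocol are illustrative assumptions.
const int VIBE_PIN = 5;

enum Alert : char {
  UNSAFE_TO_CHANGE_LANES = '1',
  OBSTRUCTION_AHEAD      = '2',
  DANGER_BEHIND          = '3',
  TEXT_MESSAGE           = '4',
  TURN_LEFT              = '5'
};

// Pulse the vibration motor `count` times with the given on/off timing.
void pulse(int count, int onMs, int offMs) {
  for (int i = 0; i < count; i++) {
    analogWrite(VIBE_PIN, 220);
    delay(onMs);
    analogWrite(VIBE_PIN, 0);
    delay(offMs);
  }
}

void setup() {
  pinMode(VIBE_PIN, OUTPUT);
  Serial.begin(9600);   // the VR scene (or the researcher's app) sends one byte per alert
}

void loop() {
  if (Serial.available() > 0) {
    switch (Serial.read()) {
      case UNSAFE_TO_CHANGE_LANES: pulse(2, 150, 100); break;  // two quick buzzes
      case OBSTRUCTION_AHEAD:      pulse(1, 600, 0);   break;  // one long buzz
      case DANGER_BEHIND:          pulse(4, 80, 80);   break;  // rapid, urgent pattern
      case TEXT_MESSAGE:           pulse(1, 200, 0);   break;  // single gentle tap
      case TURN_LEFT:              pulse(3, 150, 150); break;  // three evenly spaced pulses
    }
  }
}
```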

The Teensy micro-controller measures real-world signals (bike pedaling cadence, brake on/off, handlebar rotation angle) and passes them on to Unity using Firmata. All of the experiments were controlled from a phone communicating with the micro-controller over Bluetooth, which let the researcher run every test sequence through a simple app.
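On the input side, something along these lines could run on the Teensy to report cadence, brake state, and handlebar angle to a Firmata client inside Unity; the pin assignments, sensor wiring, and channel mapping below are illustrative assumptions only.

```cpp
// Hedged sketch: streaming bike sensor readings from a Teensy to a Firmata client in Unity.
// Pin assignments and sensor wiring are illustrative assumptions, not the actual build.
#include <Firmata.h>

const byte BRAKE_PIN   = 2;   // brake lever switch (digital, pulled up)
const byte CADENCE_PIN = 3;   // reed/hall sensor on the crank (one pulse per revolution)
const byte STEER_PIN   = A0;  // potentiometer coupled to the handlebar stem (analog)

volatile unsigned long lastPulseMs = 0;
volatile unsigned int cadenceRpm = 0;

// Interrupt handler: estimate cadence from the time between crank pulses.
void onCrankPulse() {
  unsigned long now = millis();
  if (now - lastPulseMs > 100) {               // crude debounce
    cadenceRpm = 60000UL / (now - lastPulseMs);
    lastPulseMs = now;
  }
}

void setup() {
  pinMode(BRAKE_PIN, INPUT_PULLUP);
  pinMode(CADENCE_PIN, INPUT_PULLUP);
  attachInterrupt(digitalPinToInterrupt(CADENCE_PIN), onCrankPulse, FALLING);

  Firmata.setFirmwareVersion(FIRMATA_FIRMWARE_MAJOR_VERSION, FIRMATA_FIRMWARE_MINOR_VERSION);
  Firmata.begin(57600);                        // Unity-side client reads this serial stream
}

void loop() {
  // Publish steering angle and cadence as "analog" channels and the brake as a digital port,
  // so a generic Firmata client in Unity can map them onto the simulated bike.
  Firmata.sendAnalog(0, analogRead(STEER_PIN));
  Firmata.sendAnalog(1, cadenceRpm);
  Firmata.sendDigitalPort(0, digitalRead(BRAKE_PIN) == LOW ? 1 : 0);
  delay(20);                                   // roughly 50 Hz update rate
}
```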

We walked each participant through the scenarios in a randomized sequence, repeating each scenario with all 4 modes of notification. The participants rode through our virtual scenarios, and we gathered feedback about the efficacy and distraction level of each notification on the move.

In general, participants preferred minimal cues for “simpler” messages, like a car approaching from behind, while more “complex” messages, such as navigation, were better communicated via deeper cues like verbal announcements.

None of the communication mediums won out over the others, but the idea of a simple medium combined with sensing technology has merit. Elevating a user’s senses for increased productivity or safety is a promising area for future exploration. Professions such as construction, firefighting, or even professional sports would benefit greatly from increased situational awareness in their dynamic environments.

All of these explorations are helping us better understand human behavior with and around new technologies. For over a year, we have been leveraging VR to explore what future city traffic situations will look like and how humans will behave in them. It’s a fascinating realm; we look forward to sharing more of our explorations soon!
