Reading, Tricking and Shocking* Your Mind @ SIGGRAPH 2017

*Effects not understood

As technology evolves to improve upon the techniques of the past, discernible trends manifest at each year’s Emerging Technologies and newer VR Village exhibitions at SIGGRAPH, an annual gathering exploring the intersection of computer graphics and interactive technologies. This year is no different, with three main themes rising above the din: the improvement of AR and VR, refining applications of haptics, and tapping into the human nervous system. Focusing on the latter two — as the former deserves a write-up of its own — let’s examine the manipulation of the mind at perhaps the most invasive SIGGRAPH yet.

Reading Your Mind:

Neurable

Neurable’s custom EEG Vive headset

Let’s start with what was likely SIGGRAPH’s hottest ticket: Neurable’s brain-computer interface. Using a combination of eye tracking, EEG (electroencephalogram) sensors, and machine learning, the Cambridge-based company created an impressive demo that, in their words, allows guests to “click” with their mind. The demo consisted of three phases: calibration, training, and an escape mission.

1. Calibration

I was told to focus on the one spinning object among five levitating objects arranged in a circle. Then, as all the objects started to blink, I was told to think “grab” whenever that particular object blinked. Eventually, the system would register the object as selected and bring it towards me. This continued for each of the five objects.

2. Training

This is when the fun began. The five objects came up and began to blink, but this time, I got to pick the object I focused on. As any skeptic would do, I tried my best to look at my selected object only peripherally in an attempt to throw off the eye gaze component — impressively, the system worked every time, even when I was allowed to grab two objects at once. Overall, the team claims about an 85% accuracy rate when reading brain signals in real time, or 99% when processed with a one-second delay (a rough sketch of such a selection loop follows this list).

3. The Mission

No more fun and games: time to apply what I learned. In an environment reminiscent of Portal meets Stranger Things, I was tasked with breaking out of a research facility. Unsure what to do first, I followed my training and grabbed something, unleashing a new targeting interface that allowed me to focus on a target and throw an object. Eventually, one of those objects broke a mirror behind me, revealing a code that I then entered into a keypad — with my mind. In the next room, I continued to throw things at robots and discovered a somewhat confusing ability to transform robots into harmless toys before teleporting to my final destination by focusing on outlines of myself in the distance.
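Neurable hasn’t published the internals of its pipeline, but the demo’s structure (a calibration phase, objects that blink one at a time, and better accuracy when results are processed with a short delay) maps neatly onto a classic evoked-response selection loop. Below is a minimal, self-contained sketch of that pattern; the classifier, gaze weighting, threshold, and object names are all illustrative stand-ins, not Neurable’s actual system.

```python
import random

# Hypothetical evoked-response selection loop: objects blink one at a
# time, a classifier (trained during calibration) scores each EEG
# window time-locked to a blink, and gaze down-weights objects the
# guest isn't looking at. Evidence accumulates until one object wins.

GRAB_THRESHOLD = 3.0   # accumulated evidence needed for a "click"
ATTENDED = "key"       # simulated ground truth: the guest's target

def classify_epoch(blinked_object):
    """Stand-in for a classifier over a ~1 s EEG window after a blink;
    scores run high when the attended object was the one that blinked."""
    base = 0.8 if blinked_object == ATTENDED else 0.1
    return max(0.0, random.gauss(base, 0.1))

def gaze_weight(blinked_object):
    """Stand-in for eye tracking: full weight near gaze, less elsewhere."""
    return 1.0 if blinked_object == ATTENDED else 0.5

def selection_loop(objects):
    evidence = {obj: 0.0 for obj in objects}
    while True:
        obj = random.choice(objects)   # blink one object at random
        evidence[obj] += classify_epoch(obj) * gaze_weight(obj)
        if evidence[obj] > GRAB_THRESHOLD:
            return obj                 # the mental "click"

print(selection_loop(["cube", "key", "orb", "gear", "lamp"]))
```

Accumulating evidence across blinks is also one plausible reading of the accuracy numbers above: waiting an extra second means more blinks per decision, trading latency for reliability.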

Impressive as the demo was, I came out of the experience with strong operational concerns about the long calibration process. However, the CEO assured me this wouldn’t be an issue in the future. Why? They’re dumping all the data from the 400+ demos done at SIGGRAPH into their machine learning algorithm in hopes of training their systems enough to eliminate the calibration step altogether. Guests would still need training, of course, but this would yield significant time savings.

Looking towards the future, the team mentioned they’re hoping to move beyond the “click” to scrolling and swiping, both of which could bring interesting new functionality. All in all, this is the most precise application of EEG devices I’ve seen, and I can’t wait to see how the system improves with all the data Neurable just collected.

STRATA: A Biometric VR Experience

Galvanic skin response and heart rate monitors on a custom armband

Using a combination of a breathing sensor, a heart rate monitor, a galvanic skin response sensor, and a Muse EEG headset, STRATA purportedly rewards calmness with levitation through its multiple meditative levels, controlling things like the pace of the music and other environmental elements. I say purportedly because, by the time I tried it on the last day of the conference, about half the sensors were no longer working.

As a counterpoint to Neurable’s new platform, STRATA showcases what was possible with the previous generation of consumer EEG devices and software, which let developers tap into mental states like anxiety and focus and provoke an in-game response when a certain threshold is crossed. It succeeded in its goal of promoting calm, but unfortunately I can’t speak to the finer-grained controls afforded by the interfaces, given their degraded state.
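To make that threshold pattern concrete, here is a minimal sketch of how such a system might fuse the four sensors into a single “calm” score and drive the environment from it. The normalization ranges, equal weighting, and threshold are illustrative assumptions, not STRATA’s actual design.

```python
# Hypothetical biofeedback loop: normalize each reading to 0..1
# (1 = calm), average them, and trigger in-experience responses when
# the fused score crosses a threshold.

CALM_THRESHOLD = 0.7   # illustrative; tuned per experience in practice

def calm_score(heart_rate, gsr, breath_rate, eeg_calm):
    """Fuse four biometric readings into one 0..1 calm score."""
    hr = 1.0 - min(max((heart_rate - 60) / 60, 0.0), 1.0)      # 60-120 bpm
    skin = 1.0 - min(max(gsr, 0.0), 1.0)                       # normalized GSR
    breath = 1.0 - min(max((breath_rate - 6) / 18, 0.0), 1.0)  # 6-24 breaths/min
    return (hr + skin + breath + eeg_calm) / 4

def update_environment(score):
    music_pace = 1.0 - 0.5 * score       # calmer guest, slower music
    levitating = score > CALM_THRESHOLD  # reward calm with levitation
    return music_pace, levitating

print(update_environment(calm_score(heart_rate=72, gsr=0.3,
                                    breath_rate=10, eeg_calm=0.8)))
```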

Tricking Your Mind:

HangerON & HangerOVER

Don’t try without adult supervision

This is an easy one to try at home: go to your closet and grab a wire or plastic hanger and pop it on your head so that it’s pressing against your temples. Feel that urge to face another direction? Researchers term this the “hanger reflex,” and they’ve taken this far beyond a silly headpiece into devices that can involuntarily redirect walking and force head movement in VR.

The devices are surprisingly simple. Using four airbags, the belt device puts pressure in front of the pelvis on one side and behind it on the other, triggering that same reflex to turn. Done while walking, this causes the body to involuntarily twist toward a new direction. The headband uses the same four airbags to trigger the same reflex, which the team synced with objects flying towards the guest in VR. It’s a bit of a strange combination, since guests may want to turn away or blink given the visual stimulus, but I can imagine that experiences fighting this reflex or using it to control teammates could be a lot of fun.
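For the curious, here is a rough sketch of what the belt’s control logic might look like: inflate one diagonal pair of airbags (front of the pelvis on one side, behind on the other) to trigger the reflex toward a turn. Which diagonal produces which direction is the team’s empirical tuning; the mapping below is an assumption for illustration.

```python
# Hypothetical actuation logic for the four-airbag belt. Pressure on a
# diagonal pair (one bag in front of the pelvis, one behind on the
# opposite side) triggers the hanger reflex and an involuntary turn.

BAGS = ["front_left", "front_right", "back_left", "back_right"]

def bags_for_turn(direction):
    """Pick the diagonal pair for a commanded turn (assumed mapping)."""
    if direction == "left":
        return ["front_right", "back_left"]
    if direction == "right":
        return ["front_left", "back_right"]
    return []

def actuate(direction, pressure=0.8):
    """Return a 0..1 pressure command for each of the four airbags."""
    active = bags_for_turn(direction)
    return {bag: (pressure if bag in active else 0.0) for bag in BAGS}

print(actuate("left"))  # command an involuntary left turn
```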

(Left) My failed attempts at walking straight (Right) The headset device using four airbags

The best part: the team controls guests’ movements with a Super Nintendo controller.

Touch Hologram in Mid-Air

Photo courtesy Immersion SAS

Last year’s SIGGRAPH showcased a lot of novel haptic displays, but this year focused more on their applications. Two trends emerged: spherical displays and ultrasonic displays, with multiple teams using the UltraHaptics display or something quite similar.

All three applications of the ultrasonic displays created touchable “holograms”: one showing human lungs, one allowing guests to touch someone else’s hand, and one that used Microsoft’s HoloLens AR headset paired with a Leap Motion tracker to create a manipulable globe. While very simple, the globe was the most useful application of ultrasonic haptics I’ve seen, as it helped me position my hand correctly and provided decent feedback as I turned the virtual object.
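As a rough illustration of how the globe interaction might hang together, the sketch below places an ultrasonic focal point at the tracked fingertip whenever it reaches the virtual globe’s surface. The array class and every call on it are hypothetical stand-ins, not the UltraHaptics or Leap Motion SDKs.

```python
import math

# Hypothetical mid-air haptics loop: when the fingertip is within a
# tolerance of the globe's surface, focus ultrasound there so the
# guest feels the touch; otherwise switch the focal point off.

GLOBE_CENTER = (0.0, 0.2, 0.0)  # meters above the array (illustrative)
GLOBE_RADIUS = 0.08
TOUCH_TOLERANCE = 0.01

class FakeHapticArray:
    """Stand-in for an ultrasonic array SDK; prints instead of emitting."""
    def set_focal_point(self, point, modulation_hz):
        print(f"focal point at {point}, modulated at {modulation_hz} Hz")
    def disable(self):
        print("focal point off")

def on_globe_surface(fingertip):
    distance = math.dist(fingertip, GLOBE_CENTER)
    return abs(distance - GLOBE_RADIUS) < TOUCH_TOLERANCE

def update(array, fingertip):
    if on_globe_surface(fingertip):
        # Amplitude-modulate the focus so skin receptors can detect it.
        array.set_focal_point(fingertip, modulation_hz=200)
    else:
        array.disable()

update(FakeHapticArray(), fingertip=(0.0, 0.2, 0.08))  # touching the surface
```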

Real Baby — Real Family

Perhaps the most operationally impressive — and adorable — demo at SIGGRAPH was Real Baby — Real Family. This type of trickery was more overt: an earnest attempt to make guests feel more comfortable handling a baby and perhaps having one of their own. After I answered a pre-experience survey, the team snapped a picture of me and sat me down in front of a crib featuring one of the most disturbing placements of a Vive controller I’ve seen.

There was another brief survey in the VR experience, with the last question asking about my interest in babies: I could hug the baby to indicate the affirmative or leave it in place to signal disinterest. I think I accidentally signaled I had no interest, but nonetheless, the experience launched with a baby whose face resembled my own, eliciting a laugh from most guests I observed but leading to an inevitable emotional connection. The baby, in need of some feeding and burping, provided some pretty impressive haptic feedback as it cried, drank, and burped. The best part: when I came out of VR, I was presented with a printed photo of my baby.

All in all, this was a really fun, warm experience that oozed with the creators’ sincere desire to make guests feel more comfortable with having babies of their own, which is especially interesting given governmental efforts to increase the birth rate in the creators’ native Japan. Whether tied to those efforts or not, it certainly made an impact.

Shocking Your Mind:

GVS Ride

There were many warning signs. The release form claiming that the physiological and neurological effects were not understood. The guy in front of me warning me that “perhaps it’s a little too strong.” The team member handing me the battery and telling me I could pull out the USB cord if I felt uncomfortable because it might be “painful.” But this is SIGGRAPH. Time for an adventure!

Image courtesy Kazuma Aoyama et al.

This setup demonstrated the ability of four-pole galvanic vestibular stimulation (GVS) to provoke sensations of pitch and yaw in a guest, marketed as a substitute for a motion platform. The system works by stimulating the vestibular nerve, which connects the inner ear to the brain, thereby manipulating a guest’s sense of balance. The demo was pretty simple: ride a simulated roller coaster in VR once without GVS and once with it.
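To illustrate the idea only, here is a toy sketch of how simulated pitch and yaw rates from the coaster might become clamped currents across four electrodes. The electrode mix, gain, and safety limit are assumptions for illustration; nothing here reflects the researchers’ actual calibration.

```python
# Hypothetical four-pole GVS mapping: one differential current for yaw
# (left vs. right), another for pitch (front vs. back), both clamped.

MAX_CURRENT_MA = 1.5   # illustrative safety clamp, not a published spec

def clamp(x, limit=MAX_CURRENT_MA):
    return max(-limit, min(limit, x))

def gvs_currents(yaw_rate, pitch_rate, gain=0.5):
    """Map simulated angular rates (rad/s) to four electrode currents (mA)."""
    yaw = gain * yaw_rate       # left-right differential component
    pitch = gain * pitch_rate   # front-back differential component
    return {
        "behind_left_ear":  clamp(-yaw - pitch),
        "behind_right_ear": clamp(+yaw - pitch),
        "front_left":       clamp(-yaw + pitch),
        "front_right":      clamp(+yaw + pitch),
    }

print(gvs_currents(yaw_rate=0.2, pitch_rate=1.0))  # e.g. a big drop
```

A real system presumably tunes the gain so the current stays below what the skin can feel while still driving the vestibular system.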

Before the demo, there was a fairly laborious setup process that involved cleaning my face with alcohol to improve conductivity for the wet electrodes, two of which were attached to an Oculus HMD and two of which were attached to hollowed-out headphones. Then I stood up, and a team member placed the aforementioned emergency-stop USB cable — and my fate — in my hands.

What ensued was a pretty remarkable experience which, while a bit painful by the end, was shockingly effective, really throwing me off balance to help me feel the acceleration of the roller coaster: a feeling likely enhanced by my racing heart. The subtle signals at the beginning, before the first big drop, were perhaps the most effective, as I didn’t even feel the current but did feel like I was spinning. By the end of the approximately 20-second experience, however, all I really felt was the current as the roller coaster came to a stop. While this demo undeniably needs some fine-tuning for intensity, the method shows promise.

Note: my nervous system went back to normal after perhaps an hour of tingling on my tongue.