Breath Sensor for Unity

Lots of setup, a few discoveries

Kyle Qian
Jul 25, 2017 · 5 min read

EDIT: Unity code for the Neulog belt can be found here.

Before the summer began, I had the pleasure of meeting Chris from JunoVR who introduced me to an Arduino-based breath sensor that he’d been working on. Instead of the bulky Neulog breath belt I had worked on the past summer, this sensor sits in front of your mouth like a microphone, its presence practically invisible while wearing a headset.

I haven’t fixed it to my Vive yet, so my hand will have to do for now.

The sensor essentially works by measuring temperature changes as a result of breathing, and is featured in this post. I, too, found that the belt-based sensor was too cumbersome and did not always match airflow, so I was very excited to work with a smaller, more direct sensor.

Reading from COM ports

Whereas last summer I had the privilege of using an API, this time I had the opportunity to start one level lower: COM ports. In short, COM ports (or serial ports) are a standard interface through which an Arduino can communicate with a computer. Readings from the sensor are written to the computer's COM port and can subsequently be read from it.

However, while data from COM ports can easily be read and processed synchronously or by blocking, that's not how Unity works by default. Most scripted game behavior in Unity is specified within an Update method, which executes exactly once per frame. Given an application that runs at 60 fps, this means one execution per ~17 ms. In the Arduino sketch, we can specify a delay of a few ms per loop to stabilize the data stream at, say, one write per 3 ms. If the write rate is much slower than the read (Update) rate, Unity will constantly re-read the same data, which looks choppy and step-wise; if the write rate is much higher, the serial buffer fills up faster than Unity can drain it, leading to an ever-growing delay in the response.
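The mismatch is easy to quantify. A back-of-envelope sketch (the 60 fps and 3 ms figures are the ones discussed above):

```csharp
using System;

// At 60 fps, one frame lasts ~16.7 ms; with one serial write every 3 ms,
// roughly 5-6 readings arrive per frame, so a single read per Update
// falls further behind every frame.
public static class RateCheck
{
    // How many serial writes land during one rendered frame.
    public static double WritesPerFrame(double fps, double writeIntervalMs)
    {
        return (1000.0 / fps) / writeIntervalMs;
    }
}
```

With these numbers, `WritesPerFrame(60, 3)` is about 5.6, i.e. the buffer grows by four to five entries per frame if only one reading is consumed per Update.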

I first tried picking a write delay somewhat close to 17 ms, and it appeared to work, but it's a hacky solution tied to the frame rate. I then purposely set a slower read rate, intending to read the entire buffer each time and pick out the earliest entry. But unfortunately, the .NET SerialPort class seems somewhat broken when used within Unity, and the ReadExisting method in particular doesn't work.

Out of options for directly reading from the COM port, I ended up implementing a simple threaded solution that I’m currently using. It took a bit more tinkering because I initially used lock statements around the de/enqueuing code, which were causing the Unity editor to freeze on play (locking too expensive? broken?). I eventually removed the locks anyway since race conditions aren’t a big deal here.

A non-MonoBehaviour thread worker meant to be executed independently of Update.
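A minimal sketch of such a worker. The read function is injected here so the class can run without hardware; in practice it would be something like `() => port.ReadLine()` on an open `System.IO.Ports.SerialPort`. My version used a plain Queue without locks, as described above; this sketch uses ConcurrentQueue, the safer off-the-shelf choice:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;

// Background worker that continuously pulls readings from a blocking
// source (e.g. SerialPort.ReadLine) and queues them for the main thread.
public class SerialReaderWorker
{
    private readonly Func<string> _readLine; // e.g. () => port.ReadLine()
    private readonly ConcurrentQueue<string> _queue = new ConcurrentQueue<string>();
    private volatile bool _running;
    private Thread _thread;

    public SerialReaderWorker(Func<string> readLine)
    {
        _readLine = readLine;
    }

    public void Start()
    {
        _running = true;
        _thread = new Thread(() =>
        {
            while (_running)
            {
                try { _queue.Enqueue(_readLine()); }
                catch (TimeoutException) { /* no data this tick; keep looping */ }
            }
        }) { IsBackground = true };
        _thread.Start();
    }

    public void Stop()
    {
        _running = false;
        _thread?.Join();
    }

    // Called from Update(): drain whatever arrived since the last frame.
    public bool TryDequeue(out string line)
    {
        return _queue.TryDequeue(out line);
    }
}
```

A MonoBehaviour would call Start() in its own Start, TryDequeue in Update, and Stop() in OnDestroy.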

I could also have stored just a primitive (the latest reading) instead of a Queue, which would have been more robust, but a Queue allows me to retrieve multiple readings at once in the future if needed.

Hello world and beyond

Hello world!

With the sensor ready to go, I made a short script to blow up a sphere to match the breath readings. The sphere appears a bit jittery due to some sensor noise, but more importantly I felt no discernible delay between breathing and seeing.
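One way to drive the sphere is to normalize each reading and interpolate the scale. A minimal sketch — the value range and the linear mapping are assumptions, not the actual script:

```csharp
using System;

// Hypothetical mapping from a raw breath reading to a sphere scale.
// The low/high range and linear interpolation are assumptions.
public static class BreathScale
{
    public static float ScaleFor(float reading, float low, float high,
                                 float minScale, float maxScale)
    {
        float t = (reading - low) / (high - low);    // normalize to [0, 1]
        t = Math.Max(0f, Math.Min(1f, t));           // clamp sensor noise/outliers
        return minScale + (maxScale - minScale) * t; // linear interpolation
    }
}
// In Unity this would run each frame, e.g.:
//   transform.localScale =
//       Vector3.one * BreathScale.ScaleFor(latest, low, high, 1f, 2f);
```

Smoothing the reading (e.g. a moving average) before mapping would reduce the jitter mentioned above, at the cost of a little added latency.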

The lack of stereoscopic depth perception was quite noticeable.

Now onto what I originally wanted to play with: the Vive’s passthrough camera. Luckily, I discovered that the SteamVR plugin already includes a demo scene utilizing the camera, so I loaded it up.

It was one of the most bizarre experiences I’d ever had in VR.

As seen in the gif, the demo scene featured a projection of the video feed onto a quad that followed my gaze. The effect was augmented virtuality-esque, in that I was watching scenes from the real world projected into virtual space. But the visual alone wasn't the strangest part; it was the sensation of physically interacting with a world inside a video feed. It felt as if I were watching someone else's hand pick up a mug, only to feel the weight and texture of that very mug in my own hand.

That said, the feed was quite grainy and there was a noticeable lag with the camera, though it wasn't super disorienting. The Vive only has one camera, but the complete lack of depth perception wasn't particularly problematic until I tried to do hand-eye coordination tasks, like trying to catch my phone. I was still rather impressed, considering that the Vive camera seems like an afterthought feature that no one uses. If anything, I can see a future in which a mounted 360-degree camera combined with sophisticated haptics could convince someone they're somewhere they're not.

???

To tie things up, I linked the video feed to the breath sensor, such that breathing out would cause the feed to zoom away. Admittedly this scene was less trippy than the vanilla video scene, and I think it's because I identify less with the footage once it zooms too far and becomes too grainy. It felt more like I was watching a pulsing rectangle with patterns on it, as opposed to watching my actual perspective. The quad also got a lot more jittery as I moved my head while it was zoomed out.

Takeaways

It took me a while of tinkering to get the sensor communication to where I wanted, so I haven’t experimented much yet. I’m glad I checked out the camera scene though, as it confirms that even subtly messing with perspective and composition of senses can lead to fascinating perceptual oddities. A compelling link between that and breath feedback, however, remains unclear to me. And so we keep building.
