Using VR to Creatively Enhance Biofeedback Treatments

Joshua Yang
Feb 4, 2019

Clinical biofeedback has proven effective for treating anxiety, depression, and other disorders, and VR can strengthen it further by providing more immersive biofeedback visualizations. The scientific literature has shown that using tools such as temperature and heart rate monitors to display a patient’s bodily functions, with the goal of training the patient to control those functions, can alleviate many of these disorders through repeated practice. Furthermore, a study by the psychology department of Università Cattolica del Sacro Cuore showed that adding VR tools to this treatment helped patients learn relaxation techniques faster and more thoroughly, resulting in more effective treatment.

These findings caught our attention, and as aspiring VR developers we wanted to enhance biofeedback creatively and effectively using the new paradigms of user interaction and immersion that VR introduces. In Fall 2018, we began forming a team to work with SDKs for EEGs and ECGs that integrate into the Unity game engine. Combining students with backgrounds in animation, VR development, and machine learning, we developed a minimum viable product: a VR game that uses the player’s EEG brain waves as a mode of control. This MVP currently runs on the Oculus Rift and uses the Muse headband, a highly portable consumer EEG device, to map brain activity to in-game actions.

A snapshot of our MVP, which utilizes users’ EEG data to control the tilt of a watering can in order to water a growing flower.
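To illustrate how this kind of mapping can work, here is a minimal sketch, not our exact implementation: it assumes the EEG pipeline supplies a normalized relaxation score each frame, and the tilt range and smoothing constant are placeholder values chosen for the example.

```csharp
using UnityEngine;

// Minimal sketch: drives the tilt of a watering-can object in Unity from a
// normalized relaxation score derived from EEG data. relaxationScore stands in
// for whatever value the EEG pipeline provides; maxTiltDegrees is illustrative.
public class WateringCanTilt : MonoBehaviour
{
    [Range(0f, 1f)]
    public float relaxationScore = 0f;   // 0 = tense, 1 = relaxed (fed by the EEG pipeline)
    public float maxTiltDegrees = 60f;   // tilt when fully relaxed
    public float smoothing = 2f;         // higher = snappier response

    private float currentTilt = 0f;

    void Update()
    {
        // Ease toward the target tilt so noisy EEG readings don't jitter the can.
        float targetTilt = relaxationScore * maxTiltDegrees;
        currentTilt = Mathf.Lerp(currentTilt, targetTilt, smoothing * Time.deltaTime);
        transform.localRotation = Quaternion.Euler(0f, 0f, -currentTilt);
    }
}
```

Smoothing the signal before applying it matters in practice: consumer EEG readings fluctuate from frame to frame, and a jittery watering can works against the calming effect the exercise is meant to produce.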

Alongside repeated internal testing, we have demonstrated the product at showcase events to gather additional user feedback and surface bugs.

In Fall 2019, we expanded our MVP and explored other ways to bring biofeedback to VR. Muse had recently pulled support for its developer SDK, and our team was eager to explore other sensors we could integrate into our application. We also wanted a creative solution for relaxation exercises that could be done without any external sensors.

Our cover photo for Symbiosis.

Our team had already faced many difficulties trying to integrate Bluetooth sensors into our original MVP, so we decided to use an Arduino heart rate monitor (HRM) to expedite our progress. To make the game playable without any external sensors, we turned the Oculus Touch controllers into a makeshift breathing band: by having users rest a controller on their diaphragm, we can provide feedback based on the expansion and contraction of the diaphragm as they engage in deep breathing. We also built a new, darker nighttime environment to reduce eye strain.
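The sketch below shows one way the controller breathing band idea could be implemented in Unity. It is illustrative rather than our shipped code: the anchor transforms, smoothing constant, and sign convention are assumptions made for the example.

```csharp
using UnityEngine;

// Minimal sketch: treats a hand controller resting on the diaphragm as a
// breathing sensor. It measures how far the controller moves along the
// headset's forward axis and smooths that into a breath signal the
// environment can react to. controllerAnchor and headAnchor would be wired
// to the tracked transforms of the camera rig in the scene.
public class ControllerBreathingBand : MonoBehaviour
{
    public Transform controllerAnchor;  // tracked Touch controller held against the diaphragm
    public Transform headAnchor;        // tracked headset, used to define "forward"
    public float smoothing = 4f;

    public float BreathSignal { get; private set; }  // positive ≈ inhale (chest expanding)

    private float lastOffset;
    private bool initialized;

    void Update()
    {
        // Project the controller's position onto the head's forward axis.
        Vector3 toController = controllerAnchor.position - headAnchor.position;
        float offset = Vector3.Dot(toController, headAnchor.forward);

        if (!initialized) { lastOffset = offset; initialized = true; }

        // Velocity of the chest wall along that axis, smoothed against tracking jitter.
        float velocity = (offset - lastOffset) / Mathf.Max(Time.deltaTime, 1e-4f);
        BreathSignal = Mathf.Lerp(BreathSignal, velocity, smoothing * Time.deltaTime);
        lastOffset = offset;
    }
}
```

Working with the chest-wall velocity rather than the raw controller position keeps the signal usable even when the player shifts in their seat during the exercise.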

One iteration of our second MVP, where the user conducts a breathing and yoga-esque exercise to decrease heart rate and grow a sacred tree.

We had the opportunity to submit a vertical slice of this new project to the Oculus Launch Pad program. As part of the 2020 cohort of roughly 100 talented VR developers and artists, we presented our project to Oculus and received positive feedback, especially on our controller breathing band idea. Our team is currently revisiting the idea of adding support for more widely available consumer body sensors such as the Apple Watch and Fitbit.

Claudia Korea and Joshua Yang presenting on our vertical slice at Facebook headquarters in Menlo Park, CA.

Our team also plans to collaborate with Berkeley’s Psychology department to gain domain expertise, design user studies, and collect preliminary user data on biofeedback in VR. We are also interested in reaching out to companies and biofeedback practitioners who are working on, or interested in, biofeedback in VR to further improve our minimum viable product.

After collecting user data, we would like to use our findings to increase the effectiveness of the game and add new mappings from biofeedback data to the virtual environment, such as driving spatial audio or changing how quickly time passes in the scene.
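As a rough sketch of one such mapping, and with placeholder thresholds rather than clinically validated ones, a heart rate reading could modulate the flow of time and an ambient spatial audio source like this:

```csharp
using UnityEngine;

// Hedged sketch of a future mapping: as the measured heart rate comes down,
// in-game time passes more slowly and an ambient spatialized sound softens.
// heartRateBpm would be fed by the heart rate monitor; the rest/stress
// thresholds below are placeholders, not clinical values.
public class CalmTimeMapping : MonoBehaviour
{
    public AudioSource ambientSource;   // spatialized ambient sound in the scene
    public float heartRateBpm = 75f;    // latest reading from the heart rate monitor
    public float restingBpm = 60f;
    public float stressedBpm = 100f;

    void Update()
    {
        // 0 = fully calm, 1 = stressed, based on where the reading falls between thresholds.
        float stress = Mathf.InverseLerp(restingBpm, stressedBpm, heartRateBpm);

        // Calmer users experience "slower" time and a quieter, lower-pitched ambience.
        // Note: scaling Time.timeScale globally also slows physics; a real build
        // might instead scale only selected animations.
        Time.timeScale = Mathf.Lerp(0.5f, 1f, stress);
        ambientSource.volume = Mathf.Lerp(0.3f, 1f, stress);
        ambientSource.pitch = Mathf.Lerp(0.8f, 1f, stress);
    }
}
```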

As the application becomes more effective and polished, our team would also love to build a consumer-friendly version that runs not only on the Oculus Rift and HTC Vive but also on mobile phones in VR or AR. These goals require further development, but a consumer application could be an important step toward helping people who aren’t comfortable seeking clinical treatment for their mental and physical health issues.

Written by Joshua Yang and Claudia Korea, team leads of XR at Berkeley’s Biofeedback VR team.

