Using VR to Creatively Enhance Biofeedback Treatments

Joshua Yang
Extended Reality @ Berkeley
4 min read · Feb 4, 2019

Clinical biofeedback has proven to be an effective way of treating anxiety, depression, and other disorders, and VR could enhance it further by providing more immersive biofeedback visualizations. The scientific literature has shown that displaying a patient’s bodily functions with tools such as temperature and heart rate monitors, with the goal of training the patient to control those functions, can help alleviate many of these disorders through repeated practice. Furthermore, a study by the psychology department of Università Cattolica del Sacro Cuore showed that adding VR tools to this treatment helped patients learn relaxation techniques better and faster, resulting in more effective treatment.

These findings caught our attention, and as aspiring VR developers, we wanted to take biofeedback and enhance it creatively and effectively using the new paradigms of user interaction and immersion that VR introduces. In the fall of 2018, we began forming a team to work with SDKs for EEGs and ECGs that integrate into the Unity game engine. With students from backgrounds in animation, VR development, and machine learning, we developed a minimum viable product: a VR game that uses the user’s EEG brain waves as a mode of control. This MVP works with the Oculus Rift and uses the Muse headband, a highly portable consumer EEG device, to achieve these mappings.

A snapshot of our MVP, which utilizes users’ EEG data to control the tilt of a watering can in order to water a growing flower.
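To make the control mapping concrete, below is a minimal sketch, in Unity C#, of how a normalized EEG relaxation score could drive the watering can’s tilt. The IEegSource interface and its CalmScore property are hypothetical stand-ins for whatever the headband’s SDK actually exposes; the smoothing step is just one reasonable way to keep a noisy signal from jittering the can.

```csharp
using UnityEngine;

// Hypothetical source of a normalized EEG "calm" score (0 = agitated, 1 = relaxed).
// A real integration would compute this from the headband's band-power data.
public interface IEegSource
{
    float CalmScore { get; }
}

public class WateringCanController : MonoBehaviour
{
    public float maxTiltDegrees = 60f;  // tilt of the can when fully relaxed
    public float smoothing = 2f;        // higher = snappier response to the signal

    private IEegSource eeg;             // injected by whatever component reads the headband
    private float currentTilt;

    public void SetSource(IEegSource source) => eeg = source;

    void Update()
    {
        if (eeg == null) return;

        // Ease toward the target tilt so frame-to-frame EEG noise doesn't jitter the can.
        float targetTilt = Mathf.Clamp01(eeg.CalmScore) * maxTiltDegrees;
        currentTilt = Mathf.Lerp(currentTilt, targetTilt, smoothing * Time.deltaTime);
        transform.localRotation = Quaternion.Euler(currentTilt, 0f, 0f);
    }
}
```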

Through repeated testing and demos at showcase events, we have gathered feedback from users and learned about current bugs in our product.

In fall 2019, we created a new MVP and explored other ways to bring biofeedback into VR. Muse had recently pulled support for its developer SDK, so our team was eager to find other sensors we could integrate into our application. We also wanted a creative solution for relaxation exercises that could be done without any external sensors.

Our team had already faced many difficulties trying to integrate Bluetooth sensors into our first MVP, so we decided to use an Arduino heart rate monitor (HRM) to expedite our progress. Furthermore, to make our game playable without any external sensors, we turned the Oculus Touch controllers into a makeshift breathing band. By having users place a controller against their diaphragm, we could provide feedback based on how the diaphragm expands and contracts while they engage in deep breathing. We also built a new, darker nighttime environment to reduce eye strain.

One iteration of our second MVP, where the user conducts a breathing and yoga-esque exercise to decrease heart rate and grow a sacred tree.
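As a rough illustration of the breathing-band idea, the sketch below estimates a breath signal from the tracked position of a controller held against the diaphragm. The controllerAnchor and chestReference fields are placeholder transforms, not names from our project, and the real game may detect breathing differently.

```csharp
using UnityEngine;

// Estimates a breath signal from a Touch controller held against the diaphragm.
// controllerAnchor and chestReference are placeholders for tracked transforms in the scene
// (e.g., a hand anchor and a rough torso position derived from the headset).
public class ControllerBreathingBand : MonoBehaviour
{
    public Transform controllerAnchor;   // tracked pose of the controller on the diaphragm
    public Transform chestReference;     // rough torso transform; its forward axis points outward

    private float restingOffset;
    private bool calibrated;

    // Positive values roughly mean inhale (diaphragm expanded), negative mean exhale.
    public float BreathSignal { get; private set; }

    // Call once while the user holds the controller still and breathes out fully.
    public void Calibrate()
    {
        restingOffset = CurrentOffset();
        calibrated = true;
    }

    void Update()
    {
        if (!calibrated) return;
        BreathSignal = CurrentOffset() - restingOffset;
    }

    // Distance of the controller from the chest along the chest's outward direction.
    private float CurrentOffset()
    {
        Vector3 toController = controllerAnchor.position - chestReference.position;
        return Vector3.Dot(toController, chestReference.forward);
    }
}
```

Projecting onto the chest’s outward axis means the measurement reflects chest expansion rather than the user simply shifting around the room, and the resulting signal can be smoothed and mapped to the environment much like the EEG score above.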

We had the opportunity to submit a vertical slice of this new project to the Oculus Launch Pad program. As part of the 2020 cohort of ~100 talented VR developers and artists, we presented our project to Oculus and received positive feedback, especially on our controller breathing band idea.

Claudia Korea and Joshua Yang presenting on our vertical slice at Facebook headquarters in Menlo Park, CA.

Currently, our team is expanding our second MVP into Blossom VR, a biofeedback relaxation game that has support for the Apple Watch as a heart rate biosensor.

An image from our Instagram that describes Blossom VR, which will be released on the Oculus store in 2021.

After collecting biosensor data from user testers, we would like to use our findings to increase the effectiveness of Blossom VR and integrate new mappings from our data to the virtual environment. Some ideas are mapping the data to spatial audio and to how fast time passes in the virtual environment, as sketched below.
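To illustrate what those mappings might look like, here is a small sketch assuming a heart-rate value in beats per minute is already streaming in from the Apple Watch or HRM; the ranges and curves are placeholders, not tuned values from Blossom VR.

```csharp
using UnityEngine;

// Placeholder ranges and curves, not tuned values from the project.
public class RelaxationMappings : MonoBehaviour
{
    public AudioSource ambience;     // spatialized ambient loop somewhere in the scene
    public float restingBpm = 60f;
    public float elevatedBpm = 100f;

    // Call whenever a new heart-rate sample (beats per minute) arrives from the sensor.
    public void OnHeartRate(float bpm)
    {
        // 0 = fully relaxed, 1 = elevated heart rate.
        float arousal = Mathf.InverseLerp(restingBpm, elevatedBpm, bpm);

        // Idea 1: let in-game time slow down as the player calms.
        // (A shipping build would likely scale a day/night cycle instead of the physics clock.)
        Time.timeScale = Mathf.Lerp(0.5f, 1f, arousal);

        // Idea 2: drive the spatial audio: softer, lower-pitched ambience when relaxed.
        ambience.volume = Mathf.Lerp(0.3f, 0.8f, arousal);
        ambience.pitch = Mathf.Lerp(0.9f, 1.1f, arousal);
    }
}
```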

As the application evolves and becomes more effective and polished, our team would also love to make a consumer-friendly version of our game that could run not only on the Oculus Rift and HTC Vive, but also on mobile phones in VR or AR. Although these goals require further development, they could be important steps, since a consumer application could help people who aren’t comfortable seeking clinical treatment for their mental and physical issues.

Written by Joshua Yang and Claudia Korea, team leads of XR at Berkeley’s Biofeedback VR team.
