The Sadness Room: Exploring Sadness Using Virtual Reality

Ifraah Dhegadub
Published in Macalester HCI
10 min read · Mar 31, 2023

A'di Dust, Ifraah Dhegadub, Isaac Wan, Sheilla Kangwagye

Check out our prototype here: https://hci-design-sprint-3---designing-for-emotion.glitch.me. It is compatible with Android devices and desktops!

Link to our prototype’s code: https://glitch.com/edit/#!/hci-design-sprint-3---designing-for-emotion

A picture of the team brainstorming how to make the Sadness Room come to life!

Overview

There are many emotions we as humans feel very deeply. From happiness to anger, we express these feelings in ways that vary from person to person. What if you could use Virtual Reality (VR) to create a space that emulates an emotion? That is what our team did in our Human-Computer Interaction course at Macalester College. We agreed to focus on one of the most universal emotions, one that everyone has felt at some point in their lives … sadness!

Our project was broken down into stages that helped organize our design sprint, as seen below.

GIF of the design process overview
Our Design Process

As seen at the very beginning, we needed to hone in on our objective. This meant clearly articulating our goal so that we would not stray too far from our original idea. We posed the following questions for our group to answer individually:

What does sadness mean to you?

How do you interact with your space when you are sad?

What sounds, images, and lighting come to mind when you think about sadness?

These questions were intentionally designed to be open-ended and vague so we as a group could create a cohesive vision of our Sadness Room. We decided that our main goal was to create a space that would reflect one version of sadness through the use of object placement, sound, and lighting. Then we agreed on using A-Frame, a web platform that builds VR environments, to create our prototype of the room. Despite our excitement to start building, we needed to explore some secondary research on existing projects surrounding 360° videos and VR spaces.

Secondary Research

Since A-Frame would be the platform we would use to build our prototype, we explored some of A-Frame’s many examples to get a feel for the bounds and constraints in our project. We considered making a room from scratch using objects on the platform or creating a space entirely in person. Ultimately, we decided that importing our own video and utilizing A-Frame’s existing technology for 360° videos would be the best path forward due to time constraints and prioritizing later stages of the project. By importing our own video, we could focus on manipulating and shaping the room to best demonstrate our version of sadness.
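The kind of scene we ended up building can be sketched in A-Frame markup roughly like this (the file name, element ID, and version number are placeholders, following the structure of A-Frame's 360° video example):

```html
<!-- Minimal A-Frame 360° video scene, modeled on A-Frame's videosphere
     example. "room-360.mp4" is a placeholder for the recorded video. -->
<html>
  <head>
    <script src="https://aframe.io/releases/1.4.0/aframe.min.js"></script>
  </head>
  <body>
    <a-scene>
      <a-assets>
        <video id="room" src="room-360.mp4" loop="true"></video>
      </a-assets>
      <!-- a-videosphere projects the video onto the inside of a sphere
           so the viewer can look around in 360° -->
      <a-videosphere src="#room"></a-videosphere>
    </a-scene>
  </body>
</html>
```

Because A-Frame renders in the browser, the same page works on a desktop and on a phone in a Google Cardboard viewer.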


A screenshot of an A-Frame 360° VR video of sand dunes
A screenshot of an A-Frame 360° VR video replicating a landscape with mountainous terrain

One of our stated purposes for using VR to embody sadness is to encourage empathy. The article “Empathy and embodied experience in a virtual environment: To what extent can virtual reality stimulate empathy and embodied experience?” by Donghee Shin supports the hypothesis that virtual reality and embodied space can promote empathy at high levels. Because the flow of the experience also contributes strongly to empathy, we tried to create an experience that feels natural.

Prototyping

For our prototype, we started by configuring the bedroom where we would shoot our 360° video to use in A-Frame. In terms of lighting, we knew we wanted the room to be dark, since that is what we think of in terms of sadness. Another trait we associated with sadness was disorganization. By shutting the bedroom’s blinds, tossing clothing on the ground, and messing up the bed sheets, we were able to create our vision of what this person’s bedroom would look like. Along with this, we wanted to give this sad person’s bedroom a sense of identity and personality. Scattered objects on the desk, such as a coding interview textbook and a rent bill, as well as hand drawings on the floor, gave the bedroom much more individuality. Using a 360° camera (Ricoh Theta S) attached to a tripod, one group member held the camera above their head and walked around the room to record a 360° video.

360° camera similar to the one we used
Ricoh Theta S 360° camera, which is the device we used to record

After some trial and error, we finally got the video we needed and edited it in Adobe Premiere to add some background noise. When discussing sounds that convey sadness, two parents arguing with each other seemed pretty indicative of melancholy, so it was a perfect fit for the audio. After some searching, we found a clip on Freesound that accurately conveyed the energy we were going for. Once we had our completed 360° video, the final step was to import it into A-Frame as an asset and alter A-Frame’s 360° video example code to suit our prototype’s needs. For example, the autoplay feature was buggy with our imported video, so we changed it to a play button that lets users set up their VR device and watch when they are ready. While our current prototype may not be as polished as we would have wished, it is still functional and usable for user testing.
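Swapping autoplay for a play button can be sketched like this (the element IDs and file name are placeholders, not our exact prototype code):

```html
<!-- A user-triggered play button instead of autoplay; many browsers block
     autoplaying video with sound anyway, so a click-to-start flow is safer. -->
<button id="play-btn" style="position: absolute; z-index: 1;">Play</button>
<a-scene>
  <a-assets>
    <video id="room" src="room-360.mp4"></video>
  </a-assets>
  <a-videosphere src="#room"></a-videosphere>
</a-scene>
<script>
  document.querySelector('#play-btn').addEventListener('click', function () {
    document.querySelector('#room').play(); // start playback on user gesture
    this.style.display = 'none';            // hide the button once playing
  });
</script>
```

Starting playback from a click also gives users time to put their phone into the Cardboard viewer before the video begins.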

Critical Design Analysis

We did some critical design analysis on our prototype with the help of the Artefact Group’s Tarot Cards of Tech. These tarot cards help organizations and teams explore the ethical consequences and impacts their designs could have on various communities. We selected two cards that we thought would yield the most interesting analysis of our prototype.

A GIF displaying The Forgotten card from the Tarot Cards of Tech
The Forgotten card from the Tarot Cards of Tech

First up was “The Forgotten” card, which asks what groups or perspectives are missing and how our prototype could exclude them. There are a couple of groups of people for whom our design would not be available or accessible.

Some of the groups we brainstormed were:

  • Users without access to Google Cardboard or a smartphone device
  • Users without internet access
  • Users with limited vision or blindness
  • Users with hearing difficulties

These user groups would not get a chance to use the prototype or experience the VR space to its full extent, so acknowledging them allows us to design ethically and incorporate more inclusive changes in the future. In addition, in terms of negative impacts, there is a possibility the Sadness Room could cause trauma-related stress depending on a user’s experiences and history. Trigger warnings and disclaimers are one way to tackle this, which is something we implemented later in the design sprint.

A GIF displaying The BFFs card from the Tarot Cards of Tech
The BFFs card from the Tarot Cards of Tech

The second card we selected was “The BFFs”. This card helps teams look into ways their product shapes interactions and connections with people. One of the questions was “If two friends use your product, how could it enhance or detract from their relationship?” We found that our prototype could have a mix of positive and negative effects on a relationship. Here are a few pros and cons we brainstormed as a group:

A list of positive and negative effects that our prototype could have on a relationship

By playing around with the Tarot Cards of Tech, we were able to perform a critical design analysis on our prototype, explore the ethical implications of our design, and prepare ourselves for usability tests.

User Testing

We made a pitch video that demonstrates our prototype and shows how it works.

Our initial user feedback was gathered in class, where we presented a demo video of the project. Students then gave us “I like”, “I wish”, and “What if” comments to help us refine our idea and identify the important parts to keep and the parts to change. Some common themes from the sticky notes are shown below:

I like…

  • The sound in conjunction with the room
  • The immersiveness of the experience
  • The attention to detail
  • The discomfort it created

I wish…

  • It were not as traumatic
  • The project goals were clear
  • There was a disclaimer
  • There was more interaction

What if…

  • There were more story explorations
  • There was more freedom to explore elsewhere (e.g. other rooms in the house)
Positive reactions from prototype
Feedback from user testing in class

We also conducted user testing with four individuals. Three were students and one worked in the school system. Participants were asked about their emotions, the parts they liked and disliked, and what (if anything) they could imagine using it for.

Most of the participants in user testing expressed frustration, anxiety, disarray, nervousness, and sadness. One person described feeling sad when the audio was on but then calm after the audio turned off. The user testing and post-it notes activity made it clear that the emotion we were portraying was closer to stress or trauma than to sadness.

Participants enjoyed that the VR allowed for changes in point of view, which led to further autonomy, but they also enjoyed being guided through the room, as the emotional intensity could make it hard to interact at first. This was especially apparent when participants experienced the room with and then without sound.

“I appreciated noticing the difference in how I was physiologically reacting with the noise versus without the noise. I felt calmer without the noise physiologically, but it was more disorienting.” — One of the user testers

User testers did not enjoy how disorienting, dizzying, and intense the experience could be, and they were confused about what it was intended to be and which demographic it depicted.

Interestingly, the users came up with use cases for our VR experience that we had not thought of ourselves, demonstrating how important it is to understand stakeholders. The testers described instances in which it could be used to solve problems.

One user mentioned that it could recreate the experience of deafness and trauma. The professional who works in a school went on at length about how it could help teachers build empathy for students whose experiences they may not have gone through themselves.

“When a kid shuts down in a classroom we have to understand why that kid is shutting down. If a teacher hasn’t experienced this something immersive like this could help them understand why they are experiencing it and how to help.”

— Professional

Reflection

A picture of teammates

Reflecting upon what users thought the use case was, we reframed the lack of sound in the second loop from a bug to an intentional choice by choosing not to loop the audio. We also added a disclaimer at the beginning that the user cannot proceed past without acknowledging it. Similarly, we added introductory text that makes the story and intended use clearer.
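The disclaimer gate can be sketched as an overlay that must be dismissed before playback starts (the wording, element IDs, and the `#room` video element are illustrative placeholders, not our exact prototype code):

```html
<!-- Overlay that blocks the experience until the disclaimer is acknowledged.
     The prototype's <video> asset intentionally has no loop attribute, so
     the audio plays once and the second pass through the room is silent. -->
<div id="disclaimer" style="position: absolute; z-index: 2; background: white;">
  <p>Content warning: this experience depicts a distressing argument.</p>
  <button id="ack-btn">I understand, continue</button>
</div>
<script>
  document.querySelector('#ack-btn').addEventListener('click', function () {
    document.querySelector('#disclaimer').style.display = 'none';
    document.querySelector('#room').play(); // begin only after acknowledgment
  });
</script>
```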

Overall, in this design sprint, we learned about turning something ambiguous (i.e. a basic human emotion) into an achievable goal. We were able to express sadness in a Virtual Reality space through the use of A-Frame and a 360° camera. We are happy with what we were able to accomplish in two weeks, but there are a couple things we would improve and expand on.

These are a few future steps for us to consider:

  1. We would love to explore sadness in spaces besides a room. Maybe at school or work, utilizing a hallway or office backdrop with different sounds. In addition, we could switch to other experiences like stress.
  2. We wish we had better video quality from the camera since it impacts how it looks in VR, so maybe experimenting with different cameras would help.
  3. We only showcased one version of sadness, so we would like to explore other, less common ways it manifests.
  4. We would also like to research whether these VR spaces create more empathy for certain experiences and situations people may be going through.

Works Cited

A-Frame School. (n.d.). Retrieved March 30, 2023, from https://aframe.io/aframe-school/#/1

Neighbors.aiff by Carminooch. Freesound. (n.d.). Retrieved March 30, 2023, from https://freesound.org/people/carminooch/sounds/105265/

The tarot cards of Tech. The Tarot Cards Of Tech. (n.d.). Retrieved March 30, 2023, from https://tarotcardsoftech.artefactgroup.com/

Shin, Donghee. “Empathy and embodied experience in virtual environment: To what extent can virtual reality stimulate empathy and embodied experience?.” Computers in human behavior 78 (2018): 64–73.

Thanks for reading! ❤
