Exploring Emotions Using VR — Through The Lens Of A Color Blind Person

Andrew Nguyen
Published in Macalester HCI · 10 min read · Apr 3, 2023

Contributors: @Jriosdel, @srashidi, Matthew Trager, Johan Azambou

Link to prototype: https://anguyen6262.github.io/Red/

Note: To run this program on an iPhone, please use Google Chrome or Firefox.

Overview

Emotions are a fundamental aspect of human experience and can be triggered by a variety of external or internal factors. At a basic level, emotions are often classified into six primary categories: happiness, anger, fear, surprise, sadness, and disgust. However, individuals can experience many more nuanced emotions. This is where virtual reality (VR) comes into play. By integrating VR into our projects, we can create immersive simulations where users interact with computer-generated environments and 3D models, which makes VR an ideal medium for exploring emotion.

With that in mind, our team decided to use the color red to create a digital experience that brings out particular emotions in users, built with the JavaScript library A-Frame in conjunction with a Google Cardboard. In doing so, we hoped to explore the relationship between color and emotion. The project was part of a two-week design sprint in our Human-Computer Interaction class, and below is a detailed account of the steps we took to achieve our goal.

Design Process

Project Idea

The prompt for this design sprint was to make a design for understanding/emotion. The idea that we gravitated towards was an exploration of a single color, so that we could better understand the emotional impact that color has. We ultimately decided to focus on the color red, due to the variety of meanings and feelings that red instills in us. During our initial brainstorming session, we outlined the following aspects of the color red, among others:

  • Danger
  • Courage
  • Anger
  • Hunger
  • Love

However, due to time constraints and our inexperience with A-Frame, we started by exploring danger.

We intended for the user base of this project to be just about anyone. However, we were especially interested in how someone with colorblindness would be able to understand the emotions a color conveyed. Would a VR experience centered around the color red be as effective for someone unable to see the color?

AR/VR Exploration: A-Frame

To gain a better understanding of A-Frame and its features, we began by examining the A-Frame tutorial page and exploring examples relevant to our project concept. Specifically, we focused on the Responsive UI and Moon Rider examples, which included animated models, images, and shapes that became the foundation of our prototype. Although some of us had previous experience with HTML, we still needed to examine the source code of these projects to become familiar with how to animate scenarios and execute them in A-Frame.

A-Frame has a useful feature called the local inspector, which allows developers to visualize all the elements in a project. This tool lets users break down the implementation of different components, including audio, 3D models, lighting, animations, and positioning in the environment. Additionally, to create an interactive environment, developers need to write JavaScript functions that let the user interact with various elements. In this project, the primary interaction is between the user and the alarm clock (which you will learn more about in the following sections). When the user clicks the alarm clock for the first time, the ticking sound stops and an alarm sound plays. If the user clicks the alarm clock a second time, the audio stops altogether.
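The click logic just described is a simple two-step state machine. As a hedged sketch (the function name and sound callbacks below are our own stand-ins for A-Frame's sound-component calls, not the project's actual code), it could look like this:

```javascript
// Sketch of the alarm-clock interaction: ticking -> alarm -> silence.
// The callbacks stand in for A-Frame sound-component calls such as
// el.components.sound.stopSound(); all names here are illustrative.
function makeClockController(sounds) {
  let state = 'ticking'; // ominous ticking plays until the first click
  return function onClick() {
    if (state === 'ticking') {
      sounds.stopTicking(); // first click: the ticking stops...
      sounds.playAlarm();   // ...and the alarm starts
      state = 'alarm';
    } else if (state === 'alarm') {
      sounds.stopAlarm();   // second click: all audio stops
      state = 'silent';
    }
    return state;
  };
}
```

Wiring this up would then amount to attaching the returned handler to the clock entity's click event.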

Picture 1: A black, teal, and purple road engulfed in an outer space environment. Picture 2: Three posters of popular anime movies.
Case Study

After becoming proficient with A-Frame, we aimed to gather information to deepen our understanding of how an individual with color blindness, specifically someone who cannot properly distinguish the color red, perceives the world. Fortunately, we were able to find someone named Bob who is color blind and unable to see red and green. With his permission, Bob served as our case study, with the option to skip any questions he was less comfortable answering. We conducted an interview with Bob to gain insight into how he perceives the world without the color red.

Interviewer: What does the world look like to you, not being able to see red or green?

Bob : I mean, it looks just like how it looks. I’ve always seen the world this way, so it’s difficult to explain exactly what it looks like.

Interviewer: How do you see red and green?

Bob: Yeah, I guess I see red and green as brown. So if you were seeing through my eyes, you’d see everything as brown and dull. And I can tell the difference between red and green a lot of the time–it’s like being able to tell shades of brown apart. But sometimes I get confused.

Interviewer: What are some things you get confused about?

Bob: Uh, usually when something is supposed to be a color that it’s not. Like I know that Coke cans are red, but if you gave me a green coke can, I might not notice for a couple of minutes. But I know when apples are red or green, and I know that some colors match better with others because people have told me. I can basically infer and use context to guess what the color of something is. But like sometimes when I watch sports, or something, I have to squint and spend some time distinguishing which players are on which team.

Interviewer: What sort of things do you associate with the color red?

Bob: I guess the things that everyone else does. Like if there’s a big red sign, I might not notice it as well. I have to pay attention when I’m driving a lot, haha. But I still associate red with things like “halt” or “danger”, even if I don’t always notice that the warning sign is red. Does that make sense?

Interviewer: How did you come to associate red with these things?

Bob: I mean, probably how everyone else does it. Like everyone tells you, “red is like a color for love and hearts.” And then I’ve always seen red stuff on Valentine’s day. And that’s probably how you learned about the color red and love too.

Interviewer: Okay, thanks for talking to me. Anything else you want to say?

Bob: No, hope that helped!

Following our interview with Bob, we came up with the following conclusions and realizations:

  1. Bob’s experience of being unable to see red or green highlights how our perception of the world is shaped by our other senses and experiences.
  2. Through the use of VR, we can create digital content that can evoke specific emotions in users by leveraging the power of what red represents.
  3. By using A-Frame/Google Cardboard, we can develop immersive experiences that allow users to interact with the environment and feel the emotions that are being communicated through digital content.

Ideation and Brainstorming

In this stage, we utilized the insights we obtained from our interview with Bob as a reference point to employ 3D models sourced from Sketchfab — an online platform that facilitates the viewing, creation, and distribution of 3D models — to investigate how feelings of danger or fear can be experienced in VR.

We had a couple of ideas about how to convey an emotion through VR. The first was a museum-like display of elements associated with danger. A second idea was a 360-degree image viewer that the user would be able to float around in. Finally, we thought of creating a scene that plays out an event or scenario. Because we were interested in the modeling and animation components of A-Frame, we ultimately created a VR scene designed to make the user feel a sense of danger and unease.

Sketches of possible features and elements we wanted to include in our danger room.

So, our main focus was to design a danger room that would evoke the emotion of fear, and with heavy emphasis on the color red. We brainstormed ideas and decided to incorporate an alarm clock, a spider, red lighting, and an exit room. These were all models we had considered during the first brainstorming session. Luckily, free versions were available. Although we weren’t entirely certain of the room’s appearance, we had a clear vision of what features we wanted it to have.

3D models of a hallway with red lights, spider, red alarm clock, and hanging lamp.
  • Our design would include the ability for the user to walk around and explore the room, where the ambient lighting would be red.
  • An alarm clock that would sound upon clicking, indicating the possibility of danger. Before it is interacted with, it emits an ominous ticking.
  • A spider that would move towards the player, stop, and then continue its approach.
  • Additionally, we planned to make the lamp appear broken and give it a flickering effect, making it alternately bright and dim.
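As a rough illustration of how features like these map onto A-Frame's declarative markup (the ids, model references, colors, and timings below are placeholders, not our project's actual scene), the room could be sketched as:

```html
<!-- Illustrative A-Frame sketch only; asset ids and values are assumed. -->
<a-scene>
  <!-- Red ambient lighting for the whole room -->
  <a-light type="ambient" color="#8b0000"></a-light>
  <!-- Broken lamp: flicker by animating the light intensity back and forth -->
  <a-entity id="lamp" light="type: point; color: red; intensity: 1"
            animation="property: light.intensity; from: 1; to: 0.1;
                       dur: 300; dir: alternate; loop: true"></a-entity>
  <!-- Spider that approaches the user's starting position -->
  <a-entity id="spider" gltf-model="#spiderModel" position="0 0 -10"
            animation="property: position; to: 0 0 -2; dur: 6000;
                       easing: easeInOutQuad"></a-entity>
  <!-- Alarm clock with a looping ticking sound, clickable by the cursor -->
  <a-entity id="clock" gltf-model="#clockModel" class="clickable"
            sound="src: #tick; autoplay: true; loop: true"></a-entity>
</a-scene>
```

The spider's move-stop-move behavior can be achieved by chaining several animation components with delays or start events rather than the single animation shown here.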

After we had a clear understanding of our objectives and the room’s features, we proceeded to develop our prototype using Visual Studio Code and A-Frame.

Prototype And Video Demo

Due to the project’s size and the time constraints, we focused on linking the spider, clock, and lights together to make the room interactive. The clock, for example, could be triggered by a user click, which would activate other aspects of the room. One feature that we were particularly excited about was the sound, which we thought would create an uneasy feeling in the user.

Because of the importance of sound to the danger room experience, we wanted to make sure the audio worked on all devices. After numerous attempts, we were able to run our prototype successfully on both iPhone and Android devices. However, we did encounter one issue: at first the sound was only audible on Android, and we discovered that the sound files we used were more compatible with Google Chrome. Additionally, the screen must be touched once before the audio starts playing.
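The required first touch is a consequence of mobile browsers blocking audio until the page receives a user gesture. A minimal sketch of that unlock step (the function and callback names are illustrative, not our exact code):

```javascript
// Mobile browsers block audio playback until a user gesture, so we
// start the scene's audio on the first touch only. startAudio is a
// placeholder for whatever kicks off the sounds (e.g. an A-Frame
// sound component's playSound()).
function makeFirstTouchHandler(startAudio) {
  let unlocked = false;
  return function onTouch() {
    if (!unlocked) {
      unlocked = true; // only the very first touch starts the audio
      startAudio();
    }
  };
}

// In the page, something like:
// document.addEventListener('touchstart', makeFirstTouchHandler(playAll));
```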

Below is a video demonstration of our prototype.

User Testing

Initially, we created a working prototype for mobile and then asked testers to use a Google Cardboard with a phone inserted. We observed the testers and asked them to think aloud about their interactions with the prototype, without providing any additional details. However, we encountered some problems during the testing phase.

Some testers began walking around in real life because they assumed they would also be walking in virtual reality (VR). This confusion occurred because the environment was too dark, making it difficult for users to realize that they were not actually moving in VR. Additionally, the experience was so short that users had no time to adjust to VR and the environment before the spider interaction; by the time they had scanned their surroundings, the spider interaction was already over.

We also overlooked a design flaw: users were initially oriented toward a wall rather than the hall, which was counterintuitive. This caused further confusion, and we corrected it by rotating the user's starting view toward the hall.

Furthermore, our prototype lacked a clear method of indicating user interactions and how to initiate them. As a result, many users were unaware of the clock interactions and functionality. We did not explain that interactions were activated by “fusing,” which meant that the cursor had to hover over an object for a specified amount of time to trigger an event.
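For reference, a fuse cursor in A-Frame is typically declared on the camera: gazing at a clickable entity for the fuse timeout emits a click event. A sketch (the timeout value and selector are assumptions, not our exact settings):

```html
<!-- Illustrative fuse-cursor setup; hovering the gaze over an entity
     matching the raycaster selector for 1500 ms emits a click event. -->
<a-camera>
  <a-cursor fuse="true" fuse-timeout="1500"
            raycaster="objects: .clickable"></a-cursor>
</a-camera>
```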

To address these issues in future projects, we decided that a start screen would be the best solution. This screen would provide key details about the VR environment and allow users to adjust before experiencing our prototype.

Feedback

During our class presentation, we received some general feedback, which we would like to share. In response to the feedback, we made some changes to our project. For example, we addressed some of the issues mentioned in the “I wish” section, such as making the room brighter and ensuring that users are able to move around the room. Additionally, we experimented with different lighting setups to highlight the flickering aspect of our project.

Critiques and compliments about our danger room, from other students, written on red, blue, and orange sticky notes.

Reflection and Final Analysis

This project shows that VR technology has the potential to create immersive experiences that can evoke emotions in users through the use of color. It also highlights how our senses and experiences shape our perception of the world. To design an inclusive and accessible experience, our team interviewed a color-blind individual to understand how they perceive the world. This approach helped us gain valuable insights and incorporate them into our ideation process. By doing this, we demonstrated our commitment to creating digital content that is accessible and inclusive, an important aspect of human-computer interaction. Overall, this project showcases the potential of VR to create experiences that evoke emotions in users while also prioritizing inclusivity and accessibility.

That being said, here are some considerations we had in mind at the end of this project:

  1. Going beyond fear and exploring other emotions such as happiness, surprise, sadness, and disgust using VR.
  2. Designing the virtual rooms from scratch and tailoring them to best fit the emotions.
  3. Expanding the rooms to include all the emotions we associated with the color red.
  4. Ensuring our prototype is compatible with every device.

Thank you for reading. We look forward to reading your thoughts!

Sources for the different assets used:

Apocalyptic Hospital Hallway Interior (ETBENO): https://sketchfab.com/3d-models/apocalyptic-hospital-hallway-interior-15352de0e59d4046aa41e08db7384a63

Low Poly simple ceiling lamp (imperioame): https://sketchfab.com/3d-models/low-poly-simple-ceiling-lamp-4dfe531deb234506a8a26fe410c18252

Alarm clock (klaxoneer): https://sketchfab.com/3d-models/alarm-clock-anim-4c820629af294e50af725ad1f5f01cb0

Spider (jonmariano21): https://sketchfab.com/3d-models/hi-fi-spider-ff8a4433a5d449a3a0fc54989185a024
