Designing the Future Through Immersive VR

Andrew Gray
Andrew Gray’s HCI Work
9 min read · Apr 25, 2019

This project was developed under the guidance of Professor Evan Peck in his computer science elective course Human-Computer Interaction held at Bucknell University. The purpose of this project is to design something that embodies what we think VR should be.

Picture from Jabil

Background:

When we think of VR, we all imagine something different. Many think that VR is meant to place us in environments that we, as humans, could never otherwise reach. Some want to skydive, fly a plane, or experience some sort of danger, all from the safety of their own homes. Others want VR to help them join communities, build relationships, and collaborate with others. In today’s society, there seems to be no limit to what VR can do.

https://tenor.com/search/vr-gifs

Our Goal:

My team envisioned VR as an immersive experience. We wanted to feel like we were actually in the game rather than playing one. With this goal in mind, we began brainstorming options:

  • Escape room with hidden clues
  • Solar system explorer
  • Zombie shooter

After some thought, we decided the escape room would be too labor-intensive to develop and would require numerous assets that we couldn’t obtain without paying. In short, we didn’t feel we’d have enough resources and time to deliver an escape room to our satisfaction in the two-week sprint we were given. As for the solar system explorer, we never reached a consensus on moving forward with it: several other teams in our class were already developing exploration VR, and we wanted to try something different. That’s why we decided to go with the zombie shooter.

The Game:

We used this opportunity to design an immersive 3D audio game, one that instills a sense of fear as the user is dropped into a first-person survival scenario. The 3D audio creates an alternative navigation method in addition to deepening the user’s involvement in the scenario. We used sounds like ambient noise and the footsteps of approaching enemies to place the user in an adrenaline-pumping environment they won’t forget.

Actual screenshot from our game

The game is set in a scary, dark forest. The user is locked in place and equipped with a small flashlight. Staying alert is key, as audio cues signal where incoming enemies (zombies) are approaching from. The only way to stop an attack is to shine the ever-dimming light directly at them. If an enemy gets too close, it’s game over. Since the flashlight needs time to recharge after each use, the user will want to conserve the battery as much as possible. This means darkness consumes much of the environment, evoking both fear and excitement!

Our Development Process:

Although our professor recommended using A-Frame to create our VR scenes, we decided to go with Unity as our game development platform instead. As for the VR kit, we used Google Cardboard: a simple and affordable way to experience immersive VR.

Development tools

Why we chose Unity over A-Frame

Our project was focused on an interactive gaming experience, so it made sense to use a platform designed to create games. Unity has built-in support for Google Cardboard as well as a slew of tools available for us to develop with. One of the main motivating factors for choosing Unity was our focus on 3D audio.

A-Frame, on the other hand, is built on HTML. Although our team was confident we could create this experience with it, Unity offers free tools with the spatial audio already built in. For a project with such a short development cycle, shortcuts like this were paramount to our success. Additionally, Unity offers free game assets complete with animations. We wanted to deliver as compelling an experience as possible, so using premade assets like our zombie and nature scene let us focus much more on the meat of the project.

Overall, Unity allowed us to iterate on and design our game much faster than A-Frame would have. It gave us access to very powerful tools that accelerated our development and helped us deliver a much more complete, fleshed-out product at the end.

Creating 3D audio

Because our game centered on immersion, the delivery of our assets was key to the project’s success. Our design revolved around the audio, so qualities like randomness, variety, and theme mattered greatly. Each zombie was given several audio files from a free sound pack, and one was chosen at random each time it made a noise.

This provides a diverse experience where each playthrough differs from the last. The pack includes footsteps, zombie growls, and snapping twigs/leaves. We had initially wanted to add soft background noise for ambiance, but we found that it made the faint noises from faraway zombies difficult to hear, so we ultimately scrapped it.
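Our actual scripts lived in Unity and were written in C#, but the clip-selection idea can be sketched in a few lines of Python. The clip names are made up, and the no-immediate-repeat rule is an assumption on our part, a common trick to keep random audio from sounding mechanical:

```python
import random

# Illustrative clip names, not the actual files from our sound pack.
ZOMBIE_CLIPS = ["growl_01", "growl_02", "footstep_01", "twig_snap_01"]

def next_clip(clips, last=None):
    """Pick a random clip, re-rolling if it would repeat the previous one."""
    choice = random.choice(clips)
    if choice == last and len(clips) > 1:
        # Re-pick from the remaining clips to guarantee a different sound.
        choice = random.choice([c for c in clips if c != last])
    return choice

# Simulate a zombie emitting ten sounds with no immediate repeats.
last = None
for _ in range(10):
    last = next_clip(ZOMBIE_CLIPS, last)
```

In Unity terms, the equivalent is holding an array of audio clips on the zombie and picking an index at random before each playback.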

Roadblocks along the way

Although we were very happy with the sound design, and with how quickly a player could tell the direction of an approaching enemy from its sound, we ran into several issues implementing the audio itself. These stemmed from the scripting in Unity and from attaching multiple sound files to each zombie. We struggled to get zombies to play audio at random intervals after spawning, which was a key part of our design that let players react to approaching entities. The main culprit was the small amount of time we had to debug and resolve these problems; given more time to iterate on the design, we would have had this resolved.

Setting the scene…with cubes!

We began developing with a demo that came with a free asset we got online. It was a simple VR demo that placed the user in a small room with a cube that emits a constant sound. It came with 3d audio implemented, and when the user looked at and clicked on the cube it would move somewhere else.

Cube game demo

Since we wanted to create zombies that the user must find based on sound, this seemed like a great place to start. From here, we made the cube move. For our game, we wanted zombies to spawn randomly then all converge on the user in the center. We were able to identify the cube’s location in 3d space relative to the user and have it move slowly towards them over time. Once this was working, we created multiple copies of the cube since we wanted to have multiple zombies in the game at once. By the end of our modifications to this demo, we had several cubes spawning randomly around the user and all moving towards the user over time, all the while emitting a sound that could help the user find their locations.
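The spawn-and-converge behavior above was implemented as a Unity C# script attached to each cube; as a language-agnostic sketch of the geometry, here is the same logic in Python, with the player at the origin. The radius and speed values are made up for illustration:

```python
import math
import random

SPAWN_RADIUS = 20.0   # assumed distance at which enemies appear
SPEED = 1.5           # assumed movement speed, units per second

def spawn_position(radius=SPAWN_RADIUS):
    """Random point on a circle of the given radius around the player."""
    angle = random.uniform(0.0, 2.0 * math.pi)
    return (radius * math.cos(angle), radius * math.sin(angle))

def step_toward_player(pos, dt, speed=SPEED):
    """Move a fixed distance along the direction from pos to the origin."""
    x, z = pos
    dist = math.hypot(x, z)
    if dist <= speed * dt:          # close enough: reached the player
        return (0.0, 0.0)
    scale = (dist - speed * dt) / dist
    return (x * scale, z * scale)
```

Calling `spawn_position` once per enemy and `step_toward_player` every frame reproduces the effect we had: cubes appearing at random bearings and closing in on the user over time.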

Creating our dark forest environment

Now that we had 3D audio and multiple moving objects, we needed to build the forest scene and replace the cubes with zombies. To do this, we loaded in the free assets we found online to create our own forest, with hand-placed trees surrounding the user.

We also downloaded a free skybox, which showed the moon overhead and a dark, starry night all around. Next, we replaced the cubes with the zombie asset we had, which already came with animations. We changed the sounds they made to zombie sounds rather than cube sounds (don’t ask me what a cube sound is, but it doesn’t sound like a zombie). This gave us a very nice result: a dark forest with zombies spawning around the user and steadily approaching while making noise.

The final step was implementing the flashlight, for which we also found a free asset online. However, the asset was simply a beam of light coming from the user, so we wrote a script that turned the light on when the user tapped the screen, then off again after a second. The user then had to wait another full second before the light could be turned on again. Just shining light wasn’t enough, though; we wanted the user to be able to scare away or knock over zombies when the light hit them.
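The flashlight’s timing rule is simple enough to sketch on its own. The class below is an illustrative Python model of the behavior our Unity script implemented (the names and the one-second durations match the description above; everything else is an assumption):

```python
ON_DURATION = 1.0   # the light stays lit for one second per click
COOLDOWN = 1.0      # then it cannot be re-lit for another second

class Flashlight:
    def __init__(self):
        self.on_until = -1.0   # time at which the light switches off
        self.ready_at = 0.0    # earliest time a click works again

    def click(self, now):
        """Try to turn the light on; clicks during the cooldown are ignored."""
        if now >= self.ready_at:
            self.on_until = now + ON_DURATION
            self.ready_at = now + ON_DURATION + COOLDOWN

    def is_on(self, now):
        return now < self.on_until
```

In Unity, `click` would be driven by the Cardboard tap event and `is_on` checked each frame to enable or disable the light component.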

Accomplishing this proved much harder than we originally anticipated. There was no easy way to tell when a zombie was within the light coming from the flashlight. We ended up creating an invisible cone object that lined up perfectly with the light from the flashlight (which takes the shape of a cone as it extends outward into the scene).

Cone for zombie elimination detection

Then, we were able to detect when a zombie passed within the boundaries of the cone. If this happened when the flashlight was toggled on, we activated the falling animation of the zombie and took them out of the game.
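Unity’s cone collider gave us this intersection test for free, but the underlying geometry amounts to an angle-and-range check: a zombie is lit when the angle between the flashlight’s forward vector and the direction to the zombie is within the beam’s half-angle, and the zombie is within the beam’s reach. Here is a sketch of that check in Python; the half-angle and range values are illustrative, not measured from our scene:

```python
import math

HALF_ANGLE_DEG = 15.0   # assumed half-angle of the light cone
BEAM_RANGE = 25.0       # assumed maximum reach of the beam

def in_beam(forward, to_zombie,
            half_angle_deg=HALF_ANGLE_DEG, beam_range=BEAM_RANGE):
    """True if the offset vector to the zombie lies inside the light cone."""
    fx, fy, fz = forward
    zx, zy, zz = to_zombie
    dist = math.sqrt(zx*zx + zy*zy + zz*zz)
    if dist == 0.0 or dist > beam_range:
        return False
    flen = math.sqrt(fx*fx + fy*fy + fz*fz)
    # Compare the cosine of the angle against the cosine of the half-angle.
    cos_angle = (fx*zx + fy*zy + fz*zz) / (flen * dist)
    return cos_angle >= math.cos(math.radians(half_angle_deg))
```

The invisible-cone-collider approach we actually used trades this math for Unity’s built-in trigger events, which was the faster path given our two-week sprint.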

User testing and feedback

We conducted user testing around three prompts we gave our users: “I like,” “I wish,” and “What if?” “I like” captures a design choice or feature the user enjoyed during testing. “I wish” captures something, possibly infeasible, that the user would like changed, whether in a design approach or in game selection. “What if” captures a feasible change the user believes should be implemented in the game or its controls.

Overall, the feedback was very positive. We anticipated most, if not all, of the “I wish” suggestions, and our users were genuinely scared and excited by the game. Success!! Below is a live reaction from a user while playing the game.

Our Final Results:

The zombie shooter ended up working better than we expected given the timeframe. The 3D audio, animations, camera, and light controls were welcomed with open arms, and our prototype received very positive feedback. The task for our project was to create something immersive with 3D audio. We think we accomplished this, since everyone who tested our game had to listen to the audio to spot the zombies, and was in fact scared by them too!

Most of our final prototype came together really well. We successfully created a scary, dark forest scene. All of the zombie animations worked as expected, including walking and falling down. The flashlight on/off controls functioned flawlessly. Plus, each zombie played a variety of audio rather than the same clip over and over. We are very happy with how our game turned out.

If we had more time

Although the 3D audio worked, it was not as polished as we had hoped, and with some extra time we could have refined it. With that time, we would also have made the scene even darker than it already was with the flashlight off. In addition, we would have implemented a game-over screen for when the zombies reach the user. And lastly, we would have spent more time on collision detection between the trees, the player, and the zombies, so that zombies couldn’t walk through trees and would attack the player when near.

The Team: Matthew Harmon, Dan Kershner, Bhagawat Acharya, Andrew Gray

For those who want access to our git repository, our code for this project can be found here: https://gitlab.bucknell.edu/dtk008/hci_vr

Big thanks to Professor Peck!
