Shadows of a Stranger VR experience created by Andy Taylor


“Shadows of a Stranger VR Experience” is a mobile VR application based on the opening scene of the indie movie “Shadows of a Stranger”. I created the application myself, and it is designed to work on smartphones with a Google Cardboard headset. To build the experience, I predominantly used the Unity game engine, Blender and Adobe Photoshop, along with MonoDevelop for scripting in C#.

The aim of this document is to give a general feel for how the application works and how it was created. Before I delve into the process, please take a moment to view two short clips. Firstly, the movie scene that the experience is based on, and secondly a video run-through of the application in its current form. Please be aware that the movie, and therefore the app, contains adult themes which some people may find disturbing.

The Idea

The movie was created by Chris Clark and Richard Dutton, and is a dark psychological thriller with a Christmas theme, set in the mythical city of Meridian. It features household names such as Colin Baker (Doctor Who) and Colin McFarlane (Batman Begins). Actors were filmed against a green screen and later edited with the CGI scenes and backgrounds akin to the Hollywood movie Sin City.

I had been interested in creating a VR experience where the end user gets the opportunity to step into a movie scene and essentially feel how the onscreen character would have felt. I thought Shadows of a Stranger was the ideal movie for this, as it was predominantly CGI based. This meant there would be lots of artwork that I could use for assets and materials. I contacted the film makers to discuss my ideas and they jumped at the opportunity.

The Story of Building the App


After spending some time brainstorming and drawing up some basic ideas, I settled on the concept that the end user should experience the scene through the eyes of one of the detectives. They will find themselves in the flat looking towards the window of the room. I wanted them to have the freedom to visit locations throughout the room on their own terms, and interact with certain objects. Music and audio from the movie would add to the immersion and tie the whole thing together. Finally, I knew I wanted the experience to come to a dramatic finish, similar to the movie, which would be triggered by the user clicking to open the bathroom door.

Designing the room

I used Blender to design the 3D model of the room. I have created 3D rooms in previous projects, so I was comfortable producing something that was true to the movie and had realistic dimensions.

I imported this into Unity after fully completing and unwrapping the 3D model, and then started to create my materials to finish the room. I asked the film makers for the images they used to create the scene, and they sent me some JPG, PSD and PNG files such as the one below:

I used Photoshop to manipulate the images in various ways, to ensure they worked for my scene, and then I started to create materials. As I was building the application for a mobile device, I mainly used the “Mobile Diffuse” shader within Unity to ensure optimal performance.
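As a minimal sketch of what creating one of these materials looks like in code (the helper name is hypothetical — in practice I set most materials up in the Unity editor), the key point is simply requesting Unity's lightweight “Mobile/Diffuse” shader:

```csharp
using UnityEngine;

// Hypothetical helper: builds a material on Unity's built-in
// "Mobile/Diffuse" shader, which skips per-pixel lighting features
// for better performance on phones.
public static class MaterialFactory
{
    public static Material CreateMobileDiffuse(Texture2D albedo)
    {
        var mat = new Material(Shader.Find("Mobile/Diffuse"));
        mat.mainTexture = albedo; // the Photoshop-edited texture
        return mat;
    }
}
```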

Designing Objects

There are various points of interest in the room, some of which were easier to create than others. Assets such as the toilet and radio are open-source, copyright-free 3D models that I found online, whereas I created the mattress and bath from scratch. The most challenging prefabs were the severed head and the decapitated body in the bath, both of which were created in Blender. To keep things brief, I will just discuss how I went about completing the severed head.

The film makers had provided me with an image of the severed head as a reference.

I started by purchasing a generic 3D model of a female head and then began to sculpt it within Blender.

I then used a displacement image of the head and started to layer this with textures from the original png file that I had been given. Once happy with the position of the texture over the image, I was able to create a new material in Unity and place this over the 3D mesh.

Finally, I had to create materials for the eyes, teeth and hair. The hair was a challenge, and I created many different versions before finally settling on a combination of a Blender-generated mesh merged with a plane that had a transparent shader attached.

Setting the mood with lighting and audio

Baked Lighting

I used a variety of lighting techniques to set the mood. Nearly all the lighting is baked to ensure optimal performance on mobile devices. I used a faint directional light outside the room and angled this towards the window to create the effect of moonlight shining through and hitting the floor. The room was lit with point lights and spotlights angled towards key points of interest. This process took a lot of experimentation and iteration to get the mood just right, as I needed it to feel dark yet light enough to see the room clearly.

Real-time lighting

I originally had two real-time light effects, but due to performance issues, I had to limit this to one. In the movie there is a flickering light effect that adds to the suspense of the scene. I wanted this in my application but knew I had to be careful, as flickering lights in VR can be uncomfortable for the user. I experimented with some C# code to achieve the flickering effect and settled on the right intensity for the light.
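The flicker code ended up along these lines — a minimal sketch, with illustrative intensity and timing values rather than the exact numbers used in the app:

```csharp
using System.Collections;
using UnityEngine;

// Sketch of a flickering-light script: randomly varies the light's
// intensity at short, random intervals. Keeping the range narrow and
// the changes slow enough helps avoid discomfort in VR.
[RequireComponent(typeof(Light))]
public class FlickeringLight : MonoBehaviour
{
    public float minIntensity = 0.3f;   // illustrative values
    public float maxIntensity = 0.8f;
    public float maxFlickerDelay = 0.15f;

    private Light _light;

    void Awake()
    {
        _light = GetComponent<Light>();
    }

    IEnumerator Start()
    {
        while (true)
        {
            _light.intensity = Random.Range(minIntensity, maxIntensity);
            yield return new WaitForSeconds(
                Random.Range(0.02f, maxFlickerDelay));
        }
    }
}
```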

I had originally included a flashlight effect that moved around with the user’s gaze, as this was featured in the movie. However, I had to remove it due to performance issues and the fact that it wasn’t compatible with mobile-level shaders. I will be looking to include it in a high-end version of the application for the Oculus Rift.


Using Unity’s built-in audio listener technology, I created a 3D audio soundscape to complement the visuals. I used background audio taken from the movie and mixed this with an audio sample of a buzzing fly to give an eerie feel. Dialogue and music clips were placed at different waypoints throughout the room, coded to trigger when the user completed certain tasks. While creating the app, I had the idea to place a radio in the scene that, once clicked by the user, toggles between pieces of music featured in the film.
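The radio behaviour can be sketched roughly as follows. This is an illustrative version, not the exact project script — the field names are assumptions, and the click handler is assumed to be wired up to the gaze pointer’s click event in the Inspector:

```csharp
using UnityEngine;

// Sketch: each click on the radio advances to the next music track
// from the film. spatialBlend = 1 makes the source fully 3D, so the
// music sits in the room's soundscape rather than playing "in the ears".
[RequireComponent(typeof(AudioSource))]
public class RadioToggle : MonoBehaviour
{
    public AudioClip[] tracks;  // music clips featured in the movie
    private int _index = -1;
    private AudioSource _source;

    void Awake()
    {
        _source = GetComponent<AudioSource>();
        _source.spatialBlend = 1f; // fully 3D audio
    }

    // Assumed to be called by the reticle's click event.
    public void OnRadioClicked()
    {
        if (tracks.Length == 0) return;
        _index = (_index + 1) % tracks.Length;
        _source.clip = tracks[_index];
        _source.Play();
    }
}
```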

Navigation and Interactions

During this phase, most of my time was spent writing the code behind the game logic. The coding was in the C# programming language. I used a combination of scripts provided within the GVR SDK, scripts I had already written for previous projects, and new bespoke code that the scene required.

A fully immersive user experience was my goal, and I knew I had to get the waypoint system right for this to be achieved. This was another area where experimentation and iteration was key to its success.

After trying various ways to navigate the player around the room, I settled on a waypoint system coupled with the GVR reticle pointer within the Google Cardboard SDK. I added a line of code to the GVR reticle pointer script to hide the visual pointer until the player gazed at an interactive object. I did this because I found that having a pointer constantly in front of you affected the immersion.

I placed four waypoints around the room using an existing script, coupled with box colliders. I added an empty game object to the camera, which I used to keep track of the player and to trigger certain events. I created a prefab of the symbol that the serial killer in the movie uses as his calling card, and placed this on the ground at each of the waypoints. I then used code to ensure it only appeared when the user gazed at the specific waypoint, so it acted as a reference, letting the player know where they are about to move in the room.
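The marker logic is simple enough to sketch in a few lines. This is an illustrative version: the handlers are assumed to be driven by gaze enter/exit events (via Unity’s EventTrigger, which the GVR SDK feeds from the reticle), and the field name is mine:

```csharp
using UnityEngine;

// Sketch: the killer's calling-card symbol is shown only while the
// player gazes at this waypoint's collider, marking where a click
// will move them to.
public class WaypointMarker : MonoBehaviour
{
    public GameObject symbol;  // calling-card prefab on the floor

    void Awake()
    {
        symbol.SetActive(false); // hidden until gazed at
    }

    public void OnGazeEnter() { symbol.SetActive(true); }
    public void OnGazeExit()  { symbol.SetActive(false); }
}
```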

· Waypoint 1 — the head — spooky audio, including dialogue from the movie, is triggered once the player reaches here.

· Waypoint 2 — the radio — here the player can look out over the snowy Meridian city and trigger music from the film by clicking on the radio.

· Waypoint 3 — the mattress — if waypoints 1 and 2 have been visited, the bathroom door unlocks and opens ajar, accompanied by suspense music from the movie to signal to the player that this should be their next destination. (If waypoints 1 and 2 have not yet been explored, nothing happens.)

· Waypoint 4 — the bathroom — the player can now open the bathroom door by clicking, which triggers the end scene. Music from the same point in the film plays, and the user is gently moved through the bathroom, similar to how the camera pans in the movie.
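The door-gating logic behind waypoint 3 can be sketched like this. Field and trigger names are assumptions for the example; in the project, visits are tracked via the empty game object attached to the camera:

```csharp
using UnityEngine;

// Sketch of the bathroom-door gate: the door only unlocks and opens
// ajar once waypoints 1 (the head) and 2 (the radio) have both been
// visited, and it only does so once.
public class BathroomDoor : MonoBehaviour
{
    public bool headVisited;         // set when waypoint 1 is reached
    public bool radioVisited;        // set when waypoint 2 is reached
    public AudioSource suspenseMusic;
    public Animator doorAnimator;    // assumed "OpenAjar" trigger

    private bool _unlocked;

    // Assumed to be called when the player reaches the mattress.
    public void OnMattressReached()
    {
        // Nothing happens until both earlier areas are explored.
        if (_unlocked || !headVisited || !radioVisited)
            return;

        _unlocked = true;
        doorAnimator.SetTrigger("OpenAjar");
        suspenseMusic.Play();
    }
}
```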

User Testing and Iterations

I constantly test and tweak my work as I go along but also believe in regular user testing. I think it is important to conduct user tests, as I may not be the best person to judge the quality and effectiveness of my creations. Please see below for some changes that I will be looking to make, based on the latest round of user testing:

· Navigation — most people found clicking the Cardboard button to navigate around the room quite intuitive, but some who were very new to VR needed it explained to them, so I am considering adding a quick instruction screen while the app is loading.

· Interactions — because the user is free to move around the room as they please, there is a chance that they could move to the mattress area first. This area has nothing to interact with, and one of the film directors suggested a slide show of behind the scenes images could be projected onto the wall in that area. This sounds like a great idea to me.

· Audio — the bathroom door remains locked until all of the other waypoints have been explored. This is to prevent the user from finishing the scene too early. At the moment, a locked-door sound effect is triggered when the user tries to enter the locked bathroom, but during user testing it became apparent that this was not enough to prompt the player to look around the room more. One user thought the app had stopped working. With this in mind, we are going to record an audio sample of one of the actors stating that the door is locked and that the user should explore more and come back later.


I got great pleasure from seeing the film makers try the application for the first time. They seemed to really enjoy the experience and one said “this is amazing, I feel like I’m inside my movie”. This project has given me the opportunity to apply knowledge that I have acquired over time, along with gaining new skills through problem solving and experimentation.

The application is now in the second phase of user testing and I am in the process of optimising the application, based on the feedback I have received. As the movie has a Christmas theme I am looking to release the mobile version of the app to the public in December 2017. I have also started work on the high end version for the Oculus Rift, which is really exciting.

Thank you for taking the time to read my article. 😃


