VR Development

Telling a story or creating an experience requires ideas and, above all, planning. Moreover, making it into something tangible requires a lot of effort and user testing.

So, to start development, we need to choose our target platform, and by doing so we are also filtering our target audience. Google Cardboard is a challenging VR target because it is the most accessible VR device, but it is also the most diverse and, in some ways, the most limited.

Planning our Platform

Once we choose a mobile platform, everything revolves around optimization and scene baking. Even so, a solid base of work and planning is needed to reduce trial and error and keep the focus on the experience's objective.

VR Puzzler

We started with a simple idea: create an environment for a well-known game. So we decided to build a scene around the Genius (Simon) memory game. The initial concept was a scene where you can only cross a border if you solve the puzzle, so we conceived Let Me In. In the video below you can see how the game works and the final result.

VR Puzzle — “Gameplay”
Screenshots taken on an LG G3

From the Ashes

Statement of Purpose: Let Me In is a mobile VR experience which demonstrates a new way to recreate and play well-known games in a new context.

Our Personas

Persona 01
Persona 02


The level design starts with a simple sketch to visualize what we need to build. Yes, my sketches are horrible, but they are enough for me to understand what I need to do (next time I will use Illustrator).

Basic Scene Concept

After understanding the sketch and what we need, we can search for or create our assets, import them into Unity, and create the Prefabs (assets prepared for use in Unity). Now it is time to build the level.

Building the Scene

Lighting and Baking

The level needs to create a mood, and we can use light and sound to create it. Moreover, since we are targeting a mobile device, we need to bake our lighting to achieve a realistic look, and there are some tricks to getting a good result.

UI and Baked Scene

First User Test

To check whether the mood and look worked for the game, I ran the first user test. The tester was my wife. We used a spreadsheet to take notes.

First Test

She felt the mood and the scene were right, but she complained about the quality (“… it is not realistic …”). She was right: we hadn't finished the lighting. We also noticed that we needed more torches to create a brighter scene (not too bright, but not too dark), and that with a darker scene we started to see some rendering artifacts. Unfortunately, we couldn't ask many questions (I didn't want to start a fight :) ).

After analyzing the answers, we started a more complex lighting setup to bring more realism to the scene.

Second User Test

The second user test was planned to analyze the UI and whether the user could understand what to do. Again, my wife did the test.

Second Test

The answers were great, but we noticed that we needed audio to simplify the puzzle solution. The sizes and speeds also needed correction.

We decreased the UI size by 30% and the walk speed by 20%. We also started on the audio feedback system and the final game logic.

Third User Test

For the final test, my wife tried the game with the audio and the final scene lighting. The test went very well, and she said the mood was more noticeable than in the previous version of the scene.
We also added a success audio cue, because she didn't understand why she was walking outside after entering the right sequence. She said it was important for her to know when she succeeded and when she failed.

We added success audio feedback that plays before the walk starts.

Audio and Easter Eggs

Audio impacts our perception of the environment, so using it with spatial capabilities creates amazing immersion. We added some audio cues for feedback, plus ambient audio. We also added spatialized fire audio and an animated flame for each torch.
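The torch sources rely on distance attenuation: each source gets quieter as the player moves away from it. A toy model of the inverse-distance falloff (a Python sketch for illustration only; the project configures this through Unity's audio settings, and `min_dist`/`max_dist` are hypothetical values, not the ones used in the game):

```python
def torch_gain(distance, min_dist=1.0, max_dist=20.0):
    """Inverse-distance gain: full volume inside min_dist, silent past max_dist."""
    if distance <= min_dist:
        return 1.0   # listener is right next to the torch
    if distance >= max_dist:
        return 0.0   # too far away to hear
    return min_dist / distance  # halves every time the distance doubles
```

With a few torches placed along the wall, each one contributes its own gain, which is what gives the player a sense of position as they walk past them.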

Spatial Audio Coverage and Easter Eggs

To give the player a reason to explore the view behind them, we added an animated character from Mixamo: the guard. If the player spends some time looking around, he will see the guard.

Breakdown of the final piece


The user starts 20 m from the border office door. He sees a UI menu with two options: Start and About. The first leads him to the game interaction position using a linear walking locomotion system. The second shows a new UI panel with the about statement and a back button to the start UI.
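The linear walk boils down to moving the camera rig a constant distance toward a fixed target every frame. A rough sketch of that per-frame step (Python for clarity; the project does this with Unity's vector math, and `speed`/`dt` are hypothetical parameters):

```python
def step_toward(position, target, speed, dt):
    """Move `position` toward `target` at `speed` units/s over a frame of `dt` s."""
    delta = tuple(t - p for p, t in zip(position, target))
    dist = sum(d * d for d in delta) ** 0.5
    max_step = speed * dt
    if dist <= max_step:
        return target  # close enough: snap to the target this frame
    scale = max_step / dist
    return tuple(p + d * scale for p, d in zip(position, delta))
```

Calling this every frame until the position reaches the target gives the straight 20 m approach to the door; lowering `speed` by 20%, as we did after the second test, just stretches that walk out over more frames.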

Playing the Game

The user sees five blue balls, which start blinking in a certain (randomly generated) order, each blink accompanied by audio feedback. The user must repeat the pattern by targeting the correct balls with his gaze. If he fails, an error sound plays and the sequence is shown again.
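This loop is essentially a Simon-style memory game. A minimal sketch of that logic, in plain Python for clarity (the project itself implements it in Unity C#; the class and method names here are hypothetical):

```python
import random

class PuzzleSequence:
    """Simon-style puzzle: show a random blink order, then check gaze picks."""

    def __init__(self, num_balls=5, length=5, rng=None):
        self.rng = rng or random.Random()
        self.num_balls = num_balls
        self.length = length
        self.new_sequence()

    def new_sequence(self):
        # Randomly generated order of ball indices to blink at the player.
        self.sequence = [self.rng.randrange(self.num_balls)
                         for _ in range(self.length)]
        self.progress = 0  # how many picks the player has matched so far

    def gaze_select(self, ball_index):
        """Handle one gaze pick; returns 'fail', 'continue', or 'success'."""
        if ball_index != self.sequence[self.progress]:
            self.progress = 0  # play the error sound and replay the sequence
            return "fail"
        self.progress += 1
        if self.progress == len(self.sequence):
            return "success"   # play the success cue; the caller restarts the game
        return "continue"
```

Keeping the state machine this small made it easy to wire the audio feedback in later: "fail" triggers the error sound and a replay of the blink sequence, and "success" triggers the cue we added after the third user test.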

The End

If the user performs the correct order, a success sound plays, and the game starts again.


The project was a lot of fun, and it was very important to go through all the steps of building something with minimal documentation. It is extremely important to test with the real Cardboard, because there are some problems you can only see while actually using it, and that takes time. It is also not so simple to find what you need to change or optimize, but that is how we need to test VR: using the final piece, not only the game view in the editor.

Next Steps

There are several extra elements that we can add to the project. Also, our user testing shows us some elements to explore. For example:

  • The player could talk a little with the guard, and we could add more interaction with him.
  • It would be cool if the gate lowered a bit each time you fail, and if you fail too often (3 times?), the scene fades to black and the game restarts.
  • It would be great to have a timer and a player ranking panel.
  • Also, we could share results on Facebook.

Additional Links

The game file for Android is available Here.

Also, the source code is available at https://github.com/rodrigomas/vr-puzzler