How We Built a VR Snow Globe
View it live at 2017.ronikdesign.com
At Ronik, experimenting with new technology and mediums to develop creative client gifts is an annual tradition. In 2016, we provided clients with custom Google Cardboard headsets. This year we gave them a reason to break them out again.
This project was built with and made possible by A-Frame, a WebVR framework by Mozilla. A-Frame makes it super easy to create VR experiences with code that looks a lot like HTML. If you can write HTML, you can start creating VR experiences.
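For readers new to A-Frame, a minimal scene really is just markup. This hello-world sketch is illustrative (it is not code from our project, and the release version in the script URL is one from that era):

```html
<!-- Minimal A-Frame scene: a box, a fill light, and a camera -->
<html>
  <head>
    <script src="https://aframe.io/releases/0.7.0/aframe.min.js"></script>
  </head>
  <body>
    <a-scene>
      <a-box position="0 1 -3" color="#4CC3D9"></a-box>
      <a-light type="ambient" color="#FFF"></a-light>
      <a-entity camera look-controls position="0 1.6 0"></a-entity>
    </a-scene>
  </body>
</html>
```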
Creating The Models
Using a combination of Blender, Cinema 4D, and Photoshop, we modeled and textured all of the objects in the scene. Each building began as a cube or series of cubes; playing with scale and extrusion let us add more intricate detail. Once we had the shape of each building, we moved on to textures, which we painted by hand in Photoshop using color fills and brushes.
After creating all of the buildings and environment individually, we combined them into a single model to avoid having to position each building with code. However, we had to split out the elements that we wanted to animate, such as the clouds and satellite.
Importing Models and Performance
You can import many different 3D file types into A-Frame, such as OBJ, JSON, Collada, and glTF. We found the least friction with the Collada format, so that’s what we went with. If we were to do it again, we’d use a more performant format such as glTF.
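Loading a model in A-Frame is declarative: register the file in `<a-assets>`, then reference it from an entity. The file names below are illustrative, not our actual asset paths:

```html
<a-scene>
  <a-assets>
    <!-- Preload the exported Collada model -->
    <a-asset-item id="city" src="models/city.dae"></a-asset-item>
  </a-assets>
  <!-- collada-model for .dae files; gltf-model works the same way for glTF -->
  <a-entity collada-model="#city"></a-entity>
</a-scene>
```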
When using WebVR frameworks, keeping the polygon count low is critical for achieving a consistent framerate. Our scene consists of about 50,000 faces, which is at the upper end of what’s recommended for today’s smartphones. Halfway through the modeling process we had to go back and reduce detail because of these limitations.
Lighting is a key aspect of a scene’s aesthetic. For our snow globe, we used three lights: the sun and moon have tinted lights positioned inside their models, and an ambient light acts as a neutral fill for the rest of the scene.
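A sketch of that three-light setup in A-Frame markup; the colors, intensities, and positions here are placeholders, not our production values:

```html
<!-- Neutral fill for the whole globe -->
<a-light type="ambient" color="#BBB" intensity="0.6"></a-light>
<!-- Warm point light parented inside the sun model -->
<a-entity id="sun" position="0 10 0">
  <a-light type="point" color="#FFD9A0" intensity="1.2"></a-light>
</a-entity>
<!-- Cool point light parented inside the moon model -->
<a-entity id="moon" position="0 -10 0">
  <a-light type="point" color="#A0C8FF" intensity="0.8"></a-light>
</a-entity>
```

Parenting each light inside its model means the light travels with the sun or moon automatically as they move.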
To make the scene more dynamic, we wrote a component to rotate the lights based on what time it is. At noon the sun is at its highest, and at midnight the moon is at its highest.
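A minimal sketch of such a component, assuming the sun and moon lights hang off a shared pivot entity that rotates with the clock. This is hypothetical code in the spirit of our component, not our exact implementation:

```html
<script>
  // Rotate the pivot so the sun peaks at noon and the moon at midnight.
  AFRAME.registerComponent('time-of-day', {
    tick: function () {
      var now = new Date();
      var hours = now.getHours() + now.getMinutes() / 60;
      // Map 24 hours onto 360 degrees, offset so the sun is overhead at 12:00.
      var angle = (hours / 24) * 360 - 180;
      this.el.setAttribute('rotation', {x: 0, y: 0, z: angle});
    }
  });
</script>
<!-- Sun above, moon below; the component spins this pivot -->
<a-entity time-of-day>
  <a-light type="point" color="#FFD9A0" position="0 10 0"></a-light>
  <a-light type="point" color="#A0C8FF" position="0 -10 0"></a-light>
</a-entity>
```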
Utilizing different perspectives is what makes the experience special, and that is done with cameras. There are two: a wide-angle orbital camera and a first-person camera positioned in the blimp. We switch to the first-person camera when the user clicks the “Enter VR” button or presses the “return” key, and switch back the same way.
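In A-Frame, switching views comes down to toggling the `active` property of the camera component. A hedged sketch of the idea (the ids and key handling are illustrative):

```html
<a-entity id="orbit-cam" camera="active: true"></a-entity>
<a-entity id="blimp">
  <a-entity id="blimp-cam" camera="active: false"></a-entity>
</a-entity>
<script>
  // Illustrative toggle: the "return" key swaps which camera is active.
  window.addEventListener('keyup', function (e) {
    if (e.key !== 'Enter') { return; }
    var orbit = document.querySelector('#orbit-cam');
    var blimp = document.querySelector('#blimp-cam');
    var orbitActive = orbit.getAttribute('camera').active;
    orbit.setAttribute('camera', 'active', !orbitActive);
    blimp.setAttribute('camera', 'active', orbitActive);
  });
</script>
```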
One of the benefits of developing with A-Frame is the wealth of shareable components written by other developers that you can incorporate into your scene. Here’s a list of the ones we utilized for some of the key elements and features:
- Along Path by protyze for creating the track that the blimp runs on.
- Orbit Controls by subsumo for the wide angle orbital camera.
- Particle System by IdeaSpace to create the snow.
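Once their scripts are included, these components become just more attributes on entities. Exact schemas vary by component version, so treat the property names and values below as placeholders rather than our configuration:

```html
<!-- Snow via the particle-system component's built-in preset -->
<a-entity particle-system="preset: snow; particleCount: 3000"></a-entity>
<!-- Blimp following a closed curve, in the style of the along-path component -->
<a-curve id="track">
  <a-curve-point position="0 4 -4"></a-curve-point>
  <a-curve-point position="4 4 0"></a-curve-point>
  <a-curve-point position="0 4 4"></a-curve-point>
</a-curve>
<a-entity id="blimp" alongpath="curve: #track; dur: 20000; loop: true"></a-entity>
```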
We also created a few project-specific components to add functionality to the scene, such as camera toggling, repositioning the lights based on the time of day, audio controls, and handling the loading/intro screens.
Sound design is an often overlooked but critical aspect of every virtual reality experience. If it doesn’t sound like you’re there, you won’t feel like you’re there. A-Frame makes it easy to add positional audio into your scene, simply by attaching a sound emitter entity to your objects.
There are five looping audio tracks in the scene: a low rumble attached to the blimp, a sci-fi computer sound attached to the satellite, running water at the base of the river, generic city noise in the center cluster of buildings, and the ambient piano track positioned at the origin.
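Attaching one of these emitters is a single attribute via A-Frame’s sound component; the browser then pans and attenuates the audio as the camera moves relative to the entity. File paths here are illustrative:

```html
<a-assets>
  <audio id="rumble" src="audio/rumble.mp3" preload="auto"></audio>
</a-assets>
<!-- The sound component makes the clip positional: the rumble
     gets louder as the camera approaches the blimp. -->
<a-entity id="blimp" sound="src: #rumble; loop: true; autoplay: true"></a-entity>
```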
To create each sound, we modified existing clips from freesound.org. We shortened, distorted, and optimized the clips in Audacity.
When you create a project with A-Frame, there is no built-in intro screen or loading indicator. To smooth the experience for a relatively heavy website, we created a splash screen. It gave us the opportunity to prep users for what they were about to see and to reinforce that a full VR experience was waiting for anyone with a Google Cardboard.
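One way to wire up such a splash screen, assuming a plain DOM overlay that is hidden once A-Frame fires its `loaded` event on the scene (a sketch, not our production code):

```html
<div id="splash">Loading the snow globe…</div>
<a-scene id="scene">
  <!-- scene entities go here -->
</a-scene>
<script>
  // Hide the overlay once assets are in and the scene has initialized.
  var sceneEl = document.querySelector('#scene');
  sceneEl.addEventListener('loaded', function () {
    document.querySelector('#splash').style.display = 'none';
  });
</script>
```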
For final polish, we integrated TakeShape to content manage the intro screen and key features of the scene. From TakeShape’s interface you can change the speed of the blimp, colors of the lights, camera FOV, snow particle count, and more.
The A-Frame community continues to be helpful to newcomers to WebVR. If you’re interested in using A-Frame for your own projects, the Slack channel is a great place to start; you can generate an invite here.
You can download the models we’ve created for this project here. Feel free to use them for any project, just don’t resell them.