Night at the Museum Project (Udacity VR Nanodegree)

Walkthrough of the Museum Experience


This post-mortem introduces Night at the Museum, the fifth project for the Udacity VR Developer Nanodegree. Night at the Museum is an interactive virtual reality (VR) museum showcasing five ways that VR is changing an existing industry: it consists of five exhibits demonstrating how VR is disrupting Law Enforcement and Criminal Justice.

The exhibits include dioramas, videos, and interactive experiences, all accessible in rooms throughout the museum.

Special features in the project include a gaze-based open movement mechanic and an advanced scene management structure where the user is transported to an entirely separate scene for several exhibits. A trigger-based system for showing and hiding game objects and playing video has been implemented.
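As a rough sketch of how such a trigger-based system might look in Unity (the component and tag names here are illustrative assumptions, not the project's actual code), a trigger collider can toggle exhibit objects and a video when the player enters or leaves:

```csharp
using UnityEngine;

// Hypothetical sketch: show a set of exhibit objects and play a video
// when the player enters a trigger volume; hide and stop them on exit.
public class ExhibitTrigger : MonoBehaviour
{
    public GameObject[] exhibitObjects;          // objects to show/hide
    public UnityEngine.Video.VideoPlayer video;  // optional exhibit video

    void OnTriggerEnter(Collider other)
    {
        if (!other.CompareTag("Player")) return;
        foreach (var go in exhibitObjects) go.SetActive(true);
        if (video != null) video.Play();
    }

    void OnTriggerExit(Collider other)
    {
        if (!other.CompareTag("Player")) return;
        foreach (var go in exhibitObjects) go.SetActive(false);
        if (video != null) video.Stop();
    }
}
```

A setup like this keeps each exhibit self-contained: the trigger volume, the objects it reveals, and the video it plays can all live under one parent object in the scene.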

Experience on a mobile device.


Since this project revolved around presenting real information, research took up the large majority of the planning and brainstorming phase, during which many topics were explored.

A user-testing-centered approach was chosen, with a heightened focus on how well the exhibits communicated their information. The user tests focused first on the educational aspects, then on usability, and finally on aesthetics.

The project went through many iterations; in each one, the mechanics, artwork, and interactions were implemented and then tested. A performance-testing phase, explained later, was added before the final art and lighting pass.


Statement of Purpose

Night at the Museum is an immersive mobile VR museum teaching the user five ways VR is changing Law Enforcement and Criminal Justice.

User Persona

Justine, 34 — Lawyer

Justine is a recent hire at a law firm specializing in patent law, with a side interest in criminal justice. She’s new to virtual reality but interested in its possibilities. She wants a gentle introduction to the platform while learning more about how VR may change her work life.

Research and Sketching

Since this was an educational experience, a lot of time was spent in the research phase of the project. The role of VR in many different industries was considered, including health care, film production, fitness, manufacturing, and retail.

Criminal justice was settled on because it offers a good mix of current and future solutions. It also seemed like the industry most broadly affected, across professions from police officers to lawyers and even prisoners.

Mood boards and color and art-style boards were created. A somber mood was decided upon because the experience showcases criminal justice and law enforcement. A design decision was made to keep the museum looking realistic while giving the exhibits a “cartoony” feel: since some of the exhibit content is violent, the hope was that a less realistic rendering would ease any discomfort for some users.

Due to timeline and modeling time restrictions, research was put into acquiring existing assets from the asset store that could be utilized instead of fully custom models.

Based on these decisions, sketches were created during the different design/development iterations.

One sketch to rule them all. Museum plan and exhibits.

This sketch shows the plan for the final version in the prison. Like all sketches, it resembles the final experience but differs in some key ways.

User Testing

Environment and Mood

Criminals in an art museum didn’t feel right.

In early iterations, a typical art-museum-style environment was built to house the exhibits. Users found this confusing given the subject matter and themes.

Criminals in a prison make more sense.

A prison environment was chosen instead to match the theme and subject matter.


Since the user is transported to three different environments in addition to the museum, testing was performed to achieve a consistent human scale across all scenes.

The original scale makes the user feel 2 feet tall.
The final scale feels much more realistic.

Stencil Exhibit

The original plan for the interactive portion of the prisoner exhibit was to have the user put on a pair of “goggles” to see the outside world.

Users got confused by the “goggles” mechanic.

These goggles could see through the walls. This was ultimately confusing for the users and didn’t match the intent of the exhibit, which was to “explore the outside world from within the cell.”

They preferred to be transported instead.

A mechanic where the user was transported to a new space was implemented instead.

Exhibit Cubes

While testing the user mechanics, it was discovered that a text-based button for navigating between the museum and the exhibits distracted users from the exhibit text.

Impatient users clicked the buttons right away.

They often clicked the button without reading the text or experiencing the videos.

The cube was clicked after reading.

In moving to interactive cubes, the extra cognitive load caused the users we tested to slow down enough to take in the written content and videos. This was a surprising finding: adding a small cognitive barrier to the experience led to the proper user flow.

Tuning the Cubes

User testing uncovered a few issues with the floating cubes. Using the same look for different activities caused some confusion, so blue and yellow cubes were chosen for entering and exiting, respectively.

Blue cubes enter the experience.
Yellow cubes exit.

Additionally, since the cube is now a 3D object, the user could get close enough to cause camera issues. Code was built to hide the cube when the user got too close.
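A minimal sketch of that proximity check in Unity (the threshold value and component layout here are assumptions for illustration, not the project's actual code):

```csharp
using UnityEngine;

// Hypothetical sketch: hide the cube's renderer when the camera gets
// within a minimum distance, and show it again when the user backs away.
public class HideWhenClose : MonoBehaviour
{
    public float minDistance = 0.75f;  // tune against the camera's near plane
    private Renderer cubeRenderer;

    void Start()
    {
        cubeRenderer = GetComponent<Renderer>();
    }

    void Update()
    {
        float dist = Vector3.Distance(
            Camera.main.transform.position, transform.position);
        cubeRenderer.enabled = dist > minDistance;
    }
}
```

Disabling only the renderer (rather than the whole object) keeps the cube's collider active, so gaze interaction still works if the user steps back.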

Too close and the camera cuts it off.

Performance Tuning

Due to its realistic design, the scene had a lot of polygons and large textures. The addition of many exhibit elements into this scene led to an experience that was slow.

Mobile platforms saw performance as low as 5 frames per second which is not acceptable in VR. A lot of testing and analysis went into discovering how to get the framerate up to the 60+ fps that a comfortable VR experience requires.

Although many things were tried, two changes accounted for the majority of the performance improvement and may be useful to other developers.

Splitting out the Scenes

Instead of keeping the exhibits and museum all within one scene and hiding or showing as necessary, we broke out the immersive exhibits into separate scenes. When we wanted to show the scene, we loaded the new scene on entrance and reloaded the museum on exit.

This caused a loss of progress in the experience, as the user would always restart from scratch, so Unity’s multiple-scene functionality was used instead: a base scene containing the player and basic scene management loaded sub-scenes with Unity’s additive scene loading.

This led to some downstream lighting issues and some code refactoring to find cross-scene objects in code instead of passing them in as public variables, but it gave us a 20+ fps boost.
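A minimal sketch of that structure, assuming the base scene holds a manager like the one below (names are illustrative, not the project's actual classes):

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Hypothetical sketch: the base scene keeps the player and this manager
// alive while exhibit scenes are loaded and unloaded additively.
public class ExhibitSceneManager : MonoBehaviour
{
    private string currentExhibit;

    public void EnterExhibit(string sceneName)
    {
        // Additive loading keeps the base scene (and player state) intact.
        SceneManager.LoadSceneAsync(sceneName, LoadSceneMode.Additive);
        currentExhibit = sceneName;
    }

    public void ExitExhibit()
    {
        if (!string.IsNullOrEmpty(currentExhibit))
        {
            SceneManager.UnloadSceneAsync(currentExhibit);
            currentExhibit = null;
        }
    }
}
```

Because the player persists in the base scene, scripts in the loaded exhibit scenes would need to look up cross-scene objects at runtime (for example with `FindObjectOfType`) rather than receiving them as serialized references, which is the refactoring mentioned above.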

Skinned Mesh Renderers

The mannequins used in the exhibits were designed as animated characters, so each had an animation skeleton driving a skinned mesh, which Unity renders with a component called a SkinnedMeshRenderer.

Rendering these was incredibly slow on mobile devices, apparently due to the CPU load SkinnedMeshRenderers impose.

To get around this, the mannequins were kept in static poses, and the skinned meshes were baked to static meshes using this technique. This change saved us 30+ fps.
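The core of the baking step can be sketched like this, using Unity's `SkinnedMeshRenderer.BakeMesh` (the helper class and swap logic here are an illustrative assumption of the general approach, not the project's exact code):

```csharp
using UnityEngine;

// Hypothetical sketch: snapshot a SkinnedMeshRenderer's current pose
// into a static mesh and swap in a cheaper MeshFilter + MeshRenderer.
public static class MeshBaker
{
    public static void BakeToStatic(SkinnedMeshRenderer skinned)
    {
        var baked = new Mesh();
        skinned.BakeMesh(baked);  // captures the current pose as static geometry

        var go = skinned.gameObject;
        var filter = go.AddComponent<MeshFilter>();
        filter.sharedMesh = baked;
        var renderer = go.AddComponent<MeshRenderer>();
        renderer.sharedMaterials = skinned.sharedMaterials;

        Object.Destroy(skinned);  // remove the expensive skinned component
    }
}
```

Once baked, the mannequins render through the ordinary static mesh path, avoiding the per-frame skinning cost entirely.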

Breakdown of the Final Piece

Museum Design

Tutorial hallway.
Exhibit hallway.

The user starts the experience in a museum that appears to be created in an old prison. The museum is old, deteriorated, and a bit claustrophobic.

The user looks at the floor and clicks to move.

The user moves by gazing at the ground through the headset and clicking the button. An indicator shows where the user will land after clicking.
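A minimal sketch of this gaze-and-click movement, assuming a raycast against a floor layer (the rig layout, layer mask, and input button are illustrative assumptions, not the project's actual code):

```csharp
using UnityEngine;

// Hypothetical sketch: raycast from the camera, place an indicator where
// the gaze hits the floor, and move the player rig there on click.
public class GazeTeleport : MonoBehaviour
{
    public Transform playerRig;   // root of the camera rig
    public GameObject indicator;  // marker shown at the target point
    public LayerMask floorLayer;  // only teleport onto the floor

    void Update()
    {
        var cam = Camera.main.transform;
        RaycastHit hit;
        if (Physics.Raycast(cam.position, cam.forward, out hit, 20f, floorLayer))
        {
            indicator.SetActive(true);
            indicator.transform.position = hit.point;

            // On a Cardboard-style viewer, a tap registers as a click.
            if (Input.GetButtonDown("Fire1"))
            {
                playerRig.position = new Vector3(
                    hit.point.x, playerRig.position.y, hit.point.z);
            }
        }
        else
        {
            indicator.SetActive(false);
        }
    }
}
```

Keeping the rig's own y-position when teleporting preserves a consistent eye height across the uneven museum floor.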

Command Center Exhibit

The command center exhibit sits in the hallway after the text explaining the movement mechanic.

The information for the first exhibit.

There, the user meets the first exhibit, which explains the use of VR as a remote command center.

Inside the exhibit.
Other view.

Upon clicking the blue cube, they are taken to a different scene where a 3D presentation explains the idea behind the new technology.

Jail Exhibit

Learn about “escaping” jail from a jail cell.

The jail exhibit sits in a prison cell that appears to be in use.

Explaining VR use in prisons.

The exhibit has a poster explaining the technology and a Vice News video covering prison systems that are using this technology.

The user is presented with a blue cube that, when clicked, helps them “escape” out of the jail cell into a laundry room in a house.

The cell becomes a laundry room.

They can click the yellow cube to return to the cell.

Training Exhibit

The training exhibit explains how immersive VR scenarios can help police train proper and judicious responses to situations they may never encounter until the moment the training is needed.

The training exhibit is in the next cell over which has been emptied out.

The cell contains a display and a video explaining how VR is used to train police.

There is also a poster with a quick explanation of the technology.

PTSD Exhibit

The PTSD exhibit explains how VR can be used to treat post-traumatic stress disorder in soldiers and police.

Like the Training Exhibit, the PTSD Exhibit is in an emptied jail cell. It, too, has a display and video explaining how VR can treat PTSD.

There is also a poster with a quick introduction.

Crime Scene Exhibit

The last exhibit is in the interrogation room across the hall. This exhibit covers how VR can help prosecutors and other attorneys communicate evidence to jurors and judges.

Instead of relying on flat images and reports, attorneys can immerse jurors in the environment, so they can get a complete picture of what happened.

This exhibit has a video explaining how VR is used in the courtroom.

There is also a blue cube that, when clicked, takes the user to experience a mock crime scene.

Experience the crime scene like a juror.

Conclusions and Further Work

The Night at the Museum project was a great learning experience. The user-tested approach led to an improved product, and it was exciting to learn how VR is shaping existing industries.

The performance issues found during development were a great learning experience as well. Despite the impressive speeds of modern mobile devices, the performance gap between mobile and desktop is still large. Though things may “work” across platforms, mobile is often just not fast enough.

A lot of time and effort was spent lighting the project. The baked meshes proved difficult to light because of issues with their UV charts and normals; more time than expected was spent cleaning up these meshes rather than using them. Because of this, some of the exhibit interactions were not as exciting or well tuned as initially conceived.

Future work on this project would address this, build out more exhibits, and enhance the existing ones. Animation and interaction strategies that don’t destroy the frame rate should also be developed. Some of the exhibits feel a little static, and some movement in the scenes would help improve them.