Room-scale VR using Unity3d

Dario Laverde
11 min read · Nov 30, 2015

--

Even if you’re new to Unity3d you can get started with a room scale VR experience or port an existing seated or standing VR or 3D application. Specifically, we’ll be targeting the HTC Vive Developer Edition kit using Valve’s SteamVR plugin for Unity3d.

Why room-scale VR?

Presence. This is better experienced than explained, but when you can walk around a room with an HMD that supports six degrees of freedom (from crawling on the floor to jumping in the air), along with sub-millimeter-accurate controllers (from reaching behind your back to juggling objects in real time), your users will become immersed in your virtual world and, yes, experience presence.

From the setup guide for the HTC Vive Developer Edition— credit: Valve

Note: room-scale VR does not require a dedicated room! The HTC Vive can work in a 4m x 3m space as easily as in a 2m x 2m space, as well as in a seated or standing-only setup. If you're starting a new project, design for room-scale VR as the best experience first and then scale back to a seated experience as necessary.

But I don’t have a developer kit…

If you don't have an HTC Vive developer kit yet (which includes a room-scale lighthouse tracking system designed by Valve), you can still start room-scale development by using the Unity3d Editor's game view in place of an HMD during development. For positionally tracked controllers, substitute another device, e.g. the Project Tango Developer Tablet from Google, which provides positional tracking, though not as accurately as the lighthouse system (see http://github.com/vmohan7/Navi). You can also use the Tango Tablet as a positionally tracked HMD with a third-party cardboard-like tablet holder (http://durovis.com).

For VR developers already invested in other controllers, you could try mapping magnetic or camera-based tracking solutions, but your mileage may vary due to occlusion, FOV, and latency. It's very important to iterate on your design by testing as much as possible from the outset. Depending on your VR project, it could be critical to obtain or borrow a developer kit, or at least use the best available equivalent while developing.

using the default HTC Vive controller model (color modified) in a world built using Unity store assets (terrain, skybox)

Technically you could combine game-controller/keyboard/mouse shortcuts to try to move your virtual Vive controllers in space, but try throwing something while giving it a spin in 3d space: it takes far more time and effort to emulate six degrees of freedom with shortcut keys. Once again I'd recommend using a substitute controller, or at the very least move the controllers in 3d space with a mouse/touchpad.

Unity3d

Why Unity? It’s a popular choice for VR (as also seen with Oculus, Cardboard, Tango, etc.) but you could consider other third party engines/tools such as Unreal Engine 4: http://docs.unrealengine.com/latest/INT/Platforms/SteamVR

Additional 3d engines will support SteamVR and the HTC Vive over time. I've mentioned Unity (C#) and Unreal (C++). For Java developers there's now a version of the jMonkeyEngine that supports SteamVR: http://github.com/phr00t/jMonkeyVR

The SteamVR Unity plugin example scene

SteamVR/Scenes/example

If you're new to Unity, what you're seeing above is the following:

top left — the current scene with 3d objects, in this case a sea of cubes from the example scene included with the SteamVR Unity plugin.

top middle pane — the current scene’s hierarchical view of components (objects can be nested).

bottom middle pane — the project view, showing folders and the available assets and plugins. To open the example scene, go to the root folder of the SteamVR plugin you imported and click the "Scenes/example" scene (note how scenes have a specific icon). Note: you can toggle between a one and two column view inside this pane. The screenshots here use the one column view.

bottom left game pane — this is the game view used when testing locally in the editor (think main camera view) via the play button (near the top middle, alongside the pause button). What it shows depends on which camera we've enabled for running inside the editor, since we can't use the SteamVR camera (which requires an HMD) and the default Main Camera at the same time.

Note: don't edit while in Play mode unless that's your intent. Change the Play mode tint under the Colors settings so you'll know when you're in Play mode; otherwise you may lose edits, since changes aren't saved during Play mode.

Lastly, the right-most window is the Inspector, which displays the properties of the object currently selected in the scene; here you can enable/disable specific objects and edit their available properties.

I recommend familiarizing yourself with the editor user interface here:
https://unity3d.com/learn/tutorials/modules/beginner/editor/interface-overview

Then follow the basic steps to get a (non-VR) 3d game up and running: https://unity3d.com/learn/tutorials/projects/roll-ball-tutorial

If already familiar with Unity:

The SteamVR example scene above should help you get started quickly with your porting efforts. The included guide advises you to add the SteamVR_Camera script to the existing Camera object(s) to which you want to add VR support. This reorders the "Main Camera" within the hierarchy in a clever way to position the origin and control where your tracking volume lines up (the fact that you can do this reordering in a script opens up additional possibilities for coding towards the goal of a common code base — more on this later). From within the Inspector you can toggle this feature with the "Expand" / "Collapse" button; when enabled it defaults to "Expand" mode.

Toggling “Collapse”/”Expand” with the SteamVR_Camera script with an existing “Main Camera”
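
For example, if you'd rather add VR support to an existing camera from code instead of through the Inspector, a minimal sketch could look like this (only SteamVR_Camera comes from the plugin; the script name and startup check are my own):

```
using UnityEngine;

// Adds VR support to the scene's existing Main Camera at startup.
// A minimal sketch: attaching SteamVR_Camera triggers the same hierarchy
// reordering ("Expand") described above.
public class AddVRSupport : MonoBehaviour
{
    void Awake()
    {
        Camera existing = Camera.main; // the scene's "Main Camera"
        if (existing != null && existing.GetComponent<SteamVR_Camera>() == null)
        {
            existing.gameObject.AddComponent<SteamVR_Camera>();
        }
    }
}
```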

Let’s focus on starting a “New Unity Project” targeting a Windows build for testing with the Vive HMD (even if not developing on Windows).

Starting from empty with the SteamVR plugin imported (note how the editor UI layout can be modified)

The 4 steps to your first HTC Vive Room-Scale VR application

Step 1: import the SteamVR Plugin into your project

Go to Window->Asset Store (or http://assetstore.unity3d.com), search for “SteamVR plugin” and import it into your project as shown in the screenshot above. “Accept All” for the project settings.

Note: APIs are still subject to change, always check the latest SteamVR plugin release notes!

Step 2: add the CameraRig prefab to your scene

add a CameraRig to your scene and you’ll have the HTC Vive supported!

Drop the CameraRig SteamVR prefab into your new scene. I recommend adding some 3d objects to the scene, e.g. a 4m x 4m floor plane, assuming a play area of roughly that size to match the scene. For now we're ignoring the recommended practice of obtaining the play area dimensions and adjusting your scene's size to them accordingly. Basically, have something to walk around for your first room-scale experience.
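
If you'd rather create that floor from code than from the GameObject menu, here's a quick sketch (Unity's built-in Plane primitive is 10 x 10 units, so a scale of 0.4 gives roughly 4m x 4m; the script name is my own):

```
using UnityEngine;

// Creates a simple 4m x 4m floor to walk around on.
// Unity's Plane primitive is 10 x 10 units, so scale it by 0.4.
public class CreateFloor : MonoBehaviour
{
    void Start()
    {
        GameObject floor = GameObject.CreatePrimitive(PrimitiveType.Plane);
        floor.name = "Floor";
        floor.transform.position = Vector3.zero;
        floor.transform.localScale = new Vector3(0.4f, 1f, 0.4f);
    }
}
```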

Now technically you’ll have two cameras including the default one. To keep it simple, we’ll enable or disable the appropriate camera for testing without an HMD vs building for the HTC Vive which requires a Windows build target.

Step 3: disable “Main Camera” (if you’d like you can delete it)

enable/disable the MainCamera of interest

File -> Build Settings

Step 4: build settings and run

After setting the File -> Build Settings shown above (including the current scene) you can now "Build And Run":

At this point if you have a developer kit you’ll see the controllers in the game view. If you don’t have a kit you won’t see anything (we’ll fix that).

Note: We could determine the target at run time (generally by enabling/disabling or reordering components at runtime). Additionally, since C# supports reflection, we could reduce dependencies at run time and avoid having separate projects with separate assets for different targets. For now we'll table that as the subject of a separate post and focus on a simple approach to enable in-editor testing plus testing with an actual developer kit.
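
As a taste of that runtime approach, here's a minimal sketch that picks the camera at startup. It assumes SteamVR.instance returns null when no HMD is available (check the plugin source for the exact behavior in your release); everything else is plain Unity and the script name is my own:

```
using UnityEngine;

// A sketch of runtime target selection: enable the VR rig when SteamVR
// initializes, otherwise fall back to the ordinary Main Camera.
public class CameraSelector : MonoBehaviour
{
    public GameObject cameraRig;  // the CameraRig prefab instance
    public GameObject mainCamera; // the default (non-VR) Main Camera

    void Awake()
    {
        // Assumption: SteamVR.instance is null when no HMD is present.
        bool vrAvailable = SteamVR.instance != null;
        if (cameraRig != null) cameraRig.SetActive(vrAvailable);
        if (mainCamera != null) mainCamera.SetActive(!vrAvailable);
    }
}
```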

The Tracked Controllers

Back in the SteamVR sample scene we found a list of tracked devices grouped as "Tracked Devices" (see below). Previously this was included in the CameraRig prefab. The recommendation was to remove it, as it displayed all tracked devices including the lighthouse base stations (it was only intended to be included in the sample scene). Although useful for rendering tracked devices out of the box, it didn't guarantee which Device# corresponded to a specific device: you needed to check the tracked device type to avoid displaying a model of a base station instead of your controller if you relied only on the Device# object name.

The SteamVR_TrackedObject.cs script handles the tracking, while SteamVR_RenderModel.cs renders the correct model for the corresponding tracked object. If you stick to the prefab, you'll get both left and right controllers identified correctly for you. Take a look at the CameraRig's Controller (left) or Controller (right) below and you'll see these scripts.
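
Once those two scripts are in place, reading input from a controller follows a simple pattern. Here's a hedged sketch; the SteamVR_Controller helper and ButtonMask names come from the plugin (verify them against your plugin version), while the script itself is my own:

```
using UnityEngine;

// Reads input from a tracked controller. Attach next to SteamVR_TrackedObject
// on Controller (left) or Controller (right).
[RequireComponent(typeof(SteamVR_TrackedObject))]
public class ControllerInput : MonoBehaviour
{
    private SteamVR_TrackedObject trackedObj;

    void Awake()
    {
        trackedObj = GetComponent<SteamVR_TrackedObject>();
    }

    void Update()
    {
        // No valid device index yet (e.g. running without a kit).
        if (trackedObj.index == SteamVR_TrackedObject.EIndex.None)
            return;

        var device = SteamVR_Controller.Input((int)trackedObj.index);
        if (device.GetPressDown(SteamVR_Controller.ButtonMask.Trigger))
        {
            Debug.Log(name + ": trigger pressed");
        }
    }
}
```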

How would you add your own model and where exactly is the default model for the controllers defined?

As it turns out, there are several models available within your Steam client folder hierarchy: steamapps/common/SteamVR/resources/rendermodels

The current controller model to look at is vr_controller_05_wireless_b (but please note this is subject to change). If you only want the default controller model, don't do this: SteamVR will take care of rendering the latest and greatest for you. If you'd like to use it temporarily as a template to help you create your own controller models, you can copy it into your project's assets and use it in your scene. To avoid rendering the model twice, disable the original rendering script (do the same if you don't want the default shown at all because you're replacing it with your own model, e.g. a "hand" model):

disable SteamVR_RenderModel (and any mesh rendering if you added any) if adding your own model

Now you can drag and drop the model of your choosing onto the CameraRig's Controller (left and/or right) objects in your scene view and move the Controller (left in this case) into a position where you can see it, with similar transform values for the right controller (with the x position at 0.2).

This is just to give the controllers a more visible starting point; obviously these values will be overridden by the tracked values, so you'll need to update them yourself if you don't have the lighthouse system. For this specific model (if you're using your own models, adjust accordingly) you'll need to rotate it 180 degrees around the y axis as shown:
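
If you prefer to apply that starting pose from a script rather than the Inspector (handy when iterating without tracking hardware), here's a minimal sketch. The 0.2 x offset and the 180-degree y rotation follow the text above; the height and forward offsets are placeholder guesses and the script name is my own:

```
using UnityEngine;

// Gives a controller object a visible starting pose for in-editor testing.
// With a lighthouse system the tracked pose overrides these values.
public class ControllerStartPose : MonoBehaviour
{
    public float xOffset = 0.2f; // use -0.2f for the left controller

    void Start()
    {
        transform.localPosition = new Vector3(xOffset, 1.2f, 0.5f); // height/forward are guesses
        transform.localEulerAngles = new Vector3(0f, 180f, 0f);     // flip the model to face you
    }
}
```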

And at the very least change the color of your controller:

Again the default models are sort of hidden for a reason (and definitely subject to change — recently button animation support was added). Use these as a template for your own models. If you’re satisfied with the default model (which in some cases could make better sense) and only need to anchor additional objects then don’t disable the script or copy the default models into your project.

One last thing for those without a kit who don't want the controllers to disappear in Play mode: you may notice there's a runtime check for the controllers that disables them from being rendered. You can temporarily work around this (updates to the SteamVR plugin will overwrite it) by editing the SteamVR_ControllerManager.cs script, adding to the left and right controller null checks in the OnEnable method:
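
The exact body of OnEnable differs between plugin releases, so treat the following as a sketch of the idea rather than an exact patch: keep the left and right controller GameObjects active instead of letting them be deactivated until a device connects (the keepControllersVisible flag is a hypothetical addition of mine):

```
// Inside SteamVR_ControllerManager.cs (sketch only; this edit is lost on plugin updates)
public bool keepControllersVisible = true; // hypothetical flag for in-editor testing

void OnEnable()
{
    // Originally these SetActive(false) until a tracked device connects.
    if (left != null)
        left.SetActive(keepControllersVisible);

    if (right != null)
        right.SetActive(keepControllersVisible);

    // ...leave the rest of the original OnEnable (device-connected callbacks) as-is
}
```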

At this point the project's code can remain the same for testing with or without a developer kit. But for testing in the Unity editor only, you'll also need some scripting to navigate (and move the controllers) in the game view. I'll leave this as an exercise in moving the HMD and controllers via mouse/keyboard (hint: look at the Standard Assets utilities or first person character scripts).
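
As a starting point for that exercise, here's a minimal sketch that nudges whatever object it's attached to (a controller, or the camera rig) around with the keyboard during Play mode. The key bindings and script name are arbitrary choices of mine:

```
using UnityEngine;

// In-editor stand-in for tracking: move this object with WASD/QE and
// turn it with the arrow keys. A development aid only; disable it when
// a tracked device is present.
public class EditorMover : MonoBehaviour
{
    public float moveSpeed = 1.0f;  // meters per second
    public float turnSpeed = 90.0f; // degrees per second

    void Update()
    {
        Vector3 move = Vector3.zero;
        if (Input.GetKey(KeyCode.W)) move += Vector3.forward;
        if (Input.GetKey(KeyCode.S)) move += Vector3.back;
        if (Input.GetKey(KeyCode.A)) move += Vector3.left;
        if (Input.GetKey(KeyCode.D)) move += Vector3.right;
        if (Input.GetKey(KeyCode.Q)) move += Vector3.down;
        if (Input.GetKey(KeyCode.E)) move += Vector3.up;
        transform.Translate(move * moveSpeed * Time.deltaTime, Space.Self);

        float turn = 0f;
        if (Input.GetKey(KeyCode.LeftArrow)) turn = -1f;
        if (Input.GetKey(KeyCode.RightArrow)) turn = 1f;
        transform.Rotate(0f, turn * turnSpeed * Time.deltaTime, 0f);
    }
}
```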

Now for some room-scale VR best practices

RULE #1 — Don't optimize afterwards; maintain 90fps at all times! In the editor you may max out at 60fps if you're not on a target PC (for example on a Mac laptop), so be sure to test on your intended target PC! This isn't a room-scale-specific rule, but "don't drop frames" is the one rule you should never forget even if you ignore all the others. Drop a frames-per-second display into your scene early in development.
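
Here's a bare-bones frames-per-second readout you can drop on any object in the scene (a sketch of my own; swap in whatever profiling tools you prefer as the project grows):

```
using UnityEngine;

// Minimal FPS counter so dropped frames are visible from day one.
public class FpsDisplay : MonoBehaviour
{
    private float smoothedDelta;

    void Update()
    {
        // Exponentially smooth the frame time so the readout stays stable.
        smoothedDelta = Mathf.Lerp(smoothedDelta, Time.unscaledDeltaTime, 0.1f);
    }

    void OnGUI()
    {
        float fps = smoothedDelta > 0f ? 1f / smoothedDelta : 0f;
        GUI.Label(new Rect(10, 10, 200, 25), string.Format("{0:0.0} fps", fps));
    }
}
```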

Don’t move the first person character/camera:

That means no scripted (button-triggered) head bob, strafing, or jumping, and no collisions (i.e. you don't collide with objects; objects can collide with you) — basically no virtual "physics" on the first person character / main camera. This may sound odd given all existing non-VR 3d games, and even when testing in Play mode within the Unity editor, but with a room-scale VR HMD you should allow the user to explore through walls and even walk through objects in the scene (or move the objects out of the way instead). If possible, try to match the scene's borders to the play area size and let the room's boundaries provide the game's virtual boundaries for limiting user movement.

Moving the user breaks presence. There are exceptions of course (e.g. falling to one’s death to end a scene) but they should be done briefly and carefully with plenty of user testing.

Don't transition the scene with anything other than a fade.

There are exceptions here too, but the emphasis is on minimizing breaks in presence and especially on preventing causes of simulation sickness. If you want to break this rule, make sure to play-test it with many users.

There are a lot more VR design best practices, particularly with regard to simulation sickness. I’m just pointing out a few as we get our first room-scale experience up and running.

Be innovative with teleporting.

At some point (unless the experience is confined to the play area space) you may want to provide a way to move beyond the room’s borders.

Teleporting techniques can include existing pre-room-scale VR techniques, but bear in mind the previous rules, such as not moving the player. For example, use a fade with teleportation transitions instead of picking the player up and flying them to the new position (unless you lift the room itself, like a magic carpet ride or a hovercraft vehicle).
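
A fade-based teleport can be as simple as the sketch below: fade out, move the play area (the CameraRig, never the head camera directly), then fade back in. It assumes the plugin's SteamVR_Fade component is on the eye camera so that SteamVR_Fade.Start has something to render; verify both against your plugin version, and note the script name is my own:

```
using UnityEngine;
using System.Collections;

// Fade out, move the whole rig, fade back in.
public class FadeTeleport : MonoBehaviour
{
    public Transform cameraRig;       // the CameraRig in the scene
    public float fadeDuration = 0.2f; // seconds

    public void TeleportTo(Vector3 destination)
    {
        StartCoroutine(DoTeleport(destination));
    }

    private IEnumerator DoTeleport(Vector3 destination)
    {
        SteamVR_Fade.Start(Color.black, fadeDuration); // fade to black
        yield return new WaitForSeconds(fadeDuration);

        cameraRig.position = destination;              // move the play area, not the head

        SteamVR_Fade.Start(Color.clear, fadeDuration); // fade back in
    }
}
```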

Generally, try to think of new ways to teleport (elevators, laser pointing, moving doorways, etc.). Some clever experiments, such as moving the world ever so subtly (while avoiding sickness) to trick the player into walking in a circle while believing they are walking in a straight line, may fail because you'd need a much larger than 5m x 5m space to pull that one off — but do experiment!

SteamVR Unity “Extras”

In addition to the example scene in the imported SteamVR assets, you'll find two additional scenes and several useful scripts under the Extras folder:

SteamVR_TestThrow
SteamVR_TestIK

The additional scripts like SteamVR_TrackedController.cs show button callbacks, while SteamVR_LaserPointer.cs, SteamVR_Teleporter.cs, and SteamVR_TestThrow.cs do what you’d expect them to with the latter used as part of the SteamVR_TestThrow scene. The SteamVR_TestIK scene illustrates Inverse Kinematics.
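
For example, hooking into the button callbacks from SteamVR_TrackedController looks roughly like the sketch below. The event names (TriggerClicked, PadClicked) follow the extras script, but verify them against your plugin version; the script itself is my own:

```
using UnityEngine;

// Subscribes to button events from the extras' SteamVR_TrackedController.
// Attach next to SteamVR_TrackedController on a controller object.
[RequireComponent(typeof(SteamVR_TrackedController))]
public class ControllerEvents : MonoBehaviour
{
    private SteamVR_TrackedController controller;

    void OnEnable()
    {
        controller = GetComponent<SteamVR_TrackedController>();
        controller.TriggerClicked += OnTriggerClicked;
        controller.PadClicked += OnPadClicked;
    }

    void OnDisable()
    {
        controller.TriggerClicked -= OnTriggerClicked;
        controller.PadClicked -= OnPadClicked;
    }

    private void OnTriggerClicked(object sender, ClickedEventArgs e)
    {
        Debug.Log("Trigger clicked on device " + e.controllerIndex);
    }

    private void OnPadClicked(object sender, ClickedEventArgs e)
    {
        Debug.Log("Touchpad clicked at " + e.padX + ", " + e.padY);
    }
}
```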

The SteamVR Unity plugin is free and already available to start exploring room-scale VR. You can get started developing for the HTC Vive now even if you are waiting to access a developer kit or waiting for general availability.

--

Dario Laverde

VR/mobile developer, community leader and director of developer relations at htc