This blog was originally written in late 2016 but never published. Much has changed since then (we haven't done any GearVR development since), but I believe the content is still relevant and useful, so I thought I'd publish it now; maybe it'll help you out! The code is available here.
Making GearVR Development Less Painful With Vive
In the Applied Innovation team at Kainos we're always looking for promising new tech to experiment with: things that other companies perhaps aren't looking at or aren't taking seriously. For a while now one of our main focus areas has been non-games VR. It has amazing possibilities and it's still early days; we're in the 'wild west' era of VR development, with lots of mistakes to learn from, exciting projects being worked on, and many development challenges we don't yet know the best way to approach.
We've been experimenting quite a bit with the GearVR recently. Mobile VR solves a lot of the issues we have with using something like a Vive or Rift as a demoing platform: it's light, easy to transport, and doesn't require a powerful PC. It's inferior in many ways, of course, one of the most notable being the lack of positional tracking; but that doesn't matter much when the alternative is carrying a desktop or chunky laptop around with you, not to mention all the other kit the desktop headsets need.
Development for the GearVR is pretty straightforward if you're familiar with Unity, but you'll quickly run into a problem: testing changes inside the headset is a massive pain. Building and deploying to the Android device can take up to a few minutes, and you then have to insert the phone into the GearVR, which can be awkward (you get good at it quickly, but it's still an extra step, and I've sent my phone flying plenty of times). You can use developer mode for some things, of course, but with VR I find you need to actually put the headset on to get a feel for most changes.
To me, this stood in stark contrast to developing for the HTC Vive, where trying out a change is just a matter of hitting the run button in Unity and sticking the headset on your face; it's very quick and doesn't interrupt your development at all. Another great aspect of Vive development is that you can move things around and tweak properties and settings while the application is running in Unity. This is really valuable for VR, where the user experience can be completely ruined if something doesn't feel right: you can try lots of different positioning combinations and other variables while the app is in play mode, then save the best values and apply them. Being able to do this reduces development time considerably, and it's something you simply cannot do with the GearVR. But wouldn't it be nice if you could?
As it turns out, you can, and it's not all that difficult to do! When we were thinking about using the Vive for GearVR development, we came up with a few things we would need to do first:
- Restrict the Vive's movement to match the GearVR's tracking capabilities.
- Come up with a way to quickly switch Vive-specific GameObjects for GearVR-specific ones.
- Emulate the GearVR's input via the Vive controller.
Quickly switching platform-specific GameObjects was easy: just import them all into Unity as normal, then enable the ones you need and disable the ones you don't. We made a script for this.
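In sketch form, that script is little more than flipping SetActive on two arrays. The class, enum, and field names below are illustrative, not copied from our repo:

```csharp
using UnityEngine;

// Illustrative sketch of a platform-switching script: enables the
// GameObjects for the selected platform and disables the rest.
public class PlatformSwitch : MonoBehaviour
{
    public enum Platform { HtcVive, GearVr }

    public Platform platform = Platform.HtcVive;
    public GameObject[] htcViveObjects;
    public GameObject[] gearVrObjects;

    void Awake()
    {
        bool useVive = platform == Platform.HtcVive;
        foreach (GameObject obj in htcViveObjects)
            obj.SetActive(useVive);
        foreach (GameObject obj in gearVrObjects)
            obj.SetActive(!useVive);
    }
}
```

Because the enum is a public field, it also shows up as a drop-down in the Inspector, which is how you flip between the two modes.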
The Vive, of course, has full freedom of movement: it can track you as you walk around a room, whereas the GearVR only tracks rotation in place, so we needed to restrict it. We did this with some simple maths, displacing the Vive camera rig every frame so that the player always stays in the same position.

Finally, we used Microsoft's mouse input functions (through this library) to simulate mouse presses and mouse position from the trigger and touchpad on the Vive controller. This was convenient because the GearVR touchpad maps directly to normal mouse left clicks and movement, so we could code all our inputs against the mouse as usual, with no messing around with separate inputs for the Vive.
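The restriction maths really is simple. A minimal sketch of the idea, assuming the script sits on the SteamVR rig (field names here are illustrative):

```csharp
using UnityEngine;

// Illustrative sketch of the movement restriction: every frame, shift
// the whole Vive rig by the offset between the HMD and a fixed anchor,
// so the head stays pinned at the anchor while rotation is untouched.
public class ViveMovementRestrictor : MonoBehaviour
{
    public Transform playerCamera;      // the tracked HMD, e.g. Camera (eye)
    public Transform cameraPositionObj; // the fixed anchor in the scene

    void LateUpdate()
    {
        // Cancel out positional tracking: displace the rig so the
        // camera's world position always matches the anchor's.
        Vector3 offset = cameraPositionObj.position - playerCamera.position;
        transform.position += offset;
    }
}
```

Doing this in LateUpdate, after tracking has updated the camera for the frame, means the correction is applied before anything is rendered.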
If this sounds useful, the code we used is available on our GitHub, but you'll need a few things set up first. If you're not already developing for the GearVR in Unity, I highly recommend following the setup guide on oculus.com; make sure you have the Android SDK and the Java JDK installed and their locations set in Unity's preferences. If you follow the setup guide you'll end up with an OVR folder in your Unity assets folder; that's all you need on the GearVR side, but you'll also need the SteamVR plugin from the Unity Asset Store. I'm not going to write an entire tutorial on Vive and GearVR development, but you do need to know the basics of both (and Unity, obviously), so I recommend a read through the Oculus documentation. The SteamVR documentation isn't great, but it's fairly easy to figure out the basics by looking at the prefabs and scripts included in the SteamVR plugin.
Now that you have the APIs set up, you can download the Unity assets for the Vive-to-GearVR project (via GitHub). Unzip the folder and copy everything into the assets folder of your Unity project. In the prefabs folder you'll see an object called 'HMDPositionAndPlatformSwitch'. This positions the GearVR and Vive player cameras, acts as the reference point the Vive is restricted to, and contains the script for switching between Vive and GearVR modes. Place it wherever you want in the scene.
You'll also want player objects for the GearVR and the Vive. The SteamVR CameraRig prefab is perfect as a Vive player object; for the GearVR you can simply use a plain old Unity camera. You could use the 'OVRPlayerController' prefab found in the OVR folder, but remember that it has a collider attached by default which may cause issues (I found it interfered with walls in a small scene; since I didn't need the collider or any of the other 'OVRPlayerController' functionality, I just used a basic camera instead).
With the 'HMDPositionAndPlatformSwitch' prefab in the scene, you'll notice two arrays in the script attached to it. These are fairly self-explanatory: drag any Vive-specific objects into the 'Htc Vive Objects' array and any GearVR-specific objects into the 'Gear Vr Objects' array. For the Vive this will include the '[SteamVR]' object and your Vive player object; for the GearVR I only have the player camera. You can now switch the relevant objects on and off using the 'Platform' drop-down on the 'HMDPositionAndPlatformSwitch' object; remember to set it to Vive when running in Unity and to GearVR when building for the GearVR.
We're almost done. Drag the 'ViveMovementRestrictor' script onto your Vive player object, then drag your player's Camera object onto the 'Player Camera' slot in the 'Vive Movement Restrictor' script (this is 'Camera (eye)' in the SteamVR CameraRig prefab, if you're using that). Next, drag the 'HMDPositionAndPlatformSwitch' object into the 'Camera Position Obj' field of the same script. The script does what it says on the tin: it prevents the Vive from moving positionally, allowing only rotation, just like the GearVR.
Finally, attach the 'ViveToGearVRInputs' script to the Vive player object and assign both Vive controllers to it. The script uses the Vive controller's trigger to emulate a mouse click (left mouse button, or mouse 0 in Unity) and the touchpad to control the position of the mouse. Note that when running the game in a window in Unity, the mouse position tracking will be off; use the fullscreen option for better mouse tracking with the touchpad. You may also want to pay attention to where your mouse cursor is: if you click the play button in Unity and then pull the trigger on the Vive controller (without moving the cursor), it's going to stop the application. Seems obvious, but it might confuse you the first few times!
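As a rough sketch of how that emulation can work on Windows (this assumes the 2016-era SteamVR plugin's 'SteamVR_Controller' API and the standard user32.dll mouse functions; treat the names as illustrative rather than a copy of our script):

```csharp
using System.Runtime.InteropServices;
using UnityEngine;

// Illustrative sketch of GearVR input emulation on Windows: the Vive
// trigger fires an OS-level left click and the touchpad drives the
// cursor, so gaze/tap code written against the mouse works unchanged.
// Assumes the pre-2018 SteamVR plugin (SteamVR_Controller API).
public class ViveToGearVRInputs : MonoBehaviour
{
    [DllImport("user32.dll")]
    static extern bool SetCursorPos(int x, int y);

    [DllImport("user32.dll")]
    static extern void mouse_event(int flags, int dx, int dy, int data, int extraInfo);

    const int MOUSEEVENTF_LEFTDOWN = 0x02;
    const int MOUSEEVENTF_LEFTUP = 0x04;

    public SteamVR_TrackedObject controller;

    void Update()
    {
        var device = SteamVR_Controller.Input((int)controller.index);

        // Trigger press/release -> OS-level left mouse button down/up.
        if (device.GetPressDown(SteamVR_Controller.ButtonMask.Trigger))
            mouse_event(MOUSEEVENTF_LEFTDOWN, 0, 0, 0, 0);
        if (device.GetPressUp(SteamVR_Controller.ButtonMask.Trigger))
            mouse_event(MOUSEEVENTF_LEFTUP, 0, 0, 0, 0);

        // Touchpad position (-1..1 on both axes) -> screen coordinates.
        if (device.GetTouch(SteamVR_Controller.ButtonMask.Touchpad))
        {
            Vector2 pad = device.GetAxis(); // defaults to the touchpad axis
            int x = (int)((pad.x * 0.5f + 0.5f) * Screen.width);
            int y = (int)((1f - (pad.y * 0.5f + 0.5f)) * Screen.height);
            SetCursorPos(x, y);
        }
    }
}
```

Because SetCursorPos works in screen coordinates while Screen.width/height describe the game view, the mapping only lines up properly in fullscreen, which is the windowed-mode offset mentioned above.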
Developing apps and experiences for VR right now is an interesting challenge, and one that requires a lot of trial and error before you start to get things right. That process is a pain when developing for the GearVR, with its build times and the time spent wrestling your phone into the HMD, but hopefully our approach of using the Vive as a quick testing platform will help those of you lucky enough to have one available. Thanks for reading, and if you have any questions don't hesitate to drop us an email at rnd.developer@kainos.com.
