Virtual Reality Tips for Engineers and Developers
Virtual Reality is trending, and naturally there is a lot of information about it all over the web. The articles mostly fall into two categories: scientific/engineering docs from Oculus, Unity, Unreal Engine, Samsung Gear VR and Google Cardboard, and developer/studio postmortems, either in the spirit of "it was hard, but we found a way" (How devs deal with 4 problem areas in VR game design, Making Great VR: Six Lessons Learned From I Expect You To Die) or "it's going to be cool when done properly" (Game UI Discoveries: What Players Want). I've also found a good practical book, Learning Virtual Reality by Tony Parisi, and I'm sure there must be more. But mostly it's still theoretical info.
I have been working on VR R&D projects for the last several months. We've made demos for Oculus Rift on PC, Google Cardboard on iOS and Samsung Gear VR on Android with Unity and C#, used Unreal Engine and C++ for Oculus Rift, and worked on a kind of AAA game. This article is going to be pretty twisted, in the best R&D traditions. I've parsed tons of the info mentioned above, found a lot of practical, useful tips in tutorials, answers, forums, blogs and so on, and I'm going to gather and share it all here. The info is collected from my work_done and to_read lists, so I guess nothing is going to be missed.
Google Cardboard (iOS)
First it was Cardboard with Unity; even after the Oculus DK1 it feels kind of low tech. The iOS SDK is a port of the Android one, and on top of that the linker on iOS found a texture to choke on. I tracked the problem down by iteratively splitting the project in two, because the error was unique and Google turned up nothing. Anyway, it's a pretty good way to jump in, hardcore style:
- Make a VR Game With Unity and Google Cardboard — not entirely relevant, but all clear, and it covers the whole process of making a small game for Cardboard, which is limited to a single magnetic button
- Unity's UI System in VR — a mixed but more practical than theoretical article about what Gaze input is, how to use raycasts and canvases to put UI naturally into 3D space rather than onto the screen, how to integrate it with Oculus Utilities (OVR for Unity), a bit of diegetic and spatial interface basics, and the first step to getting Gaze input with a reticle and the OVR Player Controller working together
- User Interfaces for VR — also a pretty mixed article; it describes practical topics like text antialiasing and field of view, with lots of theory in addition. It's the first step to understanding that users get dizzy from nearly everything in VR
- Cardboard's Unity Developer Guide — the software developer's guide from Google; totally practical, naturally, nothing to add
- Cardboard's Unity Plugin Reference — at this step I realized that Cardboard, and VR in general, is at this stage much more a software project than a hardware one
- How to obtain a Glow Effect for iOS? — a lot of tasks in VR are the same as in traditional PC or mobile development. This one is a plain shader that highlights geometry on iOS, used to implement Gaze input interaction
- Find Unused Assets In Project — also a general topic, and pretty useful when assets are loaded dynamically or you've got the project from other devs who are not too available to ask. A practical how-to is here: Automatically locate all unused unity assets.
Oculus Rift (PC)
When the Oculus DK2 appeared it became much easier to navigate in VR, and naturally the tasks became more complicated. We used both Unity and Unreal Engine, to compare:
- First Person Shooter C++ Tutorial — it might look surprising, but if you want to take proper control of your camera in VR Preview in Unreal Engine, you'd better start from here. There is no integration package like in Unity; support for the Oculus Rift, Gear VR and HTC Vive comes as plugins deeply integrated into the engine. I wish I had known that earlier, like a lot of other tips here %)
- Oculus Rift Quick Start — Unreal Engine is very well, not to say exhaustively, documented. Everything is nicely described in detail. For me it's easier than in Unity :)
- Matinee Basics: Creating Your First Matinee Sequence — you'll want to show your work to everybody, and nowadays most people don't have headsets. The article teaches how to get video out of your demo, and, oops, the Matinee camera doesn't support VR, at least in UE 4.11 Preview 7. For Unity, Camera Path from the Asset Store is a very good analog, but a paid one.
- Developer Oculus Downloads — SDK, Runtime and engine integrations for Unity. I'd recommend reviewing nearly everything there before going on with the project. Oculus Utilities for Unity 5, with the OVR Player Controller and OVR Camera Rig, are the best at this stage. Maybe later on they'll be integrated into Unity itself, like the Virtual Reality support checkbox that appeared in Unity 5.3.
- Unity VR Samples — completely useful to learn from, and I used the Gaze/Reticle from there. From the Oculus and Unity samples combined I've mixed my own custom OVRi library. I'm going to clean it up and share the source if I have some time soon.
- How to record and stream DK2 footage — I'm skipping the in-between steps; just keep in mind that nearly everything is different in VR, at least with Oculus VR. Fraps doesn't work with it, but the headset is gaming-friendly, so the Twitch-oriented Open Broadcaster Software works out
- XInput: how do I access vibration on 360 controller? — you'll have to deal with the Xbox controller that is going to be included with the Oculus Crescent Bay release hardware. XInput is your friend, but be aware of wrappers: Xbox 360 Controller Emulator (x360ce) works, but only with certain kinds of controllers. Force feedback is experimental and comes as a C++ plugin in Unity 5.3. The classic Joystick looks more suitable for controller testing.
- The Obj/Temp folders in a Unity project may cause FPS to drop below the 75 FPS required for the Oculus DK2, just while testing or updating a scene; try deleting them if you experience an unexpected performance drop (it looks like it's safe)
- Getting Euler (Tait-Bryan) Angles from Quaternion representation — different hardware and software represent and calculate 3D math to suit their own needs, so you'll have to get familiar with the conversions
- Rendering Techniques — a lot of limitations in VR are caused by physics: Normal Mapping vs. Parallax Mapping, for example, or the fact that far and close objects can't be in focus at the same moment
- Razer Hydra is still the best default controller before the Vive launch, if, like me, you haven't managed to get Oculus Touch or PS Move with PlayStation VR. On the one hand it's old, and the official Unity plugin targets the Oculus DK1 and doesn't work with the DK2 out of the box; on the other hand it's very nice and feels kind of natural to a gamer. We'll see very soon, with the Vive's launch, how this is all intended to work :)
- Simulator Sickness — last but not least about VR: a lot of people are affected by different types of sickness. My own observation is that the more tense a person is, the more likely he/she is to get dizzy. Developers who spend months in headsets can't even imagine it, so you'd better watch out.
Samsung GearVR (Android)
Samsung Gear VR is a released product, so there is not too much to research:
- Samsung Gear VR Best Practices — keep in mind it's powerful, but still a 32-bit mobile device without a huge, wired, power-hungry GPU. It doesn't have a controller by default, only a touchpad on the side, which is better than Cardboard's button, but still
- Mobile VR Application Development — it's the same Oculus SDK, if you don't use GearVRf, and it has the same integration with Unity (not tested with Unreal Engine), so you can reuse your Oculus code and port apps/games pretty fast
- Samsung Gear VR — Detect Tap/Swipe — a simple, easy and practical way to wire up your OVR Player Controller or any other navigation/interface solution.
After all that, something unsorted, scientific and kind of funny: a 20-year-old study on VR, Randy Pausch's study of Disney's Aladdin Magic Carpet VR Adventure.
I hope it helps, saves some time for my colleague engineers and developers, and supports the innovations.
Let's look at the tech represented above as at the first iPhone generation, or even the prototype stage. Just unleash your imagination: can you tell what will be achieved with the 2nd, 3rd or 7th generation of VR/AR/MR?
The original article @ LinkedIn