Large-scale VR experiences are rare beasts. The main blocker for them is cables; you have to be tethered down. There are some solutions, perhaps most notably the Vive Wireless Adapter. Some companies have gone even further and built all the necessary hardware into the headset, like the Oculus Quest or the more recently released Vive Focus. However, wireless adapters introduce latency, which can contribute to motion sickness, and the all-in-one solutions can’t offer the same level of detail that full desktop systems can.
Recently, the Kainos Applied Innovation team collaborated with local comic artist P.J. Holden, known for illustrating Judge Dredd and 2000 AD and for his work with Warhammer, to create an immersive VR experience as part of the NI Science Festival. After throwing a few ideas around, we landed on DragonSlumber: a VR adventure around a wizard’s tower, where you had to bring Lottie, the fairy companion, safely back home without waking the dragons. My colleague Jake has written a blog describing it, and I’d encourage you to check it out. We had a large space to work with — roughly 15m x 12m — and wanted to maximise the use of it to make the experience as immersive as possible. We also wanted two people going through the experience at a time, which created yet another challenge. Obviously, traditional VR solutions couldn’t use all of this space, and the existing answers to this problem weren’t going to cut it.
What are our choices?
In the team, we already have good experience with VR solutions, but this was a unique challenge that required some more in-depth research. We were already familiar with tools like the Vive wireless adapter and all-in-one headsets; neither seemed like an ideal solution, but each had its own advantages and drawbacks. While looking into this, we came across backpack PCs — which are pretty much what they say on the tin — and VR headsets with “inside-out tracking”. These have the advantage of not needing lighthouses, at the cost of tracking precision. All of this left us with three options:
- Use a Vive with a wireless adapter and multiple lighthouses to cover as much of the area as possible, accepting a few blind spots.
- Use an all-in-one VR headset to save us having to bring PCs.
- Use backpack PCs and find a suitable VR Head-Mounted Display (HMD) with inside-out tracking.
The first option seemed the least desirable. Based on our experience with the wireless adapter, there was no guarantee it would give us the range we needed, and we also couldn’t guarantee good coverage with the lighthouses.
The all-in-one VR headsets sounded like a good idea; however, we would have had to take a performance hit, and low FPS (Frames Per Second, a common performance metric) can cause motion sickness. Couple this with the fact that not every all-in-one can even provide six degrees of freedom (commonly abbreviated to 6DoF), and we quickly ruled it out.
That left us with the final option: backpack PCs and an HMD with inside-out tracking. Backpack PCs were powerful enough to drive the experience without issues, and we already had a good bit of experience with them in the team, but they came with their own drawbacks, most notably battery life. Inside-out HMDs, meanwhile, provide less accurate tracking but remove the need for lighthouses.
So, what did we do?
We used HP Z VR backpacks as our drivers for the experience. These came with everything that we needed pre-packaged, that being:
- The wearable backpack mount;
- A pair of batteries and a charging dock for them;
- A dock for the PC;
- The PC itself along with all necessary cables and power supply.
So all we needed to do was set up and we were ready to rock. These PCs pack a punch as well: an Intel Core i7-7820HQ, 32GB of RAM and an NVIDIA Quadro P5200. They also have hot-swappable battery slots on the backpack mount to keep them going, which proved to be… interesting, to say the least. More on that later on.
We used Samsung Odysseys as our VR HMDs. These have an advantage over HMDs like the Vive or Oculus Rift in that they don’t require lighthouses; all the tracking hardware is contained within the headset. This, combined with a portable backpack that isn’t tied down, makes for a VR experience that can use all the available space and is limited only by battery longevity and charge times.
Pros and Cons
The clear advantage, as I mentioned, is that the experience is limited only by the available space. However, Windows Mixed Reality (WMR) HMDs are limited to a roughly 3m x 3m boundary. Despite this, WMR lets you walk outside the boundary and still maintain 6DoF tracking, so in practice the boundary is as big as your space.
The main drawback of this solution is that, once you lose tracking outside the boundary, it can be hard to regain, depending on your environment and lighting and on where within the environment you actually are. On top of this, there’s the problem of battery management: once one battery dies, performance drops, and the resulting loss of FPS could cause motion sickness.
How did we solve these problems?
The short answer: a LOT of testing. The long answer is as follows:
As I mentioned, we had a roughly 15m x 12m space to work with (and it wasn’t perfectly rectangular, either), and we spent a lot of time testing the tracking to find what made it work best. The space wasn’t perfectly lit, so we placed objects (mostly chairs) in the darker spots, and made some random shapes out of cardboard and scattered them around the space as well. We then put cardboard sheets against the walls to help maintain tracking regardless of where you were standing and looking in the space, which helped a lot. For the really dark spots (mainly the far corners), we used lightboxes to illuminate the area a bit more so it was easier for the headset to track.
All of the above comes down to how tracking works on the Odyssey. Put simply, it uses cameras to scan the area and determine where it is. That’s much harder if the headset is sitting in the middle of a big empty space with nothing distinct to anchor to. By adding the extra objects, we created multiple unique shapes that let the HMD recognise where it was, making the tracking as smooth as possible. There was a noticeable improvement in tracking consistency with these objects and the lights in place. Even when tracking did drop, the vast majority of the time you just had to stand still and look towards the boundaries and you were back in business.
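To make that intuition concrete, here’s a minimal, hypothetical sketch of Harris-style corner detection — the kind of feature extraction that camera-based tracking systems build on (not the Odyssey’s actual implementation, which isn’t public). A flat, featureless wall gives the detector nothing to latch onto, while a single high-contrast shape produces plenty of trackable corners:

```python
import numpy as np

def harris_corners(img, k=0.05, thresh=1e6):
    """Count Harris corner responses above a threshold.

    A crude stand-in for the feature detection an inside-out
    tracking camera performs: corners are the distinct points
    the headset can anchor its position estimate to.
    """
    # Image gradients along rows (y) and columns (x)
    iy, ix = np.gradient(img.astype(float))
    ixx, iyy, ixy = ix * ix, iy * iy, ix * iy

    # Sum gradient products over a small window (crude box filter)
    def box(a, r=2):
        out = np.zeros_like(a)
        for dy in range(-r, r + 1):
            for dx in range(-r, r + 1):
                out += np.roll(np.roll(a, dy, 0), dx, 1)
        return out

    sxx, syy, sxy = box(ixx), box(iyy), box(ixy)
    # Harris response: det(M) - k * trace(M)^2
    # High only where gradients vary in BOTH directions (a corner)
    r = (sxx * syy - sxy * sxy) - k * (sxx + syy) ** 2
    return int((r > thresh).sum())

# A featureless wall: uniform grey, nothing to anchor to
blank = np.full((64, 64), 128.0)

# The same wall with one high-contrast cardboard "shape" on it
textured = blank.copy()
textured[20:40, 20:40] = 255.0

print(harris_corners(blank), harris_corners(textured))
# the blank wall yields 0 trackable features; the shape yields several
```

Straight edges are rejected by the response formula (gradients vary in only one direction), which is why flat walls and plain floors are so unhelpful — our cardboard shapes worked precisely because they added corners and contrast where there were none.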
Assault and Batteries
This was by far the biggest issue we encountered throughout the day. Battery management is something that, if you’ll excuse the expression, we just about managed. Each battery set lasted about an hour and took around the same amount of time to charge as it did to discharge. We didn’t have as many batteries as we would have liked (we wanted a backup set for each backpack we were running) and were scraping by at the end of the event with 10–20% charge, though that was partly down to human error. If we let the batteries run out completely, it caused severe frame drops in the experience, and that is the beginning of a quick journey to motion sickness.
I believe the heat generated by constant use all day was causing the batteries to discharge faster and charge slower, though we would need more data to back up that theory. I think this is the one thing holding back this kind of technology: the batteries lasted about an hour and a half at the start of the day and gradually dropped off, while charging took about an hour and gradually got slower; you can see where the problem arises. All that being said, battery technology is still very much being worked on, so I would expect that we’ll see a solution to this in the not-too-distant future.
Overall, I think we made a good hardware choice; we were able to run the experience all day as we needed to, and we got great feedback from everyone who went through. The kids loved being “armoured up” with their backpack armour and their VR “helmet”. Tech-savvy visitors appreciated the challenge and the work this required, and everyone else admired the realism and immersion of the experience. I don’t think this event would have been possible with traditional solutions; the area was too oddly shaped, with hard-to-account-for occlusions, for any lighthouse-based system to work well. On top of that, we had a lot of kids coming through the experience and wanted to make it as accessible as possible. That meant simpler was better, and this was the simplest solution we could create for a complex problem.
I am a software engineer working in the Kainos Applied Innovation team on my first rotation as part of the Earn as you Learn scheme. We focus on new, interesting and up-and-coming technologies and find out how we can make use of them today. If you have any questions about this project, the team or anything at all, you can get in touch.