Reality bites for device manufacturers

The state of play in Augmented and Virtual Reality

Christopher Adam
7 min read · Jul 2, 2018

Before heading off to my first TechXLR8, held in conjunction with London Tech Week, I took a look at the kinds of exhibitors that would be there. I was really happy to see there was going to be a dedicated area for augmented and virtual reality. I'll be honest: I am a bit of a fanboy when it comes to these technologies. I love gaming, and I am really interested in seeing how these technologies are used in non-gaming contexts and in serious real-world applications.

TechXLR8 is not an event like CES or E3, so I was not expecting much in the way of new device announcements from the hardware developers. I was pleasantly surprised to see some prototypes on display.

Vuzix, a company known for developing industrial-grade augmented reality glasses and eyewear, had its Blade prototype on display. It is a really nice-looking device, with a form factor similar to a pair of sports sunglasses (…with thicker frames). I was able to try on the Blade, and it was very comfortable to wear. The left frame arm is a touchpad, allowing you to make swipe and tap gestures to interact with applications. The glasses were handed to me with the movie app running.

There was a small clip from Star Wars (possibly The Phantom Menace) playing. The image was bright but very small, so I could not really make out the detail in the video. I really appreciate the technology involved in making a device this compact, but the demo did not show off or explain the device properly. I did ask the demonstrator about the specs and was given some very high-level details and some information on the available SDK. I think I need to see a few more demos before I make a call on this device. As mentioned, the form factor is amazing, and the waveguide technology used in the Blade's design is really cool: there is no big, bulky projector visibly beaming the image onto the lens. My opinion so far is that this will be a decent device for industrial and possibly sports scenarios. I'm not sure I would be comfortable wearing it like a pair of glasses yet.

In terms of applications, I get the feeling it will be better at handling 2D graphics than 3D, so it should be decent for overlay-style augmented reality apps such as video conferencing (imagine a heads-up-style Apple FaceTime app), basic navigation, or notifications.

Intel was also present at the event, within the IoT section. The demo they were presenting showed how a virtual reality application might be used for city disaster management. It involved several visitors taking on the role of either a Disaster Centre Controller or the Emergency Services. As a controller, you were placed in the virtual centre with information about incidents reported from different sources (drones, street cams and body cams). The controller could then direct the Emergency Services to the relevant areas, providing that person with information from different viewpoints. I really liked the way the information was presented: the main overview as a large virtual environment, and the camera views as 360º video.

At the Intel stand the demos were being powered by several Intel NUC Skull Canyon edition mini PCs. These units are around the size of an A5 notepad but were delivering a pretty decent virtual reality experience. The systems were tethered to Microsoft-compatible mixed reality headsets. Although there was no MR content, I wanted to mention this since everybody chooses the Rift or Vive for virtual reality and forgets that there are other good headsets out there! One interesting point about these headsets is that the sensors are built into the headset itself; there are no external tracking sensors that need to be positioned around the user (unlike the Rift and Vive). Instead, inside-out tracking is used to follow the user's head and body movement.

Overall, the demo and the way it was executed were slick; I left the demonstration completely understanding what was being presented and how it could be useful in that scenario.

Epson was also present at the event and had several devices from its Moverio range on display, as well as a prototype, which I did not get to try on. The Moverio glasses are interesting in design; you can easily tell that they are some kind of augmented reality headset. Unlike the Vuzix Blade, the Epson Moverio glasses consist of two parts: the eyewear and a small box which contains all the processing and power for the device.

It's not as compact and self-contained as the Blade. The BT-350 model I tried on was actually quite comfortable; you can pull on the sides of the frames to expand them, and they then grip the sides of your head. The box was left on the plinth, but you could attach it to your waist if required. The first thing I noticed was the display, as the device was already running an augmented reality app. The image being displayed was an overlaid live stream from the main eyewear camera. The quality of the projected image was fantastic and occupied a fairly large letterbox area in the centre of your view. The problem was, it was not stereoscopic!

This was really disorienting and it took me about 20 seconds to adjust. The demo was actually quite interesting: there were several 3D-printed objects that, when in view, popped into animated real-time 3D models. I did not get any information about how the demo was developed. The Moverio glasses are supported by a proprietary Epson Android-based SDK, and the device clearly has a decent amount of processing power to perform 3D object recognition and tracking.

Personally, I would have preferred the image to be displayed to either the left or the right eye (maybe there is an option for that); the same flat image being rendered to both eyes is just distracting.

All of the devices were touted as augmented reality devices, with some of the demonstrators comparing their products to Microsoft's HoloLens. In my view, none of these devices matched up to the HoloLens in terms of capability or display quality. Without getting into some kind of flame war over Augmented Reality versus Mixed Reality, the devices I have mentioned do not appear to have any spatial mapping or scene-understanding capability, in either software or hardware.

Two companies, Zerolight and Optis, really caught my attention with high-end product visualisation using virtual reality. The level of detail (photo-realism) and interaction in these demos was impressive. Both companies used high-end virtual reality systems with either an Oculus Rift or an HTC Vive as the display unit. Most of the demos made really good use of 'room-scale' tracking, letting you physically walk around. At the Zerolight stand, it was funny to hear the comments some of the visitors were making while walking around a virtual car and interacting with components like the mirrors and the gear stick.

Leicester Fire and Rescue had a fairly large stand, which they set up to resemble a disaster zone complete with a fake dead body. Several demos were taking place, ranging from mobile-based 360º videos to a complete room-scale experience of an emergency in which you walked around the environment wearing a backpack PC and an HTC Vive.

I was really impressed by the number of virtual reality demonstrations at this event, even where virtual reality was not the actual focus of the company or developer. I visited nearly all the stands that had some kind of virtual reality element to them, and most were really interesting.

I enjoy watching 360º videos, especially live streams of events or adventure experiences. I would really like to see more stereoscopic 360º videos being shown at events.

Having a go at the Intel demo really got me thinking about multi-user experiences. The majority of the demos were single-user experiences, often with the user stuck in the middle of the demo stand surrounded by a bunch of onlookers. That may be uncomfortable for a lot of people. I think more people would give the VR demos and experiences a go if they could jump in with another person and share the experience.

I saw a couple of demonstrations that used mobile virtual reality headsets (the type where you slot in a mobile phone); personally, I find this type of virtual reality experience poor in quality and uncomfortable. This year several standalone headsets have been released, including the Oculus Go, the HTC Vive Focus and, recently, the Lenovo Mirage. I really wanted to see some of these headsets in use, the last two of which support head and body movement tracking… maybe next year.
