Hands-on with the Samsung Odyssey+ and Maquette, Microsoft’s new rapid XR prototyping tool.

This is the XR screen technology you have been waiting for, and it’s a Samsung exclusive. At XRDC I tested the Samsung Odyssey+ and the Maquette XR software, which you can get today if you join the beta program at https://www.maquette.ms/. I also compared the Odyssey+ to the Pimax 5K+.

Article by Micah Blumberg


First of all, I made sure to try the Samsung Odyssey+ twice: once yesterday and once today. Today I spent a long time in it, playing with Maquette, in order to really absorb how the screen looked to my eyes, and I did my best to turn my impressions into words while I was wearing the headset. After really putting it through its paces, I came away very impressed with this new HMD!

Samsung Odyssey+ HMD at the Microsoft booth at XRDC 2018

If you are like the majority of VR users with an original HTC Vive, the screen on this headset feels like a massive leap forward in terms of visual clarity. It is just a great screen. At first I could not even detect the screen-door effect; it was so clear that I could only see the graphics from Maquette.


Maquette is a great tool, and I was pleased to learn that the 3D meshes you prototype in it can be exported to run in Unity, Unreal Engine, and even WebXR, because they can be exported to the new glTF 2.0 standard, which is designed for WebXR and can be used to develop web-based apps for both AR and VR.
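For context on why one export format can feed Unity, Unreal, and WebXR alike: glTF 2.0 stores scene structure as plain JSON (with vertex data in separate binary buffers). The sketch below is purely illustrative, not Maquette’s actual output; the mesh names and the tiny helper function are made up, but the `scenes`/`nodes`/`meshes` layout is how a real glTF 2.0 document is organized and is the structure any importer walks.

```javascript
// Illustrative only: a minimal hand-written glTF 2.0 document,
// NOT actual Maquette export output.
const gltf = {
  asset: { version: "2.0" },
  scenes: [{ nodes: [0, 1] }],          // default scene references nodes by index
  nodes: [
    { mesh: 0, name: "Cube" },          // each node may reference a mesh by index
    { mesh: 1, name: "Panel" }
  ],
  meshes: [
    { name: "Cube", primitives: [{ attributes: { POSITION: 0 } }] },
    { name: "Panel", primitives: [{ attributes: { POSITION: 1 } }] }
  ]
};

// Hypothetical helper: list the mesh names reachable from the default
// scene -- the kind of traversal an engine or WebXR loader does on import.
function listMeshNames(doc) {
  return doc.scenes[0].nodes
    .map((i) => doc.nodes[i])
    .filter((node) => node.mesh !== undefined)
    .map((node) => doc.meshes[node.mesh].name);
}

console.log(listMeshNames(gltf)); // [ 'Cube', 'Panel' ]
```

Because the format is this simple at the top level, the same exported file can be consumed by a game engine importer or fetched directly by a browser-based XR page.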

Maquette is not an art tool; you are not shaping virtual clay or making colorful vistas the way users might with other VR tools such as Tilt Brush, Quill, Masterpiece VR, Google Blocks, or Oculus Medium. Everything feels a little more concrete: when you open a menu it seems to snap into place, which is pleasing. It’s almost like your hand is getting frozen, but the effect seems intentional, because when a menu locks in place it feels much more readable and accessible.

I think this tool is accidentally perfect for creating WebVR pages, because it’s low-poly by default.

The same feeling extends to pulling primitives like cubes out of the menu system; they just snap into place, and pretty quickly you are building something usable that you can export into other programs.

Eventually, after putting my head through text and blocks over and over again, I began to make out the very faint hint of a screen-door effect. It was really hard to see, and I quickly lost sight of it again and again because the graphics dominate over the ever-so-minimal screen-door effect, which is exactly what a screen meant for XR should do.

Follow the tutorials at https://www.maquette.ms/ to learn how to group, auto-align, and duplicate assets.

Since we know that Oculus is not about to release an Oculus Rift 2, at least not anytime soon, because of Brendan Iribe’s sudden exit, the choices that users have for a next generation desktop mixed reality headset have narrowed somewhat.

The Pimax 5K+ (and the 8K) is exciting. I really like the wide FOV and the increased pixel density, but that headset also demands a lot more power from your GPU; it doesn’t even run on a GTX 1070. That’s a headset to pair with the purchase of an RTX 2080 Ti, so your new graphics card has something to do in XR while you wait for developers to figure out how to bring ray tracing to XR.

There are some heavyweight graphics and film professionals I spoke with at XRDC who are highly skeptical that we can really bring ray tracing to XR anytime soon; the way XR is rendered does not fit the typical way ray tracing is done. At Unite LA, the CEO of Unity went further, saying he thinks ray tracing as a technology is at least 3.5 years away. Unity has been using a combination of rasterization and path tracing for shadows, subsurface scattering, and other HD effects in its High Definition Render Pipeline, but many professionals still draw a hard line of distinction between path tracing and pure ray tracing, which is computationally far more expensive. The word is that Unity’s engine would basically need to be rewritten from scratch to support pure ray tracing, and that filmmakers will not use it as is. The leading professionals in film and TV have been using Unreal Engine 4 for almost five years now to do things like NASCAR’s augmented reality television experience, which is basically AR on TV enabled by Unreal Engine 4. Augmented reality on television is easily making 100 times more money than it is on augmented reality headsets like HoloLens.

The Weather Channel especially is capitalizing on using XR in its TV broadcasts, and it looks amazing.

So the best use of RTX cards might be paired with the Pimax 5K+/8K headset, but that’s a huge purchase. It’s also not going to be portable: the Pimax uses the older outside-in tracking, like the original HTC Vive. The tracking is rock solid, but it’s very hard to travel with and time-consuming to set up.

The Microsoft Mixed Reality HMDs have always offered more value for users who want a portable XR headset that works with a regular laptop, like Walmart’s new line of gaming laptops, and they have the newer inside-out tracking that doesn’t require users to carry special cameras or lighthouse boxes to set up a play space. It’s convenient, it’s portable, and it works in at least half a dozen places you can’t set up a Vive or Oculus in (an airplane, a train, a car, a small table in the middle of a noisy crowded room, etc.).

So when I saw that the new Odyssey+ HMD was going to create the illusion of about twice its resolution while still only requiring the GPU to render the same number of pixels, I let my imagination run wild wondering if it could compete with the Pimax 5K+ in terms of the visual clarity you get from 5K pixels. To be completely honest, the Odyssey+ feels like the clearest screen I’ve ever tried. The Pimax 5K+/8K lenses had a little bit of warping when I tried them less than a year ago. I’ve heard that Pimax has made a ton of visual upgrades since then, but I still see reports that new users are slightly put off by the lenses the first time they try them. Don’t get me wrong, the Pimax screens are beautiful and totally worth the money if you are a big spender, but the Odyssey+ is really great.

The Odyssey+ may even be the best screen for someone who wants something premium and high-end, with great sound, a great fit, and a removable headband that’s easy to clean; it works great, does the job, and of course is both portable and computationally more affordable given the GPU in your existing laptop.

So I think for a lot of people, especially XR professionals, the Samsung Odyssey+ is going to be the right choice for a 2018 mixed reality headset. If you want something more portable than the HTC Vive you probably have in your living room, you are especially going to love Samsung’s new Anti-SDE (screen-door-effect-reducing) technology.

On the other hand, if you have money to burn, you really want a 210-degree wide field of view powered by the most expensive GPU, and you don’t need to travel at all with your XR headset, then the Pimax is another great choice.

2018 is a great year for new XR technology. Samsung has really brought us what I think is the best XR screen to date, but now I sincerely hope that Samsung explores creating a headset with a 210-degree FOV, because in the technology space at least, competition really seems to drive innovation.

I am hoping we see great new screen technology from all the XR HMD makers going forward into the end of 2018 and into 2019.