<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:cc="http://cyber.law.harvard.edu/rss/creativeCommonsRssModule.html">
    <channel>
        <title><![CDATA[Telerik AR VR - Medium]]></title>
        <description><![CDATA[Stuff by developers from the Progress Telerik team, focused on AR VR with Unity3D - Medium]]></description>
        <link>https://medium.com/telerik-ar-vr?source=rss----cb8d50f0f390---4</link>
        <image>
            <url>https://cdn-images-1.medium.com/proxy/1*TGH72Nnw24QL3iV9IOm4VA.png</url>
            <title>Telerik AR VR - Medium</title>
            <link>https://medium.com/telerik-ar-vr?source=rss----cb8d50f0f390---4</link>
        </image>
        <generator>Medium</generator>
        <lastBuildDate>Sun, 17 May 2026 10:15:13 GMT</lastBuildDate>
        <atom:link href="https://medium.com/feed/telerik-ar-vr" rel="self" type="application/rss+xml"/>
        <webMaster><![CDATA[yourfriends@medium.com]]></webMaster>
        <atom:link href="http://medium.superfeedr.com" rel="hub"/>
        <item>
            <title><![CDATA[The Mobile VR is dead, long live the Mobile VR]]></title>
            <link>https://medium.com/telerik-ar-vr/the-mobile-vr-is-dead-long-live-the-mobile-vr-486907ba445e?source=rss----cb8d50f0f390---4</link>
            <guid isPermaLink="false">https://medium.com/p/486907ba445e</guid>
            <category><![CDATA[talks]]></category>
            <category><![CDATA[virtual-reality]]></category>
            <category><![CDATA[oculus-quest]]></category>
            <category><![CDATA[conference]]></category>
            <dc:creator><![CDATA[Panayot Cankov]]></dc:creator>
            <pubDate>Wed, 16 Oct 2019 12:25:54 GMT</pubDate>
            <atom:updated>2019-10-16T12:25:54.057Z</atom:updated>
<content:encoded><![CDATA[<p>Over the last couple of months, slot-in VR has been winding down, and I’ve been asked on a few different channels what is happening with VR. Don’t worry: VR is not going to die. It just turns out the usability of slot-in mobile VR is not that great, but there is a clear mobile successor — Oculus. If you want to learn more about development for the next generation of mobile VR devices, I would like to invite you to my talk “VR QuickStart with Unity 3D” at DevReach Sofia 2019 on 22 October.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*ygHll6u1QjJdH2yUSWdkCw.png" /></figure><h4>Google Discontinues Daydream VR</h4><p>That happened just yesterday. Google’s cardboard-style products are what inspired me to pursue a career as an AR and VR developer. But it seems these products have now outlived their usefulness.</p><p><a href="https://www.theverge.com/2019/10/15/20915609/google-pixel-4-no-daydream-support-view-vr-headset-discontinued">Google is discontinuing the Daydream View VR headset, and the Pixel 4 won&#39;t support Daydream</a></p><h4>Samsung Galaxy Note10 will not support Gear VR headset</h4><p>Oculus and Samsung had a partnership that pushed VR forward with the Gear VR for some time. But Samsung somewhat silently discontinued its investment in slot-in VR shortly after products like the Oculus Go and Oculus Quest were announced.</p><p><a href="https://mspoweruser.com/samsung-galaxy-note10-will-not-support-gear-vr-headset/">Samsung Galaxy Note10 will not support Gear VR headset - MSPoweruser</a></p><h4>Oculus Mobile</h4><p>Oculus now ships two devices: the Oculus Go and the Oculus Quest. They are mobile devices whose hardware spec is very close to that of a mobile phone, yet they do not waste your phone’s battery. 
And they are based on Android as well.</p><p><a href="https://www.oculus.com/?locale=en_US">Oculus</a></p><p>According to Mark Zuckerberg, Quests are selling:</p><blockquote>“as fast as we can make them”</blockquote><p>Oculus also provides a dedicated VR app store and delivers high-quality content.</p><h3>Mobile VR will be at DevReach 2019</h3><p>Do you want to know where mobile VR is going? Slot-in VR is dead now, but its successor is, in my humble opinion, superior. If you are interested in learning more about mobile VR development for the Oculus family, I would like to invite you to my talk at DevReach 2019, Sofia:</p><p><a href="https://www.telerik.com/devreach/sessions/vr-quickstart-with-unity-3d">VR Quickstart with Unity 3D - DevReach 2019</a></p><p><strong>22 Oct. 4:40 PM</strong> — <strong>5:30 PM </strong>(hall Alpha) “VR QuickStart with Unity 3D”</p><p>For the past few months my colleagues from the AR/VR team at Progress Software and I have been working on line-of-business applications:</p><p><a href="https://www.vrlabs.tech/">VRLabs - Gain Competitive Advantage by Applying VR</a></p><p>And I am going to share part of our experience.</p><p>See you there,</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=486907ba445e" width="1" height="1" alt=""><hr><p><a href="https://medium.com/telerik-ar-vr/the-mobile-vr-is-dead-long-live-the-mobile-vr-486907ba445e">The Mobile VR is dead, long live the Mobile VR</a> was originally published in <a href="https://medium.com/telerik-ar-vr">Telerik AR VR</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Discovering VR for Business — an INSPIRED update from VRLabs]]></title>
            <link>https://medium.com/telerik-ar-vr/discovering-vr-for-business-an-inspired-update-from-vrlabs-896aea911be8?source=rss----cb8d50f0f390---4</link>
            <guid isPermaLink="false">https://medium.com/p/896aea911be8</guid>
            <category><![CDATA[immersive]]></category>
            <category><![CDATA[unity]]></category>
            <category><![CDATA[vr]]></category>
            <category><![CDATA[immersive-technology]]></category>
            <category><![CDATA[virtual-reality]]></category>
            <dc:creator><![CDATA[Georgi Atanasov]]></dc:creator>
            <pubDate>Wed, 25 Sep 2019 14:47:16 GMT</pubDate>
            <atom:updated>2019-09-25T15:10:06.534Z</atom:updated>
<content:encoded><![CDATA[<h3>Discovering VR for Business — an INSPIRED update from VRLabs</h3><p>It’s been nearly 2 years since we started our AR &amp; VR journey at Progress Telerik. In that period we created some awesome stuff, made some mistakes, and learned a lot about the ecosystem and how to operate in an early market.</p><h3>We’ve Been Inspired</h3><p>At the end of May this year, we had a training that turned our point of view and thinking around by 180 degrees. It was based on the book “<a href="https://www.amazon.com/INSPIRED-Create-Tech-Products-Customers-ebook/dp/B077NRB36N">INSPIRED — How to Create Tech Products Customers Love</a>” by Marty Cagan.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/918/1*fS7pXru7XKsuxSZ-JmDf0g.jpeg" /></figure><p>Since then we have re-assessed our approach to VR and redefined how to develop our endeavor further.</p><p>The most important takeaways from this book for us are:</p><ul><li>Nothing important happens in the office</li><li>Our opinions, although interesting, are irrelevant</li></ul><p>In other words — if we want to create a valuable product for VR, we must find real clients and talk to them BEFORE we create a product. This is what the book calls “Product Discovery”. 
Of course, the process of product discovery is built on multiple steps and risk assessments, but in essence there is little to no chance of creating a valuable product unless you validate it with customers prior to any real development.</p><h3>A New Website is Born</h3><p>Following the best practices of the Inspired methodology, we are thrilled to <a href="http://www.vrlabs.tech">announce a new website</a>, entirely dedicated to VR, which we believe will enable us to have more streamlined conversations with people about how VR can add value to different business processes.</p><p>Here is our new web portal:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*iwpCCoyMFTmsKYQEd_hx1g.jpeg" /><figcaption>The home page of <a href="http://www.vrlabs.tech">www.vrlabs.tech</a></figcaption></figure><p>If you’ve been planning to start exploring VR for business, you can begin by exploring our new site. One of the top reasons VR is ideal for business is that this is an early market and still uncharted territory — the perfect setup for forward-looking companies to create competitive advantage and innovate on existing processes.</p><h3>Why a New Website?</h3><p>To better streamline the process of discovering new prospects, we needed a focused landing page. While our existing page on <a href="http://www.telerik.com/ar-vr-lab">www.telerik.com</a> is excellent, it brings some distraction to people who are new to the Telerik brand and who come solely for VR. As a rule of thumb, when you explore new products in a new ecosystem, you should not force people to learn more about your existing brand than needed.</p><h3>What Happens With Our Plans for DataViz and Developer Tools?</h3><p>Our efforts in this direction remain unchanged. While building real VR simulations for various businesses, we will generate a set of reusable components that are fundamental for most of the applications. 
Some such components already exist, for example:</p><ul><li>Networking (multiplayer) with VoIP, avatars, gestures and 3D models import/collaboration</li><li>Drag-and-drop laser pointers, 6-DoF hand gestures and 3D object interaction</li><li>DataViz components like Charts and Graphs</li><li>Hand-based interactive menus</li><li>More…</li></ul><p>If you are building something for VR and a set of developer tools would come in handy, please drop us a line at <a href="https://www.telerik.com/ar-vr-lab#contact-us">https://www.telerik.com/ar-vr-lab#contact-us</a>.</p><h3>Interested to Talk VR?</h3><p>We want to hear from you! We have a ton of ideas and many PoC applications showing how VR can add value to businesses, but it is your requirements that determine what a VR application needs to do. So let’s partner:</p><p><a href="https://www.vrlabs.tech/contact">Request a Demo | VRLabs - Competitive Advantage Through VR</a></p><p>Or you can reach out on our new social profiles:</p><figure><a href="https://www.facebook.com/vrlabs.tech/"><img alt="" src="https://cdn-images-1.medium.com/max/256/1*EsUvWqe8BjyJO-AKi1_nIg.png" /></a></figure><figure><a href="https://twitter.com/VRLabs1"><img alt="" src="https://cdn-images-1.medium.com/max/256/1*f8_VdjxPCVGtjR1gwcDemw.png" /></a></figure><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=896aea911be8" width="1" height="1" alt=""><hr><p><a href="https://medium.com/telerik-ar-vr/discovering-vr-for-business-an-inspired-update-from-vrlabs-896aea911be8">Discovering VR for Business — an INSPIRED update from VRLabs</a> was originally published in <a href="https://medium.com/telerik-ar-vr">Telerik AR VR</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Is Oculus Quest Really The Best All-In-One VR Device Today?]]></title>
            <link>https://medium.com/telerik-ar-vr/is-oculus-quest-really-the-best-all-in-one-vr-device-today-613e346dbc13?source=rss----cb8d50f0f390---4</link>
            <guid isPermaLink="false">https://medium.com/p/613e346dbc13</guid>
            <category><![CDATA[oculus]]></category>
            <category><![CDATA[oculus-quest]]></category>
            <category><![CDATA[unity]]></category>
            <category><![CDATA[virtual-reality]]></category>
            <dc:creator><![CDATA[Georgi Atanasov]]></dc:creator>
            <pubDate>Thu, 20 Jun 2019 12:07:28 GMT</pubDate>
            <atom:updated>2019-06-20T12:07:27.895Z</atom:updated>
<content:encoded><![CDATA[<figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*qMyUjRn_a3ZNJJqlNPCfqg.jpeg" /><figcaption><a href="https://www.telerik.com/ar-vr-lab">VRLabs’</a> first Quest</figcaption></figure><p>The short answer is “Yes, definitely”. In this post I will share my personal opinion on why I think so. There are a lot of great in-depth reviews of the <a href="https://www.oculus.com/quest/">Oculus Quest</a>, and my intention is not to write yet another one, but rather to summarize all the details that delighted me and made me smile.</p><blockquote>Disclaimer: I am a huge VR fan, part of the <a href="https://www.telerik.com/ar-vr-lab">VRLabs team at Progress</a>, and there is certainly some bias, so if you are new to VR, don’t rely solely on my opinion. Still, I hope that you will find these points valid when you enter this new world :)</blockquote><h3>#1: Premium package quality</h3><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*-zgChXQH90uI-t-_tssNOw.jpeg" /></figure><p>This device is premium in every single aspect. The entire package conveys the impression that Facebook put a lot of thought and attention not only into the hardware, but also into little details like the magnetic closure on the internal box that contains the charging cable, or the included batteries for the controllers. And for those of us who love to stay in VR for more than an hour, there is a 3-meter-long charging cable that gives you the freedom you need.</p><h3>#2: Craftsmanship from the outside</h3><figure><img alt="" src="https://cdn-images-1.medium.com/max/800/1*V9309TRbJMsXwW-1LClecQ.png" /></figure><p>The Oculus Go, although a really nice entry-level device, felt cheap to me. And it could hardly be otherwise at a price tag of $200. The Quest, however, is decidedly premium. It is easily adjustable in size and fits perfectly on the head. 
Although not the most ergonomic device I’ve tried, it is ergonomic enough to let you stay comfortably in VR for hours, especially if you are watching movies or videos.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*lw7LauHKphGIIS662guftw.jpeg" /><figcaption>The soft rubber-like coating is extremely comfortable</figcaption></figure><p>The controllers are just awesome. Notice the rubber coating on the bottom half, which makes the grip extremely comfortable. To me, these are simply better than the original Oculus Rift controllers.</p><h3>#3: Outstanding lenses and display</h3><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*swFuzWwgsi0HHLcaDLwOhA.jpeg" /><figcaption>1,600 x 1,440 pixels per eye OLED display</figcaption></figure><p>The Quest is not the VR device with the best picture quality, but for an all-in-one device the quality is mind-boggling. For example, it is better than the first Rift, a tethered device that uses the graphics capabilities of a powerful gaming PC. Just by porting them to the Quest, our applications felt far more crisp and vivid.</p><h3>#4: Unexpectedly good tracking &amp; guardian system</h3><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*FZAeRN2K0SclJszJG_Vf-A.jpeg" /><figcaption><a href="https://twitter.com/kamen_velikov">Kamen</a>, a colleague of ours, trying the Quest for the first time</figcaption></figure><p>I was excited about the Quest the moment Facebook announced it. But, honestly, I wasn’t expecting its tracking and guardian systems to be <em>great</em>. 
Besides working almost flawlessly (on rare occasions, when you get too immersed in a game, there are some minor flaws, but they are easily dismissed as insignificant), it is also a system that you get used to in a matter of minutes.</p><h3>#5: As easy as possible app migration</h3><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*PIqoVvIeJTuBwRYN1QfNSg.jpeg" /><figcaption><a href="https://medium.com/u/e47fdddbfaaf">Deyan Yosifov</a> &amp; <a href="https://medium.com/u/53bcfe157cfe">Panayot Cankov</a> in multiplayer, porting from Go to Quest</figcaption></figure><p>If you’ve built apps for the Rift and the Go, then chances are that you will be able to migrate them to the Quest easily.</p><blockquote>Our team focuses on the business aspect of VR and we do not have experience with games, so I can speak only about line-of-business applications.</blockquote><p>In fact, the biggest difference between the Go and the Quest lies in how an app handles user input and what level of interaction it allows. We spent about a week developing cross-platform Pointer and Interaction abstractions, based on the latest Unity3D input device APIs, that just work on every Oculus device. After that, building and deploying was as simple as unplugging the Go and plugging in the Quest.</p><h3>Conclusion</h3><p>To me, the Oculus Quest is well prepared to steadily democratize the consumer VR space. It has everything you could wish for from an all-in-one device and delivers even more than expected. Most importantly, it comes at an affordable price: $400 is a really good price tag for the quality that we get. 
And, as we all know, the mass adoption of a technology starts with the consumer market.</p><p>So, whether you are new to VR or a fan of another device, I can definitely recommend the Quest — you won’t regret it!</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=613e346dbc13" width="1" height="1" alt=""><hr><p><a href="https://medium.com/telerik-ar-vr/is-oculus-quest-really-the-best-all-in-one-vr-device-today-613e346dbc13">Is Oculus Quest Really The Best All-In-One VR Device Today?</a> was originally published in <a href="https://medium.com/telerik-ar-vr">Telerik AR VR</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Perception of Stereoscopic DataViz in AR and VR]]></title>
            <link>https://medium.com/telerik-ar-vr/perception-of-stereoscopic-dataviz-in-ar-and-vr-b1cf906d6831?source=rss----cb8d50f0f390---4</link>
            <guid isPermaLink="false">https://medium.com/p/b1cf906d6831</guid>
            <category><![CDATA[virtual-reality]]></category>
            <category><![CDATA[dataviz]]></category>
            <category><![CDATA[data-visualization]]></category>
            <category><![CDATA[augmented-reality]]></category>
            <dc:creator><![CDATA[Panayot Cankov]]></dc:creator>
            <pubDate>Thu, 13 Jun 2019 15:26:53 GMT</pubDate>
            <atom:updated>2019-06-13T15:26:53.187Z</atom:updated>
<content:encoded><![CDATA[<p>My team has been working on AR VR DataViz for a while, and I’ve gathered some impressions and thoughts on depth perception in VR and its use in DataViz that I want to share with you.</p><p><a href="https://www.telerik.com/ar-vr-lab">Explore and Analyze Your Data in Stereoscopic 3D with AR and VR - Progress Telerik</a></p><p>Vision for each eye spans about 120 degrees horizontally and 100 degrees vertically. If we take eye rotation into account, the horizontal span extends to about 210 degrees. Given that, on average, the smallest detectable visual offset between features is about 0.0006 degrees, matching what we can potentially see in reality would require a horizontal resolution of more than 350,000 pixels per eye (210 / 0.0006 = 350,000).</p><p>The VR Book explains this, and a lot of other things about VR, in depth. I highly recommend it:</p><p><a href="http://www.thevrbook.net/">The VR Book</a></p><p>Today, consumer-grade VR devices are shipping by the millions. Sony has sold over 4 million PSVR headsets; that makes 1 in 20 PlayStation 4 buyers. Oculus Quest standalone units are shipping rapidly, with overall high user satisfaction.</p><p>But all these consumer-grade devices are still VR generation one. They have nice controllers, ergonomic headsets, good positional tracking, and easy setup and maintenance. But when you take a step back and look at the technology, the innovation is not revolutionary, it is evolutionary — the hardware is little more than lenses attached to a phone screen in a single package mounted on your head. Hand input is done through infrared-camera-tracked controllers, but finger input is captured through buttons, so fine motor skills have little play. Feedback is provided by haptics (controller vibrations). Sound is of decent quality. There is no body tracking, and no skin touch, heat, smell, etc.</p><h3>How do we See in VR?</h3><p>Human vision relies on two eyes. 
We receive two slightly different images. Our brain combines the state of our eye muscles, our head position and those two images to form a single perception of a world rich in depth.</p><h4>Visual Acuity</h4><p>Each of our eyes contains several types of sensors. We will not go into detail about these types, but we will note their distribution. Visual acuity, the ability to detect things clearly and in high resolution, is concentrated in the fovea — the center of the retina.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/478/1*iIACaMHJ4gEheCJGXyeDxQ.png" /></figure><p>As for those 350,000 pixels we can potentially detect — we do not receive all that information at once; the fovea sees only the central 2 degrees of the visual field in high resolution.</p><h4>Saccades</h4><p>To perceive more detail in a 2D or 3D picture, our eyes move simultaneously to bring different details onto the fovea — saccades. The brain performs these in succession and extrapolates the fine details in the missing pieces.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/364/1*o90vTdnO2B4nFyiwwlS1pw.png" /></figure><p>If you could track these movements, you could generate an attention map. But the most important thing about the consumer-grade VR headsets is that they simply cannot deliver the resolution to completely fill the fovea.</p><p>To present high-quality models, high-quality textures, and the perceived detail of a real-world premium object, the VR display would have to match the fovea resolution. This could be done with eye tracking, streaming a very small but high-resolution image targeting the fovea. But VR is not there yet.</p><p>Consumer VR may not be the right choice for presenting premium, high-detail products, pictures, textures, tiles, fine art, etc., because the headsets cannot deliver the level of detail.</p><h4>Horopter Plane</h4><p>We see two images, one with each eye. 
Objects at distances that produce no disparity between those two images form a plane — the horopter plane. Those objects are perceived as crystal clear. Near that plane is an area where objects generate disparity, but the brain can still fuse them together. Everything outside that area contains disparity too great to fuse, causing objects to appear doubled.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/355/1*Hd1kLn3l72TT9N7uDAC9Lg.png" /></figure><h4>Vergence</h4><p>Vergence is the rotation of the eyes in opposite directions to obtain sharp and comfortable vision. The position of the eyes during this type of movement provides effective depth cues up to about 2 meters.</p><h4>Accommodation</h4><p>Our eyes contain lenses that change shape to focus the incoming image on the retina. That process is called accommodation. Our sense of the eye-muscle contraction that changes the lens provides a distance cue up to about 2 meters.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/315/1*Tzk-dCkJUqVXZDKhMtUtHA.png" /></figure><p>When the effects of vergence and accommodation are combined, objects that are in focus appear as a single clear image, while objects that are out of focus appear blurry and doubled.</p><p>Consumer headsets have screens and lenses that make the image focus at a fixed distance, yet they present a different image to each eye. This causes the vergence-accommodation conflict. Consumer headsets benefit from the fact that vergence contributes more to depth perception than accommodation does.</p><p>This enables the consumer VR headsets to deliver information rich in depth, suitable for engineering visualizations, buildings, machinery, floorplans and charts.</p><h4>Depth Perception</h4><p>Accommodation and vergence play a role in depth perception, but the overall process is much more complicated. Our brains automatically process an array of stimuli and mix them with our knowledge and prior experience. 
This allows us to perceive depth from 2D media or even with a single eye.</p><p>Consider the following list of factors contributing to depth perception, again from “The VR Book”:</p><p><strong>Pictorial Depth Cues</strong></p><ul><li>Occlusion (front objects hide rear objects)</li><li>Linear perspective</li><li>Relative/familiar size</li><li>Shadows/shading</li><li>Texture gradient</li><li>Height relative to horizon</li><li>Aerial perspective (rear objects look dull, disappearing in fog)</li></ul><p><strong>Motion Depth Cues</strong></p><ul><li>Motion parallax (when you move, near objects seem to move faster)</li><li>Kinetic depth effect (changes caused by moving objects)</li></ul><p><strong>Binocular Depth Cues</strong></p><p><strong>Oculomotor Depth Cues</strong></p><ul><li>Vergence</li><li>Accommodation</li></ul><p><strong>Contextual Distance Factors</strong></p><ul><li>Intended action</li><li>Fear</li></ul><p>This raises the bar for designing 3D stereoscopic display experiences that rely on depth perception. You must take into account the oculomotor depth cues enabled by the stereoscopic displays, as well as the motion depth cues enabled by the head tracking.</p><h3>2D vs Stereoscopic 3D Displays for VR DataViz</h3><p>Traditional data visualization techniques present data on a single 2D screen. This medium has the property of fitting all the presented information within focus, easily satisfying vergence and accommodation. That is, the whole 2D presentation lies very close to the horopter plane.</p><p>There is a trend toward curved displays and TVs, which positions even the edges of the screen perfectly on the horopter plane. In VR, curved displays are frequently used for visualizing 2D movies, and sometimes for menu systems.</p><p>On the other hand, the human brain can only consume a limited amount of input at a time. 
This leads to a conflict: bigger screens and better resolutions allow an increased amount of information to be displayed at once, but our brains are not capable of handling it all. In 2D you can rely only on the visual acuity distribution to control which content is in focus, leading to grouping of data, guidelines for content padding, etc. Hence the striving for overall simplicity in 2D data visualization.</p><h4>Don’t go 3D on 2D Media</h4><p>This is a rule of thumb. Any old book on DataViz will advise avoiding 3D in general: it leads to noise, a high ink-to-data ratio and visual clutter.</p><p>Perceiving 3D data on a 2D display cannot generate the accommodation and vergence depth cues.</p><p>On printed media, where the models cannot be moved, occlusion hides rear data and there are no cues from motion parallax. This can be overcome with interactive 2D displays, but it still requires active input from the user, or position animations, to change the point of view in order to read the data.</p><p>Shadows/shading and aerial perspective require photorealistic rendering; for DataViz on 2D displays, these types of visualization often lack the context of the environment — where the light comes from, room corners forming cues of linear perspective, and so on.</p><h4>Go 3D on 3D Media</h4><p>Stereoscopic 3D displays are very different. Rich 3D models of the data can be presented. Moving your gaze through these models will push and pull that horopter plane, moving data sets in and out of focus. Depth cues from vergence are provided thanks to the stereoscopic display. Head tracking allows you to naturally change your point of view, triggering the depth perception cues that come from occlusion and motion parallax. Occluded back objects can be easily and naturally revealed. 
Using proper geometry can also trigger depth cues from linear perspective and relative/familiar size.</p><p>The consumer-grade VR headsets can generate exceptional depth perception in users, which enables a third dimension to be used and extends the perception of 2D areas to 3D volumes.</p><p>So how can we make good use of this property in AR VR DataViz?</p><h4>Sales Map</h4><p>One thing that comes to mind is using clusters of data placed at various distances. A horizontally placed stereoscopic AR VR DataViz map, with clusters of charts positioned by region, can effectively replace a drill-down by regions in a traditional 2D story. Instead of checking out sales by navigating back and drilling down to a different region, you just use the most natural interaction possible — sliding your gaze.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*IZXV76h9Sc6hF4A50zJByg.jpeg" /><figcaption>GeoSpatial VR DataViz</figcaption></figure><p>Yes, you can visualize a map on a 2D display, but if you overlay the charts, they all end up perfectly in focus and you quickly get that too-much-information effect. You can rely only on the visual acuity distribution to separate groups.</p><p>The shading of the map provides height information both in 2D and in AR VR DataViz, but in stereoscopic 3D, depth cues are also triggered by vergence on high-contrast areas: road lines, city names, pins.</p><p>The 3D bars we selected are aligned with the table sides and the floor tiles, so depth cues come from linear perspective and from familiarity with the bar sizes.</p><h4>Combine Reports and Share Context</h4><p>Combine several 2D reports in a single stereoscopic 3D AR VR DataViz experience. 
The closest thing to this in 2D is having two monitors running a single reporting app — a multiscreen app that is aware of the monitors’ placement.</p><p>In our SalesDashboard application we placed the products on one axis and shared it between multiple 2D reports placed amphitheatrically. Each report focuses on a different aspect of the data, and you can switch context by moving your gaze while mapping the products from one report to the other. The navigation feels much more natural than switching tabs in a browser.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*isJEKqgDQ3iWjXvziH88kw.png" /><figcaption>Panorama view of our four amphitheatrically placed reports</figcaption></figure><p>Keep in mind that the above image is a projection of the VR experience onto 2D media. When experienced on the device, vergence masks the interior of the room; the sharp edges of the TV screens and the room corners do not interfere with the actual chart. Switching attention from one report to another is accompanied by head and eye movement. When you look at the leftmost dashboard, the rightmost hides behind you. Only when you take a few steps back do you capture the whole picture at one sight.</p><h4>Insights from Trends</h4><p>We have laid bar charts on a table. On one side of the table we have products; on the other we have mapped time. The amount of visible data is huge — 324 points. On a 2D display you can gaze at one bar, and only the visual acuity distribution will limit the visibility of faraway bars.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/750/1*EAm-4ahcJ1JTCwurmP_K_Q.png" /><figcaption>VR DataViz — 3D bar chart footage as presented on a 2D display</figcaption></figure><p>When you stand in front of that table in VR and gaze at a product, the vision is now affected by that horopter plane. 
It enables you to gaze at the bars for one product while your brain filters out the nearer and farther products.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/750/1*Paq6_zFc9fJ6mtlscc9Eig.gif" /><figcaption>Perceived VR view — gazing at the bars of a product places them on the horopter plane, blurring the rest</figcaption></figure><p>The bars have clearly visible square caps; that familiarity allows us to detect changes in size and contributes to depth perception. They are also placed on a grid at the bottom, with the bar walls parallel to the room walls. This makes it easier to follow the array of bars for a product.</p><h4>Graphs</h4><p>The third dimension allows for richer graph layouts. Naturally occurring graphs, like those from social networks, cannot be presented on 2D screens without their edges overlapping. In 3D the layout constraints are much more relaxed, as lines can easily pass one beyond the other. In combination with head tracking, head movement triggers depth cues from vergence, motion parallax and the kinetic depth effect.</p><p>The nodes of the graph can display rich data emphasized by node size and color. However, displaying quantitative data as size can break the depth cues from relative/familiar size. Thus, our choice was to use qualitative values for size, snapping node sizes to common values. 
In the Twitter Graph, all users who only retweeted content are displayed as a common fixed-size circle, triggering relative/familiar size depth cues for the mass of users, while only the most connected users are displayed at far greater sizes based on the content they produced.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/750/1*v87FM7rM4WCZHlk6kDydPw.png" /><figcaption>Footage of the VR Twitter Graph as it appears on a 2D display</figcaption></figure><figure><img alt="" src="https://cdn-images-1.medium.com/max/696/1*jo4f0Aw9VXFLUoDIIy2Gsg.gif" /><figcaption>How the graph in VR is perceived when focusing nodes at different depths</figcaption></figure><h4>Bubble Charts</h4><p>Bubble charts share common traits with graphs. However, the represented data points are not interconnected; instead of topology they communicate data values.</p><p>This kind of data exists in 2D in the form of 2D bubble charts. In these charts the position of each bubble encodes two of its properties, and another two properties can be encoded in size and color.</p><p>The 3D version of a bubble chart has three axes to plot onto. Another property can be represented by color. But what about size? If all data points share the same size, this triggers depth cues from familiar size, boosting the readability of the three properties encoded in position. So this is a tradeoff you can take.</p><p>The depth cues from shadows/shading are lost because the large number of data points makes it hard to relate a bubble to its own shadow.
There is one quality of VR bubble charts that is very important, though: they can easily visualize grouped clouds based on three properties, not just two as in 2D.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/747/1*IYP-rffaMKpBUpiSrWRS8A.png" /></figure><p>For our bubble chart the axes are drawn parallel to the room corners and the desk to enable linear perspective depth cues, and we decided to keep uniform sizes for the data points.</p><h4>Point Clouds</h4><p>There is a special case of bubble charts: point clouds. These usually present points of the same size but in huge quantities. A “point” in general is not supposed to have volume in 3D or area in 2D, so think of it as a single pixel in space. Points cannot generate depth cues based on familiar size because they are so small.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/300/1*1inqstrPPiXQ7jy_X9K89A.gif" /></figure><p>To compensate, point clouds are displayed either in a context where it is very easy to rotate the cloud, boosting the depth cues obtained from parallax, or as sensor readings of a well-known object, where the cloud itself forms a body we are familiar with, triggering relative/familiar size cues for the whole group or for individual clusters within the cloud.</p><h3>AR VR DataViz Gone Bad</h3><p>We’ve seen some of the sources of depth perception underutilized in AR VR DataViz. The following examples have been taken out of their context, so I would like to apologize in advance and pay respect to their authors before I use their images to illustrate what I think could be done better in VR.</p><h4>Bubble Charts</h4><blockquote>Real world: The biggest screen where you can analyze the 3D data.</blockquote><p>The whole world really does provide huge real estate for presenting data, and these kinds of visualizations are better placed in an environment. The beach, however, is a non-uniform structure where linear perspective is totally lost.
The following chart would be far better suited to an office room, with axes aligned to the room’s walls, corners and windows.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*pumwK2Ou0Hzi3gtCX8rlqA.png" /></figure><h4>Globe Chart</h4><p>Globes in AR and VR look wonderful. Placing 3D bar charts on them has caveats.</p><p>When you rotate the globe to position a country facing you, the bars occlude themselves: the top of a bar hides its height, and depth perception is not good enough to rely on for communicating values. To compensate, colors can encode the value, but then the whole thing becomes a heatmap.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/584/1*6tfMg1-tNS8DDDfpokf8dw.png" /></figure><p>Data visualization must not lie, and that includes AR VR DataViz. The above presentation has yet another issue. It looks like a heat map, but bars at the sides are seen in their entirety while bars in focus are represented only by their caps. This may make small bars at the sides appear larger in value than taller bars at the center.</p><p><strong>3D for the Sake of 3D</strong></p><p>Simply because you have a third dimension in AR VR doesn’t mean you must use it. If you have two-dimensional data you may be better off presenting it on a 2D plane, or with 3D geometry objects still placed in a simple array. It is easy to cause occlusion or form false depth perception based on relative/familiar size.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/803/1*p8IgIK0jaEQeqJBt35b2eg.png" /></figure><p>The top ring of the above presentation looks fine. August is occluded a little by June at the far left, but enough of the August bar is visible that you can read its height without moving. The ring may represent a cyclic, yearly recurring pattern, so the layout of the array makes sense.</p><p>The lower ring, however, represents countries; the layout doesn’t represent anything cyclic.
The bars at the far left and right occlude each other, so the values there are not readable; you can obtain trend information from the difference in cap heights, but not the overall value of the bars.</p><p>The bottom caps look the same as the top caps, and since the bars are semitransparent, the bottom caps limit readability at the far left and right sides of the circle. It may have been better for the bottom ring to be a 2D map placed on the ground.</p><h3>You Cannot Experience an AR VR DataViz on a 2D Display</h3><p>Whether it is a 2D image or a video captured from an AR VR experience, projecting it on your monitor limits your ability to perceive depth. All of the 3D objects are flattened onto a single plane, your horopter plane, making everything perfectly clear; you can no longer focus on clusters like you could in AR and VR, and depth cues from vergence are gone.</p><p>You lose the distance cues from intended action because someone else is taking the actions in the video. And because you are not immersed in the AR and VR experience, fear has little play. Presenting VR on a 2D screen ends up relying on shadow, texturing, linear perspective, relative/familiar size and height relative to the horizon.</p><p>Occlusion happens, but you cannot interact with it; together with motion parallax and kinetic depth effects, it must be explicitly exaggerated when recording a 3D video.</p><p>Here are a few secrets we used to capture the Twitter Graph video on our landing page.
View the following video and notice the motion blur during teleportation and the parallax effect:</p><iframe src="https://cdn.embedly.com/widgets/media.html?src=https%3A%2F%2Fwww.youtube.com%2Fembed%2FNqJddDFhDjs%3Fstart%3D130%26feature%3Doembed%26start%3D130&amp;url=http%3A%2F%2Fwww.youtube.com%2Fwatch%3Fv%3DNqJddDFhDjs&amp;image=https%3A%2F%2Fi.ytimg.com%2Fvi%2FNqJddDFhDjs%2Fhqdefault.jpg&amp;key=a19fcc184b9711e1b4764040d3dc5c07&amp;type=text%2Fhtml&amp;schema=youtube" width="854" height="480" frameborder="0" scrolling="no"><a href="https://medium.com/media/3df51ff2ef11cc4c13679dc09a30eb89/href">https://medium.com/media/3df51ff2ef11cc4c13679dc09a30eb89/href</a></iframe><p>During the recording we used a homemade third-person VR camera to capture 2D video, so the hardware allowed us to enable motion blur and trigger kinetic depth effects; this happens very clearly near 2:15, during teleportation. The motion blur and the animated transition also compensate for the lost intended action cues. If these were actually applied in VR, they would cause motion sickness.</p><p>Then, shortly after 2:15, excessive sideways movement strengthens the parallax effect on the graph. This compensates for the depth cues from vergence that are lost due to the lack of a stereoscopic 3D display.</p><h3>Thank You!</h3><p>Thank you for reading this far.
I would be happy to hear your thoughts on the matter; drop a line in the comments below.</p><p>Also, if you are interested in what we are doing, you can find out more, contact us or request a demo on our page:</p><p><a href="https://www.telerik.com/ar-vr-lab">Explore and Analyze Your Data in Stereoscopic 3D with AR and VR - Progress Telerik</a></p><p>For future blogs, follow our publication “<a href="https://medium.com/telerik-ar-vr">Telerik AR VR” here on Medium</a>.</p><p>If you are into short updates, follow us on Twitter: <a href="https://twitter.com/atanasovg">Georgi Atanasov</a>, <a href="https://twitter.com/deyan_yosifov">Deyan Yosifov</a>, <a href="https://twitter.com/ZapFanatic">Hristo Zaprianov</a> and me, <a href="https://twitter.com/PanayotCankov">Panayot Cankov</a>.</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=b1cf906d6831" width="1" height="1" alt=""><hr><p><a href="https://medium.com/telerik-ar-vr/perception-of-stereoscopic-dataviz-in-ar-and-vr-b1cf906d6831">Perception of Stereoscopic DataViz in AR and VR</a> was originally published in <a href="https://medium.com/telerik-ar-vr">Telerik AR VR</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[VR Labs’ Thoughts and Takeaways From May’s Conferences]]></title>
            <link>https://medium.com/telerik-ar-vr/vr-labs-thoughts-and-takeaways-from-may-s-conferences-6f721f97bf63?source=rss----cb8d50f0f390---4</link>
            <guid isPermaLink="false">https://medium.com/p/6f721f97bf63</guid>
            <category><![CDATA[virtual-reality]]></category>
            <category><![CDATA[conference]]></category>
            <category><![CDATA[augmented-reality]]></category>
            <category><![CDATA[unity]]></category>
            <dc:creator><![CDATA[Deyan Yosifov]]></dc:creator>
            <pubDate>Wed, 22 May 2019 15:13:08 GMT</pubDate>
            <atom:updated>2019-05-22T15:13:08.794Z</atom:updated>
<content:encoded><![CDATA[<p><em>At the beginning of this month, our VR team participated in two conferences — ProgressNEXT in Orlando, FL and Microsoft Build in Seattle, WA. As these conferences took place at the same time in two diagonally opposite locations in the USA, we had to split the team in order to spread the VR enthusiasm at both events. In this post, I will summarize the experience of my teammates and me, sharing thoughts on how our VR demos and concepts were received by the audience.</em></p><h3>ProgressNEXT 2019</h3><p>This event for modern application development took place May 6–9 in Orlando. The audience of the conference consists of people who are interested in Progress Software’s existing products and want to learn more about the future plans of the company. As part of Progress Telerik, our main target was to show the attendees of the event that we are actively working on enabling VR for different business use cases, solving problems better than they are currently solved on 2D displays. We presented our VR applications and concepts using three separate approaches: booth presence, a live demo on the CTO Keynote’s stage and a separate VR session with a DataViz-related presentation. Let’s see how these approaches worked from my point of view.</p><h4>Booth presence</h4><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*CgP-NDeRNcChoGhlBsi9dw.png" /><figcaption>Testing the VR demos at Progress Telerik’s booth</figcaption></figure><p>At the Progress Telerik VR booth we prepared three sample demos, shown on Oculus Go and Oculus Rift devices. The first demo is a virtual conference room that can be joined by several people, where they can discuss a sales dashboard data visualization. The second demo is a Twitter graph application that visualizes Twitter communication in a way that has no readable alternatives in 2D.
These two applications can be seen in more detail and downloaded for free from <a href="https://www.telerik.com/ar-vr-lab">our web site</a>. The third one is a healthcare application allowing remote collaboration between a doctor and a patient in a virtual medical room. The application helps the doctor analyze the data from the patient’s medical tests and explain the results and the needed surgery procedures with a descriptive and immersive 3D visualization of a sample human body.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*mQ1bHyGFy6rJAnaOh1sSuw.png" /><figcaption>Screenshots from our VR demos</figcaption></figure><p>As expected, there was excitement about the technology and our demos from the people who visited the booth. Those who were skeptical about the technology were rather the exception. It was exciting to see people who showed real interest in our project and were happy to take our contacts to be able to reach out after the conference.</p><h4>CTO Keynote live demo</h4><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*ZEm4Eoe6O3TE93Ey1TC5CA.jpeg" /><figcaption><a href="https://medium.com/@yosifov.deyan">Deyan </a>as a doctor and <a href="https://medium.com/@atanasovg">Georgi </a>as a patient on the CTO Keynote’s stage</figcaption></figure><p>About a month before the conference we learned that the CTO Keynote would be focused on AR/VR applications in healthcare, so we suggested creating a demo that could be shown on the big stage. Using the knowledge and reusable components from our existing applications, our team managed to fit into the short time span and build a healthcare application that shows how collaboration between a doctor and a patient can be achieved in VR. And voilà — one of the most exciting moments from ProgressNEXT for me can be seen in the photo above! Georgi and I appeared on the stage in front of the whole audience of the conference, taking the roles of a doctor and a patient.
In my opinion, this presence on stage, even for a few minutes, was the best exposure of our work, reaching the largest number of attendees, who are now aware of the existence of <a href="https://www.telerik.com/ar-vr-lab">Progress Telerik’s VR labs project</a>.</p><h4>VR DataViz session</h4><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*-RVPmNgfYXbNmlJLoheaBA.png" /><figcaption><a href="https://medium.com/@atanasovg">Georgi </a>presenting our Sales Dashboard demo at the end of the session</figcaption></figure><p>Our separate VR session was based on a presentation titled “From 2D to Stereoscopic 3D Data Visualization in Virtual Reality”. We had about 45 minutes to present our VR DataViz explorations, explaining the following points:</p><ul><li>What is DataViz in general, and why is it important for businesses?</li><li>How does a stereoscopic display work, how does it differ from standard 2D display visualization, and how is it similar to the way we see the actual world around us?</li><li>Why is 3D DataViz better than the 2D alternative for real 3D visualizations on a stereoscopic display?</li></ul><p>At the end of the session, we showed our Sales Dashboard and our Twitter Graph demos, emphasizing the strengths of 3D DataViz in VR. After the demos, there were several questions from the audience showing that people were interested in the technology and wanted to know how they could apply it to their existing businesses. Some of the attendees came to us after the session to discuss concrete ideas and scenarios that they see as valuable for their business.</p><h4>ProgressNEXT summary</h4><p>We did our best to present our ideas and show our working projects, trying to reach the largest possible share of the audience at this conference. We were happy to have these chances to meet people from different business verticals and discuss with them what value VR technology can bring to their businesses.
There were conference attendees who came to us asking for more details on concrete scenarios. They took our contacts, and chances are that some of them will reach out to us when the time comes for them to implement a VR solution.</p><h3>Microsoft Build 2019</h3><p>This is one of the largest conferences based on Microsoft technologies, with over 6000 attendees. It took place May 6–8 in Seattle. Our team’s targets at this conference can be organized into three main groups: booth presence with VR demos, building connections with potential partners, and researching what others are working on in the AR/VR area. Let’s see how this event went from our point of view, based on these three groups.</p><h4>Booth presence</h4><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*QlNxfeX1We2LUH2PXx-Yog.jpeg" /><figcaption>Testing our demos with a Windows Mixed Reality device — Samsung Odyssey</figcaption></figure><p>Progress’ booth was big enough to welcome a good number of conference attendees passing by. For VR we were showing the same three demos already mentioned for ProgressNEXT, with one small difference — the demos were shown not only with Oculus devices but with a Windows Mixed Reality VR device as well.</p><blockquote>Because it is Microsoft’s conference and their VR platform is Windows Mixed Reality, we decided to port our demos to WMR devices. Being able to achieve this task in less than a week proved that our decision to build components on top of Unity was the right one — Unity makes it really easy to target different platforms and devices.</blockquote><p>On the VR front, there was a lot of interest and enthusiasm about the demos. Most of the visitors were just experiencing the technology; some of them even confused the Samsung Odyssey device with a HoloLens 2 device. Developers who had some experience were happy with the demos and took our contacts.
Some business people were present too; they were enthusiastic about the sales dashboard demo and we gave them our fliers.</p><h4>Connections building</h4><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*k5fQ2Fs1Kp3fIlS66EuAPQ.jpeg" /><figcaption><a href="https://medium.com/@panayotcankov">Panayot </a>(right-hand side) testing the HoloLens 2 device</figcaption></figure><p>Besides connecting with people at our booth, we also explored what others are doing in the VR area that may be helpful for our projects as well. We met people working on Microsoft Maps for Mixed Reality, which is based on Bing Maps. They had just released an SDK and wanted to get us on it. This corresponds to our current plans for exploring the geospatial data visualization scenario in VR, and we see this connection as a good opportunity to build a strong use case together.</p><p>We also tested the HoloLens 2 device. It is a very well put together device. The balance is perfect, just as advertised; the straps are very thin and a bit stretchy, which holds the device perfectly on the head; and the holo rings are gone. The display has a slightly larger FoV, though the colors are a bit off. But the biggest improvement in v2 is the hand tracking. The controls come naturally, which is a big leap compared to v1. Overall, there were just a few HoloLens 2 devices at the conference, and several exhibitors still used HoloLens 1.
The main site has featured demos; one of the business cases is <a href="https://www.exploresharepointspaces.com/">SharePoint spaces</a>, showing <a href="https://www.exploresharepointspaces.com/3D-graph">this kind of experience</a>. They are exploring the option of building a “<a href="https://medium.com/@babylonjs/babylon-native-821f1694fffc">Babylon.js native</a>”, which is like “React Native” for 3D. Allowing JavaScript developers to quickly and cheaply prototype decent VR experiences may disrupt the Unity ecosystem. If played well, this may democratize VR for the JavaScript developer ecosystem as well.</p><h4>MS Build summary</h4><p>Honestly, we expected more AR focus from Microsoft; however, most of the sessions were focused on Azure and other Microsoft projects which are currently unrelated to the Windows Mixed Reality topic. We managed to share contacts with some attendees who were interested in our work, and probably some of them will reach out to us when they start looking for an implementation in their businesses. We are also happy about making some contacts that may result in future partnership opportunities on mutual projects.</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=6f721f97bf63" width="1" height="1" alt=""><hr><p><a href="https://medium.com/telerik-ar-vr/vr-labs-thoughts-and-takeaways-from-may-s-conferences-6f721f97bf63">VR Labs’ Thoughts and Takeaways From May’s Conferences</a> was originally published in <a href="https://medium.com/telerik-ar-vr">Telerik AR VR</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Motion Recording for Oculus Avatars & Game Objects in Unity With VR Labs’ Free MotionTool]]></title>
            <link>https://medium.com/telerik-ar-vr/motion-recording-for-oculus-avatars-game-objects-in-unity-with-vr-labs-free-motiontool-1299aee5223f?source=rss----cb8d50f0f390---4</link>
            <guid isPermaLink="false">https://medium.com/p/1299aee5223f</guid>
            <category><![CDATA[unity]]></category>
            <category><![CDATA[oculus]]></category>
            <category><![CDATA[virtual-reality]]></category>
            <category><![CDATA[free]]></category>
            <category><![CDATA[360-video]]></category>
            <dc:creator><![CDATA[Hristo Zaprianov]]></dc:creator>
            <pubDate>Thu, 16 May 2019 14:59:05 GMT</pubDate>
            <atom:updated>2019-05-16T14:59:05.674Z</atom:updated>
<content:encoded><![CDATA[<figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*cl9-MMPWUv5Alw1TCz3j0g.jpeg" /><figcaption>The sample scene that comes pre-built with the package</figcaption></figure><p><em>This article is the sequel to the </em><a href="https://medium.com/telerik-ar-vr/capturing-360-video-from-a-vr-dataviz-application-trials-and-errors-8e70bab0bd3"><em>article written by fellow VR dev Deyan Yosifov</em></a><em> about his endeavors trying to record a high-quality (stereo) 360° video. This time, I will go into more detail about the tool used for recording the motions of the Oculus avatars and some “common” objects like the chart and input pointer, prior to recording the video. MotionTool was written out of necessity, since there were no ready-to-use tools, plug-ins or assets in Unity’s Asset Store or elsewhere (and still aren&#39;t today) that get the job done. So if you want to record the movement of Oculus avatars or plain Unity objects, or are curious how this can be accomplished, then </em><a href="https://www.telerik.com/download-trial-file/v2-b/arvr-motion-tool?utm_source=motion-tool-blog"><em>download our free MotionTool</em></a><em> and give it a try.</em></p><h3>Recording Avatars, but why?</h3><p>One possible reason, as mentioned above, is the need to record a stereo 360<em>° </em>video containing Oculus avatars among other things. But being able to record and replay the actions of avatars opens up a whole new world of possibilities.
You can:</p><ul><li>have a prerecorded avatar that greets users and guides them through your experience</li><li>simulate different scenarios in environments like classrooms, workplaces, public institutions, banks, hospitals and so on</li><li>use avatars to teach or demonstrate certain actions inside your application</li><li>prerecord avatars and replay them as reactions to your user’s actions</li><li>use recorded avatars to aid your development and test your application</li><li>use avatars as NPCs</li></ul><p>There are many more ways to make use of prerecorded avatars, and in the end it comes down to your own specific use case. But all those cases share the same prerequisite, namely the ability to record and replay avatars. This is where our tool comes into play, saving you time and pain.</p><h3>But why a tool?</h3><p>Why would you bother using a tool in the first place? If you are reading this article, then you probably know the answer already — Oculus does not have a tool of its own that does this, nor did we find any 3rd-party tool that does. The other, more important reason is that it’s a bit of a complicated process that lacks transparency. Oculus has created a mechanism to transfer avatar pose information over the network, as the core idea of the avatar construct is to give users an appearance, which improves the social experience and interaction between them. But the way this is accomplished is by writing binary “packets” at fixed time intervals (most commonly 30 times a second) and sending them over the network, where they are read by the other participants or clients, “decoded” and applied to the avatar that represents the person who sent them. The process of encoding and decoding the packets and applying them to an avatar is completely opaque, so developers have to use those packets themselves, whether they like it or not.
This might be a bit too “low-level” for many developers, especially when they have other tasks at hand and their time is precious. Based on those packets and a couple of examples from the Oculus SDK, we created our tool, which can store all the avatar pose information for a given period of time in a separate file. It can later apply the stored poses to one or more avatars, thus creating something like “canned” avatar animations.</p><h3>How to use it?</h3><p>The tool comes ready with examples, and there is a series of tutorial videos which I encourage you to watch before using the tool for the first time:</p><ol><li><a href="https://www.youtube.com/watch?v=ppDjy4bTy2M">Baseline tutorial</a></li><li><a href="https://www.youtube.com/watch?v=_HP-smcbnXQ">Working with Avatars</a></li></ol><p>It contains different Unity components that you can attach to GameObjects in your scenes, assigning them the objects that you want to record or play back on. The tool stores the data in its own .asset file formats. It also has a couple of synchronization components in case you want to record or play back to/from multiple data files at once. If your use case is similar to ours, i.e. you want to play back the motions over the same avatars and objects that you recorded from, then you can use our special “Record Director” editor utility. It can automate the setup process entirely; you just need to tell it which avatars and objects you want to record and it takes care of everything.</p><p>At the time of writing this article, you can use the MotionTool for recording only in the Unity Editor. You can’t record in a built Player, no matter the target platform.
You can, though, play back recorded data in a Player.</p><h3>Share your feedback</h3><p>Share your experience with the tool; send us suggestions for useful features or improvements, telling us what you would like to see and what would be beneficial to you (for example, record-in-Player functionality); or report issues or bugs that you have stumbled upon. All feedback is greatly appreciated and will help us improve even further!</p><p>We are looking forward to getting in touch; you can reach out <a href="https://www.telerik.com/ar-vr-lab">here</a>!</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=1299aee5223f" width="1" height="1" alt=""><hr><p><a href="https://medium.com/telerik-ar-vr/motion-recording-for-oculus-avatars-game-objects-in-unity-with-vr-labs-free-motiontool-1299aee5223f">Motion Recording for Oculus Avatars &amp; Game Objects in Unity With VR Labs’ Free MotionTool</a> was originally published in <a href="https://medium.com/telerik-ar-vr">Telerik AR VR</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[VR is Coming at Microsoft Build and ProgressNEXT 2019]]></title>
            <link>https://medium.com/telerik-ar-vr/vr-is-coming-at-microsoft-build-and-progressnext-2019-a69b88da54ab?source=rss----cb8d50f0f390---4</link>
            <guid isPermaLink="false">https://medium.com/p/a69b88da54ab</guid>
            <category><![CDATA[microsoft]]></category>
            <category><![CDATA[data-visualization]]></category>
            <category><![CDATA[virtual-reality]]></category>
            <category><![CDATA[unity]]></category>
            <dc:creator><![CDATA[Georgi Atanasov]]></dc:creator>
            <pubDate>Tue, 30 Apr 2019 12:14:54 GMT</pubDate>
            <atom:updated>2019-04-30T12:14:54.304Z</atom:updated>
            <content:encoded><![CDATA[<p>We are just a few days away from two big conferences that our <a href="https://www.telerik.com/ar-vr-lab">VR Labs team</a> will be attending and we are super excited to have the chance to meet people and talk about Virtual Reality.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*TqWqgMxiu_LLnpFjSY5CRA.jpeg" /><figcaption><a href="https://www.microsoft.com/en-us/build">Microsoft Build</a> and <a href="https://www.progress.com/next">ProgressNEXT</a> 2019</figcaption></figure><p>VR-wise we are well equipped with three different applications and use cases that feature dynamic DataViz, multiplayer (networking), avatars, VoIP, controller integration and more:</p><ul><li><a href="https://www.telerik.com/download-trial-file/v2-b/arvr-sales-dashboard?utm_source=msbuild_progressnext_blog">Sales Dashboard</a></li><li><a href="https://www.telerik.com/download-trial-file/v2-b/arvr-twitter-graph?utm_source=msbuild_progressnext_blog">Twitter Graph</a></li><li>A fresh new demo, yet unofficial, related to healthcare:</li></ul><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*nHEK-L5-aEXbvmLhengerQ.jpeg" /><figcaption>Our new to-be-released demo, related to a use case in healthcare</figcaption></figure><p>We also have brand new beautiful T-shirts to give away to VR enthusiasts that play with the demos:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*5oFLQQj12AjCvEZlV7lPcw.jpeg" /><figcaption>VR Labs new branded T-shirt</figcaption></figure><p>So, if you are looking to enable VR for your business we will be happy to chat with you and find a way to help.</p><h3>Microsoft Build</h3><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*qhPd4us4Ns7nQcVI10v3BA.jpeg" /><figcaption><a href="https://www.microsoft.com/en-us/build">Microsoft Build</a></figcaption></figure><p><a href="https://mybuild.techcommunity.microsoft.com/sponsor/9691">Progress</a> is a silver sponsor of 
MS Build, and we will be exhibiting our entire set of product families at booth 300. Our VR demos also have their place at the booth, and if you drop by, you will be able to experience the applications and see how this technology can add value in business use cases. We have a Windows Mixed Reality Samsung Odyssey+ headset as well as stand-alone Oculus GO devices.</p><h3>ProgressNEXT</h3><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*7wPmzm1XrhWYMeysylWu8w.jpeg" /><figcaption><a href="https://www.progress.com/next">ProgressNEXT</a></figcaption></figure><p>This is the largest global gathering of the Progress customer, partner and developer community, where people come together to connect, learn and collaborate. Plan your own agenda with 150+ sessions, trainings and workshops.</p><p>The top three reasons to attend are:</p><ol><li>See how new technologies can work for you and help your business save time and money</li><li>Meet some of the best technology strategists and bright minds across diverse industries</li><li>The Expo Hall — see the technologies of today and tomorrow in action</li></ol><p><a href="https://www.telerik.com">Progress Telerik</a> will have a dedicated booth in the expo hall, and our VR demos will have a dedicated section there.</p><p><a href="https://medium.com/u/e47fdddbfaaf">Deyan Yosifov</a> and I will also have a dedicated session on <a href="https://www.progress.com/next/sessions/from-2d-to-stereoscopic-3d-data-visualization-in-vr">Stereoscopic 3D DataViz</a> that we are positive you will find interesting:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*6x9kJ393L8-Dg6MbsANljg.jpeg" /><figcaption>Stereoscopic 3D DataViz session at ProgressNEXT</figcaption></figure><h3>Already Excited?</h3><p>I can’t wait to meet you in person and talk all things VR.
See you there!</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=a69b88da54ab" width="1" height="1" alt=""><hr><p><a href="https://medium.com/telerik-ar-vr/vr-is-coming-at-microsoft-build-and-progressnext-2019-a69b88da54ab">VR is Coming at Microsoft Build and ProgressNEXT 2019</a> was originally published in <a href="https://medium.com/telerik-ar-vr">Telerik AR VR</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[How to Present Your AR and VR Application]]></title>
            <link>https://medium.com/telerik-ar-vr/how-to-present-your-ar-and-vr-application-87260a8b0646?source=rss----cb8d50f0f390---4</link>
            <guid isPermaLink="false">https://medium.com/p/87260a8b0646</guid>
            <category><![CDATA[presenting]]></category>
            <category><![CDATA[vr]]></category>
            <category><![CDATA[ar]]></category>
            <category><![CDATA[data-visualization]]></category>
            <category><![CDATA[presentations]]></category>
            <dc:creator><![CDATA[Panayot Cankov]]></dc:creator>
            <pubDate>Fri, 29 Mar 2019 15:36:44 GMT</pubDate>
            <atom:updated>2019-03-29T15:36:44.675Z</atom:updated>
            <content:encoded><![CDATA[<p>You are a smart person investing your time in climbing the VR learning curve. You’ve obtained some skills or have peers who help with development. You have a few VR prototypes. What is next? You create a demo that targets a very specific business problem. You solve it better in VR than you could with traditional 2D devices. You go to businesses and impress them. You prove that you are capable of developing a VR solution. If you have the right solution for the right problem and pitch it right, you will eventually get some funding — either internally, for your company to drive innovation, or externally, to create a VR product for others. Then you grow.</p><p>One of the most important phases of project management is validation. Ideally you will not invest time in development before you validate the idea for your product. VR is old (the View-Master stereoscope was patented in 1939), but still very few people have experienced modern VR technology.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/623/1*XAU7wC5vWC8GQks5RwnPqA.png" /></figure><p>Here is one of the unique traits of this technology. If you are creating a web-based solution, a sketch prototype is enough to express your idea and talk to customers. That is because they are preconditioned by having seen thousands of such projects, and little is left to the imagination.</p><p>With VR, on the other hand, customers have not experienced the technology; they are uncertain of the benefits and uncertain of the technical challenges. A low-fidelity prototype, presented in the form of video or images, neither delivers the necessary credibility nor proves the technical capabilities of your team.</p><p>So you end up making high-fidelity prototypes and presenting them on real devices.</p><h3>Presenting AR</h3><p><a href="https://www.telerik.com/ar-vr-lab">We’ve been doing AR and VR R&amp;D</a> for quite some time. 
We have learned a lot by presenting our AR and VR applications since the beginning of our journey.</p><h3>Microsoft Build 2018</h3><p>For Microsoft Build we created the HoloStock application for HoloLens and presented it as part of a social meetup and later at the Progress booth. The application was a high-fidelity demo with some data visualization controls. Since it was our first public appearance, we did several test runs inside the company and gathered and addressed all the feedback we could.</p><h4>Spectator View Marketing Videos</h4><p>For marketing materials, we implemented networking and set up the HoloLens spectator view. At its core, at that time this involved running a capturing application on a computer wired to a DSLR. The setup was too heavy for what we could place at the night club or at the booth, but we had to do it to capture videos to publish on our site, and to have something to show first-time users what to expect when running the app.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/624/1*JlpghBLpa0n_CcP3gBsCBw.png" /><figcaption>Hardware involved in running HoloLens 1 Spectator View</figcaption></figure><h4>A Night in the Club</h4><p>The day before Microsoft Build 2018 we gathered people from the IT industry in a gorgeous night club: to talk, to socialize, to present our demo. We planned to connect the out-of-the-box streaming of the two HoloLens devices to projectors on a big screen. While the devices and the app performed well, the HDMI cables between our laptops and the projectors failed. People were interacting with the app, and we had little idea what they were doing.</p><h4>At the Booth</h4><p>The app was well polished, and people quickly got the hang of using it. 
We continued without the streaming on the following days at the booth.</p><h4>Going Home</h4><p>We headed back home with some key takeaways:</p><ul><li>AR headsets obstruct communication</li><li>First-time users may feel awkward facing the unknown</li><li>You must see what your users are doing; sometimes they need guidance</li><li>The technology generates a lot of excitement</li><li>You cannot teach people how to use headsets with videos</li></ul><h3>ProgressNEXT 2018</h3><p><a href="https://www.progress.com/next">ProgressNEXT</a> is a US conference organized by <a href="https://www.progress.com/">Progress</a>. It is a place to get inspired, get connected, get creative. It is the largest global gathering of the Progress customer, partner and developer community.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/623/1*n4bx5WzJ3EjFeD7GpAa45Q.png" /><figcaption>Demonstrating our HoloLens demo at ProgressNEXT 2018</figcaption></figure><p>We had a booth there and demonstrated our demo in a similar way to Microsoft Build.</p><h3>Presenting VR</h3><p>Our focus has always been on delivering a <a href="https://medium.com/telerik-ar-vr/data-visualization-taken-to-the-next-dimension-through-vr-6aecca2d662">data visualization suite</a>. VR offers a much better arena for immersive data visualization compared to AR. Head-mounted AR focuses on the on-field workforce and assistance.</p><p>We developed the next set of demos for the <strong>Oculus Rift </strong>and <strong>Oculus GO </strong>VR devices.</p><h3>DevReach and ISTA Conference 2018</h3><p>These are two events that happened in the same week, and our team spoke at both of them.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*noiwo8IH-6uew4-EzMVoWw.png" /></figure><p><a href="https://devreach.com/">DevReach</a> is a conference organized by Progress in Sofia, gathering developers from all over the world for two days. 
This time we were not only doing demos at our booth, we were also presenting. <a href="https://www.istacon.org/">ISTA</a> is a QA-focused conference, also in Sofia.</p><h4>Presenting on a Big Screen</h4><p>The presentation starts with an intro to what AR and VR are, use cases, and the projected AR and VR economy growth, and it ends with a live demonstration of our applications.</p><p>We gave the presentation internally in the company a few times for training. The demos were presented using an Oculus Rift on a laptop, with the laptop monitor mirrored onto the big screen.</p><iframe src="https://cdn.embedly.com/widgets/media.html?src=https%3A%2F%2Fwww.youtube.com%2Fembed%2FqqyfNZehxws%3Fstart%3D1523%26feature%3Doembed%26start%3D1523&amp;url=http%3A%2F%2Fwww.youtube.com%2Fwatch%3Fv%3DqqyfNZehxws&amp;image=https%3A%2F%2Fi.ytimg.com%2Fvi%2FqqyfNZehxws%2Fhqdefault.jpg&amp;key=a19fcc184b9711e1b4764040d3dc5c07&amp;type=text%2Fhtml&amp;schema=youtube" width="854" height="480" frameborder="0" scrolling="no"><a href="https://medium.com/media/f63b2881b6a4f84058496f0ed7467a7f/href">https://medium.com/media/f63b2881b6a4f84058496f0ed7467a7f/href</a></iframe><iframe src="https://cdn.embedly.com/widgets/media.html?src=https%3A%2F%2Fwww.youtube.com%2Fembed%2FhtZWYDEZEXc%3Fstart%3D1761%26feature%3Doembed%26start%3D1761&amp;url=http%3A%2F%2Fwww.youtube.com%2Fwatch%3Fv%3DhtZWYDEZEXc&amp;image=https%3A%2F%2Fi.ytimg.com%2Fvi%2FhtZWYDEZEXc%2Fhqdefault.jpg&amp;key=a19fcc184b9711e1b4764040d3dc5c07&amp;type=text%2Fhtml&amp;schema=youtube" width="854" height="480" frameborder="0" scrolling="no"><a href="https://medium.com/media/bed62406d333aad7310f684c2c153486/href">https://medium.com/media/bed62406d333aad7310f684c2c153486/href</a></iframe><p>This kind of demonstration reaches a much broader audience. Even with smaller numbers, it is much easier to present a VR application to a group of peers or business clients like this instead of letting them take turns with the headset. 
However, this comes with challenges.</p><p>Giving a presentation of the application and then inviting people to try the demo at the booth works well.</p><h4>Presenting at a Booth</h4><p>Presenting at a booth is fun. It wins you new friends and Twitter followers. But whether you are in a small room demoing to your peers or your boss, or at a booth at an event, you need a high-fidelity demo app and you need people to experience it.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/623/1*fRNRmh52lwBggV56wrztcA.png" /></figure><p>Just like you can’t teach a person to ride a bike with a video, you can’t teach a person what a VR application is with a video.</p><p>This kind of demo, however, has drawbacks:</p><ul><li>The headset blocks you out of the user’s view</li><li>You cannot explain gestures</li><li>The headset obstructs the user’s hearing</li><li>You must watch out for symptoms of motion sickness</li><li>You must accommodate glasses</li><li>You must take care of hygiene</li></ul><h3>Spectator View</h3><p>At this point it was clear that to present your application you need two things:</p><ul><li>To be able to show it in front of a crowd on a TV-like screen</li><li>To be able to guide first-time VR users through your app</li></ul><p>While experienced users will work flawlessly with a well-done app, first-time users need some guidance. They are not used to the controllers or to the system gestures. It takes several minutes to go through the system welcome apps and get familiar with the controllers and gestures, but this moves the focus away from your app.</p><p>We had two options at that moment.</p><h4>Use System Streaming</h4><p>One option was to use mirroring or streaming when doing demos at booths and presentations. This is supported by the devices: HoloLens, Oculus Rift and Oculus GO all have capabilities to stream. However, during presentations subtle head movements severely shake the picture. 
You must present very carefully, trying to keep your head steady. That extra attention draws focus away from your talk. At booths, the standalone devices spend additional power to capture, encode and transmit the video. This generates additional heat and wears the batteries out faster. The quality of the image may also be reduced for the users.</p><h4>VR Spectator View</h4><p>The second option was to implement a spectator view for VR experiences.</p><p>The spectator view is rendered on a PC, so it required proper networking. Collaboration is one of the key pillars of AR and VR, so we were going to implement networking anyway.</p><p>It shows us what first-time users experience while in the app. It gives us precise control over the camera that the viewers see on the big 2D screen. It smooths movements like a steady-cam, so the stream looks good on a big screen, and it requires less attention from the presenter.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/624/1*fEzi2SMGP9xz3bUvBPn1HQ.png" /><figcaption>Steady-cam hardware</figcaption></figure><p>The stream must look as if shot with a steady-cam. However, in a virtual experience you don’t need all that hardware. 
Instead, software simulates the camera movement.</p><p>We decided to go for it!</p><h3>Iterations, Iterations</h3><p>After the DevReach and ISTA conferences, we continued giving presentations, in a similar format, to prospects that reached out to us.</p><p>Internally at the company, we have meeting rooms where we used the following setup: we connect a laptop to a TV, present the app using an Oculus Rift with the display mirrored to the TV, then use the stand-alone Oculus GO devices to let the prospects experience the app.</p><p>This setup, although for a much smaller area and group of people, uses the exact same hardware and configuration we used for the conference presentations and at the booths.</p><h4>Networking</h4><p>We iteratively addressed the feedback, and that led to building our multiplayer environment based on the new Unity networking system. The Oculus GO devices now connect to the laptop over Wi-Fi.</p><p>First, we integrated a way to share highlight tooltips and laser pointers, so users can share insights on the data visualization dashboard. Then we integrated the Oculus Avatars, so users can see each other’s virtual selves and exchange head and hand gestures.</p><h4>Building the VR Spectator View</h4><p>The multiplayer made the Oculus Rift optional. With the networking in place, the laptop could now participate in the experience without a VR device — it became our steady-cam, or spectator view.</p><p>The laptop renders the same 3D scene, but framed for 2D displays. Its camera uses an algorithm to follow one of the Oculus GO users. The camera linearly interpolates its position and rotation, so the movements happen in real time and follow the presenter closely, but are also smooth.</p><p>For Full-HD or 4K TVs, the picture rendered by the spectator view is in native resolution. It is not up-scaled from the headset resolution to the TV resolution. 
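</p><p>A follow camera like the one described above can be sketched in Unity roughly like this (a minimal illustration; the class and field names are hypothetical, not taken from our actual implementation):</p><pre>using UnityEngine;

// Hypothetical smoothed spectator camera following the presenter's head.
public class SpectatorCamera : MonoBehaviour
{
    public Transform presenterHead; // reassigned whenever the presenter changes
    public float smoothing = 4f;    // higher values follow more tightly

    void LateUpdate()
    {
        if (presenterHead == null) return;

        // Frame-rate independent interpolation factor.
        float t = 1f - Mathf.Exp(-smoothing * Time.deltaTime);

        transform.position = Vector3.Lerp(transform.position, presenterHead.position, t);

        // Lock the up direction to world up to avoid sideways tilt on the 2D stream.
        Quaternion target = Quaternion.LookRotation(presenterHead.forward, Vector3.up);
        transform.rotation = Quaternion.Slerp(transform.rotation, target, t);
    }
}</pre><p>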
Rendering at native resolution further improves the quality.</p><h4>Telerik Campus Presentation</h4><p>With the networking and spectator view in place, we had the chance to present at a local meetup before 30 people. Almost everything went smoothly: the HDMI cables to the projectors failed us again, but this time we had a replacement!</p><h4>Website Redesign</h4><p>At this point we were ready to redesign our website and include our new demos. You can check the site at:</p><blockquote><a href="https://www.telerik.com/ar-vr-lab">telerik.com/ar-vr-lab</a></blockquote><p>The presentation setup also allowed us to capture the promo videos of our two demo applications for our website. Smoothing the camera movements and locking its up direction, to prevent sideways tilt, gave the videos a professional look.</p><p>You can see the difference by comparing the following “shaky” video captured directly from the Oculus Rift:</p><iframe src="https://cdn.embedly.com/widgets/media.html?src=https%3A%2F%2Fwww.youtube.com%2Fembed%2FgAjeRUd05xI%3Fstart%3D56%26feature%3Doembed%26start%3D56&amp;url=http%3A%2F%2Fwww.youtube.com%2Fwatch%3Fv%3DgAjeRUd05xI&amp;image=https%3A%2F%2Fi.ytimg.com%2Fvi%2FgAjeRUd05xI%2Fhqdefault.jpg&amp;key=a19fcc184b9711e1b4764040d3dc5c07&amp;type=text%2Fhtml&amp;schema=youtube" width="640" height="480" frameborder="0" scrolling="no"><a href="https://medium.com/media/71bfcba011078764ec12a8a1fe12755c/href">https://medium.com/media/71bfcba011078764ec12a8a1fe12755c/href</a></iframe><iframe src="https://cdn.embedly.com/widgets/media.html?src=https%3A%2F%2Fwww.youtube.com%2Fembed%2FNqJddDFhDjs%3Fstart%3D97%26feature%3Doembed%26start%3D97&amp;url=http%3A%2F%2Fwww.youtube.com%2Fwatch%3Fv%3DNqJddDFhDjs&amp;image=https%3A%2F%2Fi.ytimg.com%2Fvi%2FNqJddDFhDjs%2Fhqdefault.jpg&amp;key=a19fcc184b9711e1b4764040d3dc5c07&amp;type=text%2Fhtml&amp;schema=youtube" width="854" height="480" frameborder="0" scrolling="no"><a 
href="https://medium.com/media/db7863f0454ed945a18a6b6c5a9f82a7/href">https://medium.com/media/db7863f0454ed945a18a6b6c5a9f82a7/href</a></iframe><p>Capturing 2D videos was not the only thing we had to do. To let people see the potential of data visualization for VR, we also created a 360° movie to put on YouTube, which is an easy distribution channel.</p><p><a href="https://medium.com/telerik-ar-vr/capturing-360-video-from-a-vr-dataviz-application-trials-and-errors-8e70bab0bd3">Read the whole story about generating 360 movies here</a>.</p><h3>Microsoft Build and Progress NEXT 2019</h3><p>These are two upcoming events, and we plan to present and show the demo applications to their full potential. There are a few things that we aim to achieve:</p><ul><li>Give a talk and present the demos with high quality on a big screen</li><li>Let the attendees try the apps in a shared experience</li><li>Capture videos and images for our blog and website</li></ul><p>And here is our growing checklist. We would like to share it with you; it may help you present your own VR creation.</p><h4>Capture Images and Videos</h4><ul><li>After the talk you will need videos and images for reference.</li><li>Be proactive and get consent from users to capture marketing photos.</li></ul><h4>Present on Big Screen</h4><ul><li>Practice in advance.</li><li>Ask for a dedicated presenter Wi-Fi network.</li><li>If possible, test your setup in the actual presentation room a day early.</li></ul><h4>Let People Experience the App</h4><ul><li>Ask for a dedicated Wi-Fi network if you depend on an internet connection.</li><li>Charge the devices before the event.</li><li>Prepare power bricks for mobile VR if you expect long working hours.</li><li>Prepare extra batteries for the controllers.</li><li>Test with a few first-time users in advance.</li><li>Share the experience on a TV or laptop.</li><li>Prepare business cards and flyers to give away.</li><li>Care for hygiene — take tissues and disposable 
masks.</li></ul><h3>Wrap-up</h3><p>Doing proper, quality presentations is key to promoting your AR and VR product idea. I hope you find this post useful, and we look forward to meeting you at Microsoft Build 2019 or ProgressNEXT 2019. If you have comments or feedback, you can share them here or contact me on Twitter:</p><p><a href="https://twitter.com/PanayotCankov">Panayot Cankov (@PanayotCankov) | Twitter</a></p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=87260a8b0646" width="1" height="1" alt=""><hr><p><a href="https://medium.com/telerik-ar-vr/how-to-present-your-ar-and-vr-application-87260a8b0646">How to Present Your AR and VR Application</a> was originally published in <a href="https://medium.com/telerik-ar-vr">Telerik AR VR</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Capturing 360° video from a VR DataViz application — trials and errors]]></title>
            <link>https://medium.com/telerik-ar-vr/capturing-360-video-from-a-vr-dataviz-application-trials-and-errors-8e70bab0bd3?source=rss----cb8d50f0f390---4</link>
            <guid isPermaLink="false">https://medium.com/p/8e70bab0bd3</guid>
            <category><![CDATA[data-visualization]]></category>
            <category><![CDATA[virtual-reality]]></category>
            <category><![CDATA[unity]]></category>
            <category><![CDATA[augmented-reality]]></category>
            <category><![CDATA[360-video]]></category>
            <dc:creator><![CDATA[Deyan Yosifov]]></dc:creator>
            <pubDate>Fri, 22 Mar 2019 14:01:00 GMT</pubDate>
            <atom:updated>2019-03-22T14:01:00.884Z</atom:updated>
            <content:encoded><![CDATA[<h3>Capturing 360° video from a VR DataViz application — trials and errors</h3><p><em>This post is inspired by my experience with building a 360</em>°<em> video for </em><a href="https://www.telerik.com/ar-vr-lab"><em>Progress Telerik’s AR-VR web page</em></a><em>. The video’s purpose was to show our “Sales Dashboard Data Visualization” demo from the eyes of several participants in a virtual room who are discussing the presented 3D charts. As this visualization displays a lot of text and graphics, it is important to ensure that the rendering quality is sharp enough so that the participants can easily read the presented information. Although our demo application is written in Unity and manages to provide quality rendering when running on Oculus VR devices, it turns out that with the existing capturing tools for Unity it is not an easy task to preserve this quality in a 360° video. To see the pros and cons of the different approaches I tried, continue reading this post.</em></p><iframe src="https://cdn.embedly.com/widgets/media.html?src=https%3A%2F%2Fwww.youtube.com%2Fembed%2F61voMOfoi0c%3Ffeature%3Doembed&amp;url=http%3A%2F%2Fwww.youtube.com%2Fwatch%3Fv%3D61voMOfoi0c&amp;image=https%3A%2F%2Fi.ytimg.com%2Fvi%2F61voMOfoi0c%2Fhqdefault.jpg&amp;key=a19fcc184b9711e1b4764040d3dc5c07&amp;type=text%2Fhtml&amp;schema=youtube" width="854" height="480" frameborder="0" scrolling="no"><a href="https://medium.com/media/42160cfd508a2929f4024db963468e1c/href">https://medium.com/media/42160cfd508a2929f4024db963468e1c/href</a></iframe><h3>Why 360° video?</h3><p>First of all, let’s see what the benefits of having a 360° video are and why we wanted to provide one together with our demo application source code.</p><ul><li>Providing the source code of the demo is not enough. 
Although we have uploaded it with detailed documentation on how to use and build it for different VR devices, it certainly takes time for newcomers to download it and get acquainted with the development workflow.</li><li>Providing already-built APK and EXE files is still not enough. Although this would make it easier to install the demo on an Oculus device, you will still have to follow some steps to put the device in developer mode in order to install and run the application.</li><li>Even if you manage to successfully install and run the application on a VR device, you will be the only participant in the virtual room. To see the full functionality of the demo, you would have to install the app on several devices and have a group of people who will join the virtual room together.</li><li>A 360° video allows us to overcome the drawbacks of the previously listed approaches and practically provides the fastest way for visitors of our web page to immerse themselves in the demo room and see what functionality is implemented in it.</li><li>A 360° video also provides access to a larger set of people who may want to see the demo. With it, you are not required to have a VR device. You can easily watch it on your smartphone, and even visitors using a desktop browser can drag the mouse over the browser player and look around the virtual room in any desired direction.</li></ul><h3>How to record 360° video?</h3><p>First, let’s say a few words about the scenario we wanted to record. It is a sample collaboration between several people in a virtual conference room. My colleagues <a href="https://medium.com/@panayotcankov">Panayot</a> and <a href="https://medium.com/@atanasovg">Georgi</a> and I ran the demo application on Oculus devices and joined the virtual room over a WiFi network. During this collaboration, there is always one of us in “presenter mode” who points out different aspects of a chart data visualization. 
At the same time, the others are in “viewer mode” and see in front of them whatever is currently presented. When any of the viewers requests control using the Oculus controller, they become the currently presenting participant. We wanted to record the video in such a way that the visitors of our website can easily look from the position of each of the presenters in the virtual room.</p><h4>Recorder and Replayer</h4><p>Most of the tools for capturing 360° video require quite a lot of time to capture every frame of the video. This practically means that the video cannot be captured in real time; instead, we need to somehow record all moving objects in the scene so that the movement can later be replayed frame by frame. We created two Unity scripts responsible for recording and, respectively, replaying the changes in our scene that occur during the collaboration. In this case we recorded the following object properties:</p><ul><li><strong>Chart transform</strong>. It changes every time the presenter moves or rotates the data visualization.</li><li><strong>Input transform</strong>. It changes every time the presenter moves their hand to show some aspect of the data.</li><li><strong>Avatars transform and avatar packets</strong>. These are the properties describing the current state of a participant’s avatar. Whenever someone moves their hand or head, new avatar packets are received, and we should record this data in order to replay it later.</li><li><strong>Camera transform</strong>. When we project our demo on a 2D screen, we use a spectator camera implementation which allows you to see from the eyes of the currently presenting user. On a 2D display, it is fine if the camera moves together with the presenter&#39;s head. However, if we capture such movement in a 360° video and then play it on a VR device, it is very likely that the VR user will get motion sickness from the intense camera movement. 
That is why, instead of moving the camera with the head, we made it static, at the same position as the currently presenting participant. Changes in camera positioning are performed only when the presenter changes.</li></ul><p>I will not go deeper into the implementation details of the recorder and replayer scripts in this post. We plan to publish a separate post on this matter in our <a href="https://medium.com/telerik-ar-vr">Telerik AR VR series</a>, so if you are interested in the implementation details, stay tuned for one of our next blog posts.</p><h3>Tools for recording 360° video in Unity</h3><p>So now that we have set up our scenario, we are ready to replay it and try the different tools for Unity to see which best suits our needs for recording a 360° video.</p><h4>The first approach — Unity built-in capturing API</h4><p>At first glance, the <a href="https://blogs.unity3d.com/2018/01/26/stereo-360-image-and-video-capture/">built-in capturing functionality</a> seems very tempting. It uses the Camera.RenderToCubemap method to project the space onto two cubes — one for the left eye and one for the right. Then the resulting RenderTextures can be converted to an equirectangular image, and a sequence of such images can easily be recorded by the Unity frame recorder into a 360° video.</p><p>However, while testing this approach with our scenario, we encountered the following issues:</p><ul><li>The graphics were a bit blurry even at higher capturing resolutions. This may not be an issue for a dynamic scene that does not require pixel-perfect rendering, but our chart data visualization contains a lot of text elements that become unreadable (even the ones close to the camera).</li><li>The RenderToCubemap method has a known limitation: it does not capture any UI elements. As our visualization shows some 2D graphics positioned in 3D space, these graphics turned out to be missing in the video. 
We managed to find a possible <a href="https://forum.unity.com/threads/support-for-ui-elements-in-rendertocubemap.429201/">workaround in the Unity forums</a>; however, it did not entirely solve the missing graphics issue in our scenario, and we needed to extend its implementation for some of the TextMesh Pro instances in our scene.</li></ul><p>As the built-in Unity capturing API did not work well in our case, we decided to try some of the paid plugins from the Asset Store.</p><h4>The second approach — VR Panorama 360 PRO Renderer</h4><p>This paid asset is one of the popular assets for rendering videos of a Unity scene. It generally creates an image for every frame of the video by making camera snapshots in several directions and then stitching these snapshots to create the 360° image. After all frames are generated, you can render a video file from the existing images. The video should then be injected with specific metadata that helps video players recognize it as 360° content.</p><p>The VR Panorama asset provides a large set of options in its VRCapture script related to video format and video quality. One important option is the Capture Type. Other quality-related options are the Sequence Format (JPG or PNG), the Resolution and a Speed vs Quality parameter which controls the anti-aliasing. Let’s see what our findings were after testing different combinations of these options.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/477/1*7AX-fI0ymNsR_y7OOmr-WQ.png" /></figure><p>As you can see in the picture above, there are two stereo options (the first with a top-bottom layout, the second with a side-by-side layout). As we target our 360° video at VR devices, we would like to benefit from the <a href="https://medium.com/telerik-ar-vr/head-mounted-ar-vr-for-human-realistic-3-d-data-visualization-40f570a8a363">stereoscopic displays for human realistic 3D visualization</a>. 
That is why our first try was to render the video with one of the stereo options.</p><p>As we wanted to upload the video to YouTube with the best possible quality, we selected the YouTube 8K resolution preset and set the Sequence Format to PNG (instead of the default JPG). At first glance, the resulting frame image looks really good, with no blurriness in the graphics.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*QSQ-u6ztXd3dGt2go14QQQ.jpeg" /><figcaption>A sample frame recorded by VR Panorama</figcaption></figure><p>However, on closer inspection we noticed that only the text and graphics in the center of the view are perfectly rendered. Looking left or right, one may notice that the text and the chart bars become doubled, and this is reproducible both in the left-eye image part and in the right-eye image part.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*47V_BOoxm-WVD_7iR_0PmQ.png" /><figcaption>Stitching issues — axes labels look duplicated</figcaption></figure><p>This result feels particularly unpleasant when viewing the video with a VR headset and is totally unacceptable for our use case, where the viewers should be able to easily read the labels in the chart. We tried several options, changing the resolution and the sequence format, but whenever the Capture Type is set to one of the stereo options, this double vision issue persists. Most probably it results from an error in the stitching algorithm that combines images from different view angles into a single 360° image.</p><p>With these issues in the stereoscopic capture type, our only option with the VR Panorama asset was to try monoscopic rendering. With this option we won’t be able to benefit from the sense of depth provided by the VR devices; however, people will still be able to enter our virtual room and look around from the presenters&#39; positions in all directions. 
So here is what our test with monoscopic rendering managed to produce for a single frame of the video, using the maximum quality setting both for resolution and anti-aliasing:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*PgvD83l6Xaf5_gJMMaat8g.png" /></figure><p>Generally, the graphics were captured really sharply, and all text was easily readable without any double vision effects. When testing the video on an Oculus VR device, it also looks sharp and of good quality. The only issues we had during the full video capture were related to the performance and memory consumption of the VR Panorama tool: it took more than 10 hours to capture a 4-minute monoscopic video and, what is worse, the capturing process often crashed in the middle. The calculations for generating every frame as a high-resolution PNG image are also very hungry for hard drive space: for a 4K monoscopic video, the images took about 100 GB of disk space. For 8K video, the needed resources are more than twice as big.</p><p>So, let’s sum up the pros and cons of the VR Panorama asset in our use case.</p><p>Pros:</p><ul><li>We managed to record an 8K monoscopic 360° video which looked fine on YouTube on desktop, on mobile and on VR headsets.</li><li>The tool is easy to use and has good documentation on the different provided options.</li></ul><p>Cons:</p><ul><li>Recording stereoscopic video has stitching issues which lead to unacceptable quality in our scenario.</li><li>Capturing the video takes a lot of time, as it requires rendering one big image for every frame.</li><li>Capturing the video requires a lot of disk space, as the frame images are kept on the hard drive before the video generation.</li><li>The capturing process often crashes the Unity editor for longer videos; in our case, the crash occurred when the captured video length was longer than 2 minutes.</li></ul><h4>The third approach — AVPro Movie 
Capture</h4><p>As we could not achieve our goal of recording a stereoscopic video so far, we decided to test another paid asset from the Unity Asset Store. The AVPro asset provides a free version with a 10-second recording limit, which was enough for us to try its capabilities. In general, this asset uses a similar approach to VR Panorama and generates high-resolution PNG images for each frame. Here is what we found when trying AVPro and comparing it to the VR Panorama asset.</p><p>Pros:</p><ul><li>AVPro managed to render sharp, high-quality images in stereoscopic mode.</li><li>AVPro frames did not have VR Panorama’s double-vision issue with text and graphics when looking away from the central view direction.</li></ul><p>Cons:</p><ul><li>When viewing the resulting 360° video with a VR device, the chart labels were blurry and hardly readable. This is probably related to an issue with correctly overlapping the left-eye and right-eye images, and the result was unacceptable for our use case.</li><li>Rendering the frames seemed even slower than with the VR Panorama asset.</li></ul><p>As we could not create a stereoscopic video with acceptable text quality, we decided not to test this asset further. We were already able to create a good-quality monoscopic video with the VR Panorama tool.</p><h4>The fourth and last approach — Facebook 360 Capture SDK</h4><p>As a last try at capturing stereoscopic video, we tested the <a href="https://github.com/facebook/360-Capture-SDK">FBCapture SDK</a>. It looked very promising in terms of speed; however, it turned out that the produced quality was not acceptable for us (even in the monoscopic scenario). It uses the same RenderToCubemap method as in our first approach, but provides an easy-to-use FBCapture prefab with a variety of options for controlling the output quality. 
Because of its ease of use and really performant rendering, I still believe it is worth mentioning the pros and cons we found while testing this SDK.</p><p>Pros:</p><ul><li>Real-time capturing. As it uses the RenderToCubemap method combined with shaders to create the video frames, it does not take hours to render the video, which is a big advantage over the previous two approaches.</li><li>Small disk consumption. The resulting MP4 files are also several times smaller than the ones produced with the VR Panorama and AVPro assets.</li><li>Easy to use — simply drag and drop the FBCapture prefab and use its hotkeys to start and stop encoding the video.</li></ul><p>Cons:</p><ul><li>The SDK restricts the maximum video capture size to 4k resolution. Both VR Panorama and AVPro allowed 8k resolution.</li><li>The captured graphics are not sharp enough, which makes the labels blurry and hardly readable. This is reproducible for both monoscopic and stereoscopic videos.</li><li>The use of the RenderToCubemap method comes with a limitation in capturing UI elements. As in the first approach, we had to apply <a href="https://forum.unity.com/threads/support-for-ui-elements-in-rendertocubemap.429201/">the workaround from the Unity forums</a>, with some additional implementation modifications, in order to make it work for our scenario.</li><li>We could not make the stereoscopic video work in any of the 360° video players. To capture stereoscopic video with this Facebook SDK, you should select the RGBD_Capture texture format, which creates a gray depth image alongside the original monoscopic 360° image. However, neither the YouTube nor the Facebook video player was able to recognize this video format, and we could not find any metadata injector that could make it recognizable. 
All metadata injectors seem to work with top-bottom or side-by-side image formats, and none of them provides an option for a depth channel.</li></ul><h3>Conclusion</h3><p>Although our demo application manages to show text and graphics with sharp edges on VR stereoscopic displays, we were not able to find a suitable 360° capturing tool that preserves this visualization quality in a stereoscopic video. That is why we ended up creating a monoscopic video, which loses the sense of depth but at least allows viewers to easily read the text and graphics presented in our demo application.</p><p>I hope this shared experience will be helpful and time-saving for someone who is about to start their own challenge of capturing a 360° video. Feedback is welcome — feel free to share your thoughts about our, or your own, experience with the available tools for Unity.</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=8e70bab0bd3" width="1" height="1" alt=""><hr><p><a href="https://medium.com/telerik-ar-vr/capturing-360-video-from-a-vr-dataviz-application-trials-and-errors-8e70bab0bd3">Capturing 360° video from a VR DataViz application — trials and errors</a> was originally published in <a href="https://medium.com/telerik-ar-vr">Telerik AR VR</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Data Visualization Taken to the Next Dimension through VR]]></title>
            <link>https://medium.com/telerik-ar-vr/data-visualization-taken-to-the-next-dimension-through-vr-6aecca2d662?source=rss----cb8d50f0f390---4</link>
            <guid isPermaLink="false">https://medium.com/p/6aecca2d662</guid>
            <category><![CDATA[unity]]></category>
            <category><![CDATA[data-visualization]]></category>
            <category><![CDATA[dataviz]]></category>
            <category><![CDATA[augmented-reality]]></category>
            <category><![CDATA[virtual-reality]]></category>
            <dc:creator><![CDATA[Georgi Atanasov]]></dc:creator>
            <pubDate>Tue, 12 Mar 2019 14:59:47 GMT</pubDate>
            <atom:updated>2019-04-24T07:05:09.628Z</atom:updated>
<content:encoded><![CDATA[<figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*tk-5uG2WV083x8vWbReJvA.jpeg" /><figcaption>Screens from our two DataViz applications available for download</figcaption></figure><h3>Introduction</h3><p>At <a href="https://www.telerik.com/ar-vr-lab?utm_source=new_page_blog">Progress Telerik</a> we’ve been on the AR VR journey for more than a year now. The path has always been challenging and rewarding, and it is now even more exciting. We learned a lot, made some mistakes and even shifted our initial strategy. In this post we will look retrospectively at our findings and talk about the new path that we’ve taken.</p><blockquote>This post assumes that when we talk about AR VR, we mean Head-Mounted Devices. Smartphone AR VR is out of our scope today.</blockquote><h3>Web Presence</h3><p>Before we continue, we have an important and super exciting announcement to make: a <a href="https://www.telerik.com/ar-vr-lab?utm_source=new_page_blog">completely new and redesigned page</a> that fully matches our current vision, strategy and short-term roadmap is now live on our company website!</p><p>We will keep this page updated with our latest product announcements and plans. This is also the place where you can reach out to us and/or subscribe to our news and updates.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*xFUMNPhQ2N85dg6MFJfkBQ.jpeg" /><figcaption>The <a href="https://www.telerik.com/ar-vr-lab?utm_source=new_page_blog">new page on telerik.com</a> is now live!</figcaption></figure><p>Besides some plain text that explains where we are heading and what problems we solve, the page also features delightful graphics and several exciting videos, including a 360-degree panoramic capture of one of our applications. 
We have even made our recent DataViz applications available for download, including complete Unity3D projects and source code!</p><p>In case you are already curious to see our new web face in action, please go ahead; you can continue reading the post afterwards :)</p><blockquote>Visit <a href="https://www.telerik.com/ar-vr-lab?utm_source=new_page_blog">https://www.telerik.com/ar-vr-lab</a></blockquote><h3>A Retrospective Look Back</h3><p>We are developers. It is in our DNA to help other developers and businesses solve their challenges in an elegant and efficient way. Our long-term mission is to democratize AR VR business application development and reduce the entry barrier for developers. One way to accomplish that mission was to start building what we have the most experience with — a set of UI and non-UI tools, frameworks and components for Unity3D that would reduce the entry barrier for developers and eliminate the need for complex user experiences to be continuously reinvented for each and every application.</p><p>Naturally, we took that path. If you <a href="https://medium.com/telerik-ar-vr/ar-vr-and-what-its-good-for-7fd9ed4e0af2">remember our first post</a>, it was published straight after Microsoft Build 2018, where we publicly revealed our company’s plans to invest in this space. We imagined that once we created our initial UI for AR VR offering, developers would jump on it and we would slowly start delivering on our mission.</p><p>Obviously that was not the case, and we are on a different path today.</p><h3>Early Market State of Affairs</h3><blockquote>You cannot sell components in an early market. There is simply no need.</blockquote><p>We learned that from the feedback and traction we gathered after MS Build ’18, which was close to zero. Although related to Microsoft’s HoloLens 1, <a href="https://medium.com/telerik-ar-vr/top-3-reasons-preventing-developers-from-doing-ar-vr-today-cbf6379821ef">this post summarizes</a> our findings back then. 
The following list outlines our most recent conclusions:</p><ol><li>Most businesses are looking to enable AR VR in some form and gain a competitive advantage from this technology, but they struggle to find a valuable use case that’s worth investing in.</li><li>Oftentimes businesses have an emotional barrier to using today’s AR VR hardware, as it is far from socially acceptable. They would rather stick to conventional methods of solving a problem than put a helmet on their heads.</li><li>This is a completely new technology. The development stack heavily involves 3D and game-development-like expertise; therefore most businesses lack AR VR R&amp;D team(s) and consider applying this technology expensive, with a steep learning curve and a significant initial investment.</li></ol><p>As you can tell from the list, what we were trying to solve was directly the <strong>third problem</strong> — to make the development far easier. Needless to say, <strong>that won’t work</strong> until the first two problems are solved. That is, valuable use cases need to be defined, and businesses need to be able to see value in the technology. Only then will they be ready to challenge their social barrier against putting a helmet on.</p><h3>What Else Could We Do?</h3><p>We spent a lot of time thinking about what to do and how to do it. We challenged ourselves to leave our comfort zone of building what we can easily do — UI for AR VR — because the truth is that the market is simply not there. 
Instead, we decided to tackle the first two problems first.</p><p>While this may sound easy, it was challenging to implement because:</p><ol><li>The use case(s) that we needed to find should clearly demonstrate the full glory of the technology.</li><li>Most, if not all, of the business verticals where AR VR adds value are already occupied by various solutions, coming from small to large companies.</li><li>Our company builds horizontal solutions that fit multiple business verticals — our ultimate goal is to build a horizontal solution here as well.</li><li>Just a use case in a PowerPoint presentation wouldn’t suffice; the technology needs to be <strong>experienced</strong>. Thus, a working application should be demonstrated, instead of text and images.</li></ol><h3>Top Values of AR VR for Business</h3><p>When we talk about “Technology Value”, we mean:</p><h4>Immersive experiences through Stereoscopic 3D</h4><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*er1QMowkb8qP5eFrQ0VO8g.jpeg" /><figcaption>Image source: <a href="https://futurism.com/3d-why-we-have-stereoscopic-vision">Futurism</a></figcaption></figure><p>Like 3D movies in theaters, Stereoscopic 3D is a technique that produces a human-realistic sense of depth by displaying two slightly different images to the left and right eyes of the viewer. The effect is based on the characteristics of the human visual system.</p><p>This aspect of the technology makes our brain literally <strong>experience</strong> every part of an application. It is just how our brain works. 
The difference between conventional 2D displays and Stereoscopic 3D displays is like the difference between watching a movie about the Pyramids and being on a trip to Egypt.</p><h4>Natural interactions with digital content</h4><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*KPueF6iMYEoiemoo6tVTBg.jpeg" /><figcaption>Image source: <a href="https://www.vrfocus.com/2018/09/preview-project-tennis-scramble-sports-get-weird-on-oculus-quest/">VR Focus</a></figcaption></figure><p>Much like in real life, we can interact with digital content through natural gestures. For example, if I need to push a button in VR, I simply reach out and press it with my hand. Or, if I need to move an object somewhere in space, I simply grab and move it.</p><h4>Truly social multi-people interaction and collaboration</h4><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*uafXslG0ER5xRzYdgIss0Q.jpeg" /><figcaption>Image source: <a href="https://www.roadtovr.com/getting-social-oculus-unifying-multiplayer-apps-persistent-avatars/">Road to VR</a></figcaption></figure><p>Because of the human-realistic experiences in Stereoscopic 3D, sharing an experience with other people builds long-lasting memories. For example, instead of being bored to death in a remote meeting, why not have it in VR, where you can interact with participants as if they are next to you?</p><h3>Data Visualization</h3><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*K6_Oys6_ygfij9_Zp2O-Sw.jpeg" /><figcaption>A screen from our Sales Dashboard application with 2 people in the same experience</figcaption></figure><p>What is the single property that spans all business segments? What lies at the pulse of every business? The answer is simple — it is <strong>data</strong>. Visual communication — or Data Visualization, as we call it today — has always been a significant part of humanity. 
Technologies change, new ones are invented, data evolves, but producing visuals from a raw data set will always be part of how we interact with information and with each other.</p><p>Following that logical path, we asked ourselves — can AR VR add value to the way we visualize data today? Can Data Visualization take advantage of the above-listed strengths of the technology?</p><p>We did some experiments and were blown away by the enormous value Stereoscopic 3D adds to seeing even simple data:</p><ul><li>Many people will argue that data is best seen in 2D. That is true, given we talk about <strong>2D displays</strong>. 3D on a 2D display is almost always a bad thing to do. But it is a <strong>completely different</strong> story when you see 3D on a 3D display. And when we talk about Big and Wide data, these human-realistic 3D experiences offer enormous, previously unexplored potential.</li><li>Using human-natural interactions like moving charts around, zooming in and out, and walking through the data to see it from different angles improves the readability and understanding of data many times over.</li><li>Presenting data to other people in VR takes full advantage of the social-like collaboration experiences and contributes significantly to the collective understanding of data.</li></ul><h3>VR or AR?</h3><p>After we defined our path and strategy, we needed to choose the right device for displaying data visualizations in the optimal way in terms of image quality, field of view and price.</p><p>Our requirements are:</p><ul><li>As large a field of view as possible. Big or Wide data requires significant virtual space and viewing angle. The minimum acceptable angle is 100 degrees diagonal.</li><li>Stand-alone. Pitching a use case that requires a tethered machine would be less appealing than a stand-alone device that one may carry anywhere on the planet.</li><li>Affordable price. 
One of the barriers for businesses to embrace the technology is the investment needed. We defined a maximum price tag of $800.</li></ul><h3>Where is AR Today?</h3><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*0nbYWy1cDXsivcrzgUo3Gg.jpeg" /><figcaption>24 Feb 2019 — HoloLens 2 was just announced. Image Source: <a href="https://www.digitaltrends.com/computing/hololens-2-news-roundup/">Digital Trends</a></figcaption></figure><p>This engineering masterpiece features:</p><ul><li><a href="https://en.wikipedia.org/wiki/Microelectromechanical_systems">MEMS</a> displays with a 52-degree diagonal field of view</li><li>Full hand and eye tracking for natural interactions</li><li>Perfect balance for long hours of comfortable hands-free work</li><li>Pumped up with field-worker productivity software</li><li>Azure software sharing experiences across HoloLens, iOS and Android</li><li>Enterprise oriented</li></ul><p>Producing AR-capable hardware is orders of magnitude more complex than building VR hardware, which explains the $3500-ish price tag on the device.</p><p>Although a solid device, HoloLens 2 is incompatible with our requirements of a 100+ degree field of view and an $800-or-less price tag. These aside, the device, as designed, is practical for a variety of enterprise use cases, but <a href="https://medium.com/telerik-ar-vr/microsoft-hololens-2-and-what-it-means-for-your-mixed-reality-line-of-business-startup-bd5691b673a9">these are already taken</a>.</p><h3>Where is VR Today?</h3><p>We have a stand-alone Oculus GO device with a price tag of $200, and we are expecting the Oculus Quest any time soon: “Coming in 2019. 
Starting at $399”.</p><iframe src="https://cdn.embedly.com/widgets/media.html?src=https%3A%2F%2Fwww.youtube.com%2Fembed%2FxwW-1mbemGc%3Ffeature%3Doembed&amp;url=http%3A%2F%2Fwww.youtube.com%2Fwatch%3Fv%3DxwW-1mbemGc&amp;image=https%3A%2F%2Fi.ytimg.com%2Fvi%2FxwW-1mbemGc%2Fhqdefault.jpg&amp;key=a19fcc184b9711e1b4764040d3dc5c07&amp;type=text%2Fhtml&amp;schema=youtube" width="854" height="480" frameborder="0" scrolling="no"><a href="https://medium.com/media/b3c03d50632ab1b25a2856c82c5b092b/href">https://medium.com/media/b3c03d50632ab1b25a2856c82c5b092b/href</a></iframe><p>At a nearly “impulse buy” price tag you get:</p><ul><li>The same FoV as the desktop-tethered Oculus Rift — that is, 110 degrees diagonal</li><li>Tracking with 6 DoF touch controllers for natural interactions</li><li>Shared arena-scale experiences</li><li>Excellent image quality</li></ul><p>There is a trend here, based on the better imaging capabilities and availability of VR devices — higher FoV, real-life-like imaging, completely immersive experiences.</p><p>While AR is more exciting and somewhat more socially acceptable when working with multiple people in the same physical room, its price tag and limited visualization capabilities turn the tide in favor of VR.</p><h3>The Future of Data Visualization</h3><iframe src="https://cdn.embedly.com/widgets/media.html?src=https%3A%2F%2Fwww.youtube.com%2Fembed%2FiIsCGisSEVo%3Ffeature%3Doembed&amp;url=http%3A%2F%2Fwww.youtube.com%2Fwatch%3Fv%3DiIsCGisSEVo&amp;image=https%3A%2F%2Fi.ytimg.com%2Fvi%2FiIsCGisSEVo%2Fhqdefault.jpg&amp;key=a19fcc184b9711e1b4764040d3dc5c07&amp;type=text%2Fhtml&amp;schema=youtube" width="854" height="480" frameborder="0" scrolling="no"><a href="https://medium.com/media/9eec735c3037b4759b74d5667c4fe88f/href">https://medium.com/media/9eec735c3037b4759b74d5667c4fe88f/href</a></iframe><p>We believe AR and VR will at some point converge into a single device capable of delivering both kinds of virtual experiences and anything in between. 
Somewhere along that road, AR may become better suited for Data Visualization, and thanks to the cross-platform support of Unity3D, our solution will run on AR devices as well.</p><p>Ten years from now, we expect a single socially acceptable device that can deliver both AR and VR experiences to be widely available in offices. At that point, the decision whether an application runs in AR or VR mode will finally be made based on business needs instead of hardware capabilities.</p><p>For example:</p><ul><li>Executive live meetings that happen in the same physical room will use the devices in AR mode. These will be capable of delivering report experiences far richer than traditional 2D displays.</li><li>Remote meetings will use the device in VR mode. These will share environments, avatars or video presence and will transfer micro-expressions. That would be a far better communication channel than 2D video conference calls. It will make people feel they are in the same room. And again, it will be capable of delivering report experiences far richer than traditional 2D displays.</li></ul><p>You can catch a glimpse of that future:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/600/1*InNG7S8Rw9g5QTyub8HcGg.gif" /><figcaption>HoloLens 1 DataViz — same room collaboration</figcaption></figure><p>Back at Microsoft Build 2018 we had a fully functional data visualization demo displaying bitcoin price movements. The enthusiasm about the demo was huge; however, the benefits of the third dimension were mitigated by the form factor, FoV and price of the device.</p><p>Now Microsoft Build 2019 is on the doorstep. HoloLens 2 is a very strong release. But at this point it is focused primarily on enterprises and field workers. 
It will take a few years before miniaturization and mass production can cut the size and cost of the device enough to bring AR to a state where Data Visualization will shine.</p><p>On the other hand, VR is already here:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/600/1*VohRLDMcQQFQ3SDUcIoT1g.gif" /><figcaption>VR DataViz — virtual room remote collaboration</figcaption></figure><p>The VR field is fully saturated, and competition has already pushed the hardware and software to a point where Data Visualization can show its full potential.</p><h3>Wrap Up</h3><p>We will take the VR Data Visualization path, aiming to help businesses find valuable use cases for AR VR and apply the technology successfully today. One of our long-term goals remains a comprehensive framework of reusable components and tools, but this will happen gradually, after the market picks up.</p><p>Technologically, we are backed by Unity3D, which is one of the very few truly cross-platform frameworks. This ensures that when AR hardware picks up, we will be prepared. Unity3D also enables us to deploy to literally any VR device that exists today or will be born tomorrow.</p><p>To fully demonstrate the potential of Data Visualization in VR, we created two fully functional applications available for download. Head over to our new page and see what we have there for you:</p><blockquote>Visit <a href="https://www.telerik.com/ar-vr-lab?utm_source=new_page_blog">https://www.telerik.com/ar-vr-lab</a></blockquote><p>Thank you for reading this post! 
As usual, feedback is gladly accepted — please feel free to share it with us, either in the post’s comments or at our email:</p><blockquote><a href="mailto:vrlabs@telerik.com?subject=&#39;Feedback on VR DataViz&#39;">vrlabs@telerik.com</a></blockquote><p>Happy experimenting, and we look forward to hearing from you.</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=6aecca2d662" width="1" height="1" alt=""><hr><p><a href="https://medium.com/telerik-ar-vr/data-visualization-taken-to-the-next-dimension-through-vr-6aecca2d662">Data Visualization Taken to the Next Dimension through VR</a> was originally published in <a href="https://medium.com/telerik-ar-vr">Telerik AR VR</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
    </channel>
</rss>