Introduction to React 360 - Part 01

Shay Keinan
4 min read · Nov 9, 2017


Virtual reality is being used in many industries. Besides games, it appears in fields such as medicine, education, and film.

Because of its ability to completely immerse you in a scene, the possibilities are endless.

Before diving into code, I want to talk about virtual reality concepts: things that we, as developers, must know if we want to build a virtual reality application.

So what is virtual reality, in a nutshell? Visual virtual reality is made up of two things: stereoscopic imaging and movement tracking.

Let's look at these two images. Are they identical?

stereoscopic imaging

They seem identical, but if you look closely you can see a difference.

stereoscopic imaging

Stereoscopic imaging is based on how the human brain works: it takes two images that show the same content, but from slightly different points of view.
The offset between these images corresponds to the distance between our eyes. This distance is called the inter-pupillary distance, or IPD for short.

In this way we simulate how we naturally see the world, which gives us the perception of 3D depth.
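To make this concrete, here is a small sketch in plain JavaScript of how a renderer derives two per-eye camera positions from a single head position. The IPD value is an illustrative average adult figure, not a constant from any particular headset SDK:

```javascript
// Each eye gets its own virtual camera, offset horizontally by half
// the inter-pupillary distance (IPD). 0.063 m is an illustrative
// average adult value, not a constant from any specific SDK.
const IPD = 0.063; // metres

// Given a head position, return the two eye (camera) positions.
function eyePositions(head, ipd = IPD) {
  return {
    left:  { x: head.x - ipd / 2, y: head.y, z: head.z },
    right: { x: head.x + ipd / 2, y: head.y, z: head.z },
  };
}

// A head at standing height, centred on the origin:
const eyes = eyePositions({ x: 0, y: 1.6, z: 0 });
console.log(eyes.left.x, eyes.right.x); // -0.0315 0.0315
```

The scene is then rendered twice per frame, once from each of these positions, and the two renders are shown side by side on the headset screen.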

Headset lenses are an integral part of the virtual reality experience. Why do we use them? Because they position the images on the screen at the exact distance they need to be to get the desired effect.

VR lenses are thick, so they cause distortion. The square that you see on the left looks caved in through the lenses; the outcome looks something like what we see on the right.

Lens distortion

To compensate for this, we feed in images that are rounded out. On the left you can see a compensated image before the lenses, and on the right is the final desired image.

Lens distortion
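This pre-distortion is often modelled by scaling each point's distance from the image centre by a polynomial in the squared radius. A minimal sketch, assuming that model; the coefficients k1 and k2 are illustrative, not values from any particular headset:

```javascript
// Headset lenses pincushion-distort what you see, so the renderer
// pre-distorts each frame in the opposite, barrel direction.
// A common model scales each point's distance from the image centre
// by a polynomial in r^2; k1 and k2 here are illustrative
// coefficients, not values from any particular headset.
function barrelDistort(x, y, k1 = 0.22, k2 = 0.24) {
  const r2 = x * x + y * y;               // squared distance from centre
  const scale = 1 + k1 * r2 + k2 * r2 * r2;
  return { x: x * scale, y: y * scale };
}

console.log(barrelDistort(0, 0));   // the centre is unchanged
console.log(barrelDistort(0.5, 0)); // points nearer the edge move outward (x ≈ 0.535)
```

In a real pipeline this warp runs on the GPU as a post-processing shader, but the per-point math is the same.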

To sum up stereoscopic imaging: by showing each eye a slightly different image through special lenses, we get the effect of depth.

Besides stereoscopic imaging, the second thing we need to complete the illusion of a virtual space is tracking the movement of our body.

All VR devices track head movement, so we can look around. Some devices, the more expensive ones, also track body movement, so we can move around.

Of course, the more tracking sensors you have, the better the illusion of reality.
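As a sketch of what that tracking data feeds: the headset reports an orientation, which the renderer turns into a look-direction vector for the camera. Here orientation is simplified to yaw (turning left/right) and pitch (looking up/down), and the resting view looks down the -z axis, an assumed convention matching the usual WebGL/Three.js setup:

```javascript
// Convert a head orientation, simplified to yaw and pitch angles in
// radians, into a unit look-direction vector. The resting view looks
// down -z (the usual WebGL/Three.js convention, assumed here).
function lookDirection(yaw, pitch) {
  return {
    x: Math.sin(yaw) * Math.cos(pitch),
    y: Math.sin(pitch),
    z: -Math.cos(yaw) * Math.cos(pitch),
  };
}

console.log(lookDirection(0, 0));           // at rest: { x: 0, y: 0, z: -1 }
console.log(lookDirection(Math.PI / 2, 0)); // turned 90 degrees: roughly +x
```

Every frame, the freshest orientation sample is applied to the camera before rendering, which is why low-latency tracking matters so much for comfort.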

In April 2017, Facebook announced the launch of React 360, a new JavaScript framework based on Three.js and React Native.

React 360 allows developers to build virtual reality experiences with the help of JavaScript.

As the name implies, React 360 uses the same concepts as Facebook's existing React library. Just like with React for standard web apps, VR developers can now use the same declarative model to write their 360-degree experiences.

Just like in animation, VR apps need to be rendered at 60 frames per second. React Native has solved many of the issues that usually make this hard to achieve in a JavaScript application.

It's important to know that React 360 is based on React Native and Three.js: most of the components we use are React Native components, and the 3D rendering is handled by the Three.js engine.

For those of you who are not familiar with it: Three.js is a cross-browser JavaScript library used to create and display animated 3D graphics in the browser.
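For context, this is roughly what working with Three.js directly looks like: a minimal browser sketch that renders a spinning cube at the display's refresh rate. This is the kind of imperative setup React 360 abstracts away:

```javascript
import * as THREE from 'three';

// Scene, camera, and renderer: the three core Three.js objects.
const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(
  75,                                    // vertical field of view, degrees
  window.innerWidth / window.innerHeight, // aspect ratio
  0.1, 1000                               // near and far clipping planes
);
camera.position.z = 3;

const renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

// A unit cube with a simple normal-based material.
const cube = new THREE.Mesh(
  new THREE.BoxGeometry(1, 1, 1),
  new THREE.MeshNormalMaterial()
);
scene.add(cube);

// Render loop: rotate a little and redraw on every animation frame.
function animate() {
  requestAnimationFrame(animate);
  cube.rotation.y += 0.01;
  renderer.render(scene, camera);
}
animate();
```

Since this needs a browser with WebGL, treat it as an illustration of the API shape rather than something to run standalone.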

In the next chapter, we will cover React 360's basic components and start to write an application from scratch.

If you can't wait and want to move on, the entire tutorial is available here -

ReactiveConf 2018


Shay Keinan

Front-end Architect @ Salesforce, Public speaker and trainer.