Understanding Human Anatomy Before Designing XR Experiences

Extended reality (XR)

Siddarth Kengadaran
XRPractices
6 min read · Apr 20, 2020


Human Eye:

The human eye is key to extended reality experiences such as AR, VR, or MR. Understanding how it functions helps us to make better decisions while creating interactions and interfaces for the spatial world.

The eye works together with the brain to identify patterns of light and convert them into images. Light rays reflect off an object, enter through the cornea and pupil, and pass through the lens before being focused on the retina, where cells convert them into electrical signals. The image formed on the retina is upside down; the brain inverts it. Moreover, each eye captures a slightly different view of the object, and the brain fuses the two views into a single image. It also adds detail so that we can perceive complex shapes, depth, movement, and color.

The ventral stream in the brain (the "what" pathway) leads to the temporal lobe and supports object identification and recognition. The dorsal stream (the "where" pathway) leads to the parietal lobe and supports processing the spatial location of objects.

Source: Wikipedia

The ventral visual stream helps us recognize and categorize objects by extracting features in a way that makes size, orientation, illumination, perspective, and so on irrelevant. We remember an object by its shape and inherent features; it doesn't matter how the object is placed, how big or small it is, or which side is visible. There is a hierarchical build-up of invariances: first to position and scale, then to viewpoint and more complex transformations that require interpolating between several different views of the object.

https://figshare.com/articles/Ventral_visual_stream/106794

Cells in the early visual cortex respond to simple shapes such as lines and curves; more complex objects, such as faces, are recognized further along the ventral stream.

The eye focuses on objects at three broad distances (near, medium, infinity). Depending on an object's distance, light rays from it diverge at different angles when they reach the eye. The eye counters this by accommodating, i.e., changing the shape of the lens. But there is a limit to how much the ciliary muscles can compress the lens, so there is a minimum distance at which the eye can focus; anything closer appears blurry.

Virtual Image Creation in Head-Mounted Displays:

To overcome this, HMD (head-mounted display) lenses come into play: they bend light rays and reduce the divergence angle of light from the screen to a point where the eye can focus it. The resulting illusion is called a virtual image.
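Under a simple thin-lens model (an assumption for illustration; real HMD optics are more complex), the position of this virtual image can be sketched as follows. The function name is hypothetical.

```python
def virtual_image_distance(focal_length_m: float, screen_distance_m: float) -> float:
    """Thin-lens equation: 1/f = 1/d_o + 1/d_i  ->  d_i = 1/(1/f - 1/d_o).

    When the screen sits inside the focal length (d_o < f), d_i comes out
    negative, meaning the image is virtual: it appears farther away than the
    screen, at a distance the eye can comfortably focus on.
    """
    return 1.0 / (1.0 / focal_length_m - 1.0 / screen_distance_m)

# A screen 4 cm behind a lens with a 5 cm focal length:
d_i = virtual_image_distance(0.05, 0.04)
print(d_i)  # -0.2 -> virtual image appears 20 cm from the lens
```

This is why a display a few centimeters from the eye can appear meters away: the lens moves the focus plane, not the screen.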

Virtual Image creation with HMD Lenses

Though the HMD lenses help us achieve this, we still need to consider all the limitations of the human body that affect the experience.

Factors to consider while designing XR experiences

Field of View (FoV):

The field-of-view is all that a user can see while looking straight ahead. FOV is the extent of your natural vision, both in reality and in XR content. The average human field-of-view is approximately 200 degrees.

The typical FoVs of current XR devices are:

https://uploadvr.com

Field-of-Regard:

It is the space a user can see from a given position, including when moving the eyes, head, and neck.

Distance:

As we saw above, the human eye is comfortable focusing on objects from about half a meter to 20 meters in front of us. Anything too close makes us cross-eyed, and anything farther away tends to blur. Beyond 10 meters, the sense of 3D stereoscopic depth diminishes rapidly until it is almost unnoticeable past 20 meters. So the comfortable viewing distance, where we can place relevant content, is 0.5 to 10 meters.
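A minimal sketch of how these figures could be applied when placing content, assuming the 0.5 m and 10 m bounds from the text (the function name is hypothetical):

```python
COMFORT_NEAR_M = 0.5   # closer than this strains the eyes (cross-eyed focus)
COMFORT_FAR_M = 10.0   # beyond this, stereoscopic depth cues fade rapidly

def clamp_content_distance(proposed_m: float) -> float:
    """Clamp a proposed content placement distance into the comfort zone."""
    return max(COMFORT_NEAR_M, min(COMFORT_FAR_M, proposed_m))

print(clamp_content_distance(0.2))   # 0.5 -> pushed out to the near limit
print(clamp_content_distance(25.0))  # 10.0 -> pulled in to the far limit
print(clamp_content_distance(2.0))   # 2.0 -> already comfortable
```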

Interpupillary Distance (IPD)

IPD is the measured distance between the pupils of a given user's eyes. It can be understood as a "base measurement" that provides a foundation for scale in VR. Some HMDs allow physical adjustment of the horizontal displacement of the lenses to better match an individual user's IPD.
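In software, IPD typically translates into the horizontal separation of the two virtual cameras that render the left- and right-eye views. A sketch, assuming a head-centered coordinate system with +x to the user's right (the function name and the 63 mm default are illustrative assumptions, not from the text):

```python
def eye_offsets(ipd_m: float = 0.063):
    """Return (left, right) eye x-offsets from the head center for a stereo
    camera rig. Each eye sits half the IPD away from the midline."""
    half = ipd_m / 2.0
    return -half, half

left_x, right_x = eye_offsets(0.064)
print(left_x, right_x)  # -0.032 0.032
```

Getting this separation right matters because the brain derives scale from the disparity between the two views; a mismatched IPD makes the whole world feel subtly too large or too small.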

https://www.aao.org/

Eye Relief

Eye relief is the distance from the pupil to the nearest point of the HMD's optics. If the viewer's eye sits beyond this distance, the field of view is reduced. The higher the magnification and the larger the intended field of view, the shorter the eye relief. Since not all users have the same head shape, a range of eye relief needs to be supported in practice. Eye relief is particularly important for people who wear glasses: they need longer eye relief to still see the full field of view through the eyepiece, and usually more than 15 mm is appropriate. However, increasing eye relief makes the cone of the eye box thinner, so maintaining both long eye relief and a large eye box can be challenging.

https://www.displaymodule.com/

Neck movement:

The human eye can comfortably look about 30°–35° left to right and up and down, which creates a comfortable field of view of roughly 60°. Though a VR headset might have a broader display, a user's default field of view is still limited to about 60°. Rotating the neck extends the comfortably reachable region to about 120°, into what would otherwise be peripheral vision.

The primary UI elements are placed in this area, where they are immediately accessible.

  • If a user is sitting on a static (non-spinning) chair, the direct FoV is limited to 94° horizontal space and a 32° vertical space.
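The figures above can be used as a simple placement check: given the angle between the forward axis and a UI element, test whether it falls inside a symmetric field of view (a sketch; the function name and the choice of a symmetric FoV are assumptions):

```python
def in_fov(target_angle_deg: float, fov_deg: float) -> bool:
    """True if a target at target_angle_deg off the forward axis lies inside
    a symmetric field of view spanning fov_deg in total."""
    return abs(target_angle_deg) <= fov_deg / 2.0

# Seated user with a 94-degree direct horizontal FoV (figure from the text):
print(in_fov(40, 94))  # True  -> 40 deg is within the 47-deg half-angle
print(in_fov(70, 94))  # False -> would require spinning the chair
```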

Length of the Arm:

The length of a user's arms is another vital factor. Though we have controllers and voice interfaces for interacting with objects at a distance, it is essential to consider arm reach while designing UI. On average, arm's length is about 50–70 cm from the user, so we should try to place fundamental interactions at this distance.
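The 50–70 cm range can be sketched as a reachability check for direct-touch interactions (a hypothetical helper; the defaults come from the average range cited above):

```python
def reachable(distance_m: float, arm_min_m: float = 0.5, arm_max_m: float = 0.7) -> bool:
    """True if an interactive element sits within a typical arm's reach,
    i.e. close enough to touch but not uncomfortably near the face."""
    return arm_min_m <= distance_m <= arm_max_m

print(reachable(0.6))  # True  -> suitable for direct hand interaction
print(reachable(1.2))  # False -> needs a ray/controller or voice instead
```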
