Would adding a nose alleviate VR sickness?
A 2020 vision is all fine and dandy, but what of the near field of view?

Arguably the biggest of VR's near-term challenges is motion sickness, and in designing a futuristic racing game, this consideration is paramount for us.
Wednesday last, I was chatting with an advisor. We discussed the nature of our game's motion, and how we could recreate a truly realistic point of view from which a gamer could experience that motion.
The parameters we had considered for this POV up to that day were (roughly sketched in code after the two lists below):
a. Far environment background
b. Near map foreground
c. Pod UI display
d. Pod interior view
e. Helmet (/HMD) UI display
f. Helmet interior bevel
We had also considered:
a. Eye movement scenarios to determine character placement across the FOV
b. Head movement scenarios to determine the first-person camera POV
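To make those layers concrete, here's a minimal sketch of how we currently think about the POV: a stack of layers composited from far to near, driven each frame by head (and, where available, eye) tracking. The names and types below are our own shorthand for this post, not any particular engine's API.

```typescript
// A rough sketch of the POV as a far-to-near stack of layers.
// Names and types are shorthand for this post, not an engine API.

type ViewLayer =
  | "FarEnvironmentBackground" // a. distant skybox / horizon
  | "NearMapForeground"        // b. the track and nearby geometry
  | "PodUIDisplay"             // c. instruments on the pod's dash
  | "PodInterior"              // d. the cockpit shell around the player
  | "HelmetUIDisplay"          // e. HUD projected on the helmet visor
  | "HelmetInteriorBevel";     // f. the helmet rim at the edge of the FOV

// Composited farthest-first, nearest-last.
const drawOrder: ViewLayer[] = [
  "FarEnvironmentBackground",
  "NearMapForeground",
  "PodUIDisplay",
  "PodInterior",
  "HelmetUIDisplay",
  "HelmetInteriorBevel",
];

// Per-frame tracking inputs that decide where things land across the FOV.
interface TrackingInput {
  headPose: { yaw: number; pitch: number; roll: number }; // drives the first-person camera
  gaze?: { x: number; y: number };                        // optional eye tracking, for character placement
}
```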
The ONE thing we hadn't considered, and it hit us yesterday, was the inward/downward glance of the eyes. When we do that, we see our nose extending away from our face. And if we're visualizing hands and lower bodies in VR, perhaps we should also consider visualizing the nose.
A little Googling later, I ran into this research project conducted by Dr. Whittinghill at Purdue last year. Sure enough, there was the nose.
As a result, there is now a g. in our list of parameters:
g. Virtual Nose view
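For anyone curious what that could look like in practice, below is a minimal, hypothetical sketch: the nose is parented to the HMD camera and re-derived from the head pose every frame, so it stays fixed in the player's view (at the nasal edge of each eye) no matter what the head or the pod does. The types, offsets, and opacity value are placeholders of ours, not numbers from Dr. Whittinghill's study or any real engine.

```typescript
// Hypothetical sketch: a virtual nose parented to the HMD camera.
// All values and types are placeholders, not a real engine API.

interface Pose {
  position: [number, number, number];          // metres, world space
  rotation: [number, number, number, number];  // quaternion (x, y, z, w)
}

interface NoseOverlay {
  offset: [number, number, number]; // metres, camera space
  opacity: number;                  // subtle, meant for peripheral vision
}

// Placeholder defaults: slightly below and in front of the eye midpoint,
// semi-transparent (forward-axis convention depends on the engine).
const nose: NoseOverlay = { offset: [0, -0.03, 0.06], opacity: 0.6 };

// Recompute the nose pose from the current head pose every frame, so it
// rotates with the head and never lags or drifts relative to the face.
function noseWorldPose(head: Pose, overlay: NoseOverlay): Pose {
  return {
    position: addCameraSpaceOffset(head, overlay.offset),
    rotation: head.rotation, // follows the head, never the pod
  };
}

// Rotate a camera-space offset by the head orientation, then translate:
// v' = v + w*t + q_xyz x t, where t = 2 * (q_xyz x v).
function addCameraSpaceOffset(
  head: Pose,
  offset: [number, number, number]
): [number, number, number] {
  const [qx, qy, qz, qw] = head.rotation;
  const [vx, vy, vz] = offset;
  const tx = 2 * (qy * vz - qz * vy);
  const ty = 2 * (qz * vx - qx * vz);
  const tz = 2 * (qx * vy - qy * vx);
  const rx = vx + qw * tx + (qy * tz - qz * ty);
  const ry = vy + qw * ty + (qz * tx - qx * tz);
  const rz = vz + qw * tz + (qx * ty - qy * tx);
  return [head.position[0] + rx, head.position[1] + ry, head.position[2] + rz];
}
```

The key design choice is that the nose lives in camera space, not world space: if it were attached to the pod, it would move relative to the head and lose whatever anchoring effect it might have.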
On the face of it, it's probably nothing. But maybe it's everything.
Either way, it seems worthwhile to quickly share this thought process and the supporting research with the VR content community at large.
Should all VR experiences going forward include a nose?
Tweet your response to me with #NoseyVR. Or just add a comment below!
(Title credits: Ben Smith)