UX + VR: 14 Guidelines for Creating Great First Experiences

Guidelines coauthored with Gabriela Madrid and Charles Paul Harris-White.


Virtual Reality (VR) is commonly referred to as the “Wild West” because of the opportunities for exploration and the lack of established patterns and guidelines for VR interactions and gestures. This is exciting, but it can also demand more time from content creators and confuse new users, since controls vary from one VR experience to the next.

This guide is not meant to dictate absolute rules (we still advocate for exploration and creativity!), but rather to serve as a reference: a collection of general principles and a record of what others have tried and discovered.

To create this guide we reviewed over 35 articles, books, and videos from academic and industry experts to bring you the current best practices in VR. We also interviewed and observed new users playing a variety of VR experiences to test and validate these guidelines.

Some guidelines are well documented, while others are budding ideas of how things should work. Sources are included if you’d like more details.

VR is Evolving!

Can you think of any sources or guidelines that should be included in the UX + VR Guidelines? Send me a message or a tweet!


1. Give the user control of their movements

Control of the camera should always stay with the user, and the application should maintain head tracking at all times. This helps the user feel immersed and in control. It is also very important in preventing motion sickness, which results from the user’s viewpoint being moved without their input.

See references: 1, 6, 13, 14, 15, 17, 22, 23, 30, 31, 35

2. Limit elements that may cause sickness

The first impression is very important in determining whether new users come back to VR. The following are interaction guidelines for preventing VR sickness.

2a. Moderate use of brightness

A hard jolt from reality to VR can cause VR sickness. Some sort of fade when starting and stopping can help users adjust²⁵. Fades can be applied to saturation, to contrast, or (most commonly) by transitioning to black. Transitioning between dark scenes and bright scenes can also cause discomfort, as it takes time for the eyes to adjust to the difference in light¹³. Bright whites and colors should also be used in moderation because the increased light is more tiring for the eyes.

See references: 1, 6, 14, 22, 23, 27, 32
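The fade described above can be sketched as a simple per-frame interpolation. This is an illustrative sketch, not code from any cited source; `fade_alpha` and the 0.5-second duration are assumptions.

```python
def fade_alpha(elapsed: float, duration: float) -> float:
    """Opacity of a black overlay while fading into a scene.

    Starts fully black (1.0) and linearly reveals the scene over
    `duration` seconds, easing the jolt between reality and VR.
    """
    if duration <= 0:
        return 0.0
    progress = min(max(elapsed / duration, 0.0), 1.0)
    return 1.0 - progress

# Each frame, render a full-view black quad with this opacity:
# alpha = fade_alpha(time_since_scene_start, duration=0.5)
```

The same ramp can run in reverse (scene to black) when the experience ends, or be applied to saturation or contrast instead of a black overlay.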

2b. Limit acceleration

Acceleration is one of the most likely causes of VR sickness¹². This sickness comes from a mismatch between what the eyes see and what the inner ear detects. The inner ear, however, only senses acceleration, so users moving at a constant speed do not experience the same disparity.

Some suggestions for combating motion-induced VR sickness are using leading indicators before starting motion and encouraging new users to begin with slow head movements¹³. It is also recommended that new users keep their first exposures short to help them adjust to the VR experience.

See references: 1, 3, 13, 14, 17, 22, 30, 31, 35
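One way to apply the constant-speed point above is to move the user toward a destination at a fixed velocity with no ramp-up or ramp-down. A minimal sketch, with hypothetical names and a 3D position represented as a plain list:

```python
def step_position(pos, target, speed, dt):
    """Advance `pos` toward `target` at a constant `speed` (m/s)
    over a frame of `dt` seconds.

    Constant velocity avoids the visual/vestibular mismatch that
    acceleration creates; the step is clamped so it never overshoots.
    """
    delta = [t - p for p, t in zip(pos, target)]
    dist = sum(d * d for d in delta) ** 0.5
    if dist == 0:
        return list(pos)
    step = min(speed * dt, dist)  # clamp to remaining distance
    return [p + d / dist * step for p, d in zip(pos, delta)]
```

In practice many experiences avoid smooth locomotion entirely (teleportation, snap turns); when smooth motion is used, keeping the speed constant is the key comfort lever.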

2c. Be cautious with images that convey movement

High-spatial-frequency imagery such as stripes and fine textures can enhance the sense of motion, so its use should be minimized¹³. Flashing lights should also be avoided²¹.

See references: 14, 22

3. Help users feel “safe”

Users are opening themselves to a new experience and should feel they can trust yours. People are especially sensitive to things getting close to their eyes, particularly things that are sharp or have hard edges¹⁴. Prevent new users from feeling threatened by keeping objects away from their eyes, starting them with scenes that are not too intense, and telling them what to generally expect from your VR experience. Users should also be able to learn to control the experience quickly so they do not feel lost or out of control.

See references: 3, 4, 14, 15, 22, 30

4. Create immersive experiences

VR experiences do not need to be realistic to be believable and immersive. Following the next three guidelines can help create more immersive experiences.

4a. Create a sense of depth and dimensionality

There are more ways to create the feeling of three dimensions beyond the stereoscopic view the headset presents to each eye¹² ¹³. These include:

  • Atmospheric perspective
  • Occlusion
  • Linear perspective
  • Relative size
  • Shading
  • Texture gradient
  • Motion parallax

See references: 3, 12, 14, 15, 16, 22, 23, 31

Binaural audio explanation from Columbia Music and Computers

4b. Use binaural audio to help people feel situated in space

Audio is an important and often overlooked way to create immersion. Audio coming from the direction of its source builds the sense of realism and can also cue the user to look around¹⁶.

See references: 3, 13, 17, 22, 23, 30, 31, 32
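True binaural audio relies on head-related transfer functions (HRTFs), which game engines and spatial-audio SDKs provide. As a much cruder illustration of the directional level cue, here is a constant-power pan computed from the angle between the listener's facing direction and the sound source; all names and the panning model are illustrative assumptions.

```python
import math

def pan_gains(listener_yaw: float, source_azimuth: float):
    """Crude left/right gains (0..1) from the horizontal angle between
    the listener's facing direction and a sound source (radians).

    This constant-power pan only approximates the interaural level
    difference; real binaural audio uses HRTFs.
    """
    rel = source_azimuth - listener_yaw   # angle relative to facing
    pan = math.sin(rel)                   # -1 (left) .. +1 (right)
    left = math.cos((pan + 1) * math.pi / 4)
    right = math.sin((pan + 1) * math.pi / 4)
    return left, right
```

A source straight ahead yields equal gains; a source at the listener's right ear drives the left gain toward zero, which is the cue that prompts the user to turn and look.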

4c. Create the appropriate avatar

There are many ways to acknowledge a user’s presence in VR, and whichever is selected should match the narrative of the experience. Avatars can help build immersion if the avatar and the user stay in similar alignment. However, it is less disconcerting for a user to have no body than to have a body that doesn’t match what they are doing¹⁴. Additionally, users should be disengaged from their avatar in uncomfortable situations, as people can experience a body-ownership illusion and feel the pain and discomfort of their avatar. Caution is also warranted because misaligned physical and virtual movements can cause the brain to remap how it interacts with real objects.

Here is an example of an avatar far enough from reality to avoid the uncanny valley.

See references: 7, 15, 23, 32 — Create a similar avatar in comfortable situations and if it is likely to align with the user’s movements; 9 — Acknowledge user’s presence in a way that fits the narrative

5. Build for different types of users with different mental models

5a. Design & test with diverse users

People have different comfort levels, heights, hand sizes, physical and mental abilities, fears, preferences, and so forth. Designing for and testing with a variety of users can help ensure the VR experience can be enjoyed by people in many different situations. It is also important to consider design for both standing and sitting, as standing gets tiring over time.

See references: 4, 6, 16, 23

5b. Allow for customization

Allowing customization is another important way to make your VR experience enjoyable for different types of people³¹. It helps people with disabilities, injuries, and differing personal preferences. It is also suggested to let users change settings without having to quit.

Accessibility options for Uncharted 4 (a non-VR game)

See references: 3, 6, 16, 19, 22, 23, 32

5c. Give specific instructions

People have different experiences and mental models of the world, so be specific when giving instructions¹². For example, demonstrating a gesture is clearer than saying “pump your fist,” which could be interpreted in many ways.

See references: 1, 2, 15, 30


6. Place UI where it is easy to work with and read

Vergence-accommodation effect from Wired

User Interface (UI) elements should be placed a comfortable distance from the viewer. Elements placed too close (including text, weapons, and tools) can cause eye strain due to the vergence-accommodation conflict. Accounts differ, but recommendations suggest keeping UI between 1.3 and 3 meters away¹³. The UI should fit in the middle third of the viewing area, as it is difficult for people to swivel their eyes in their sockets.

Placing text and images on a slightly curved, concave surface feels more natural as the user looks around¹². Text is currently difficult to read in VR and should be displayed large enough to be legible. One source recommends at least 20px for UI elements, but test to make sure text is big and bold enough for diverse users to read¹⁵.

See references: 2, 3, 12, 14, 15, 16, 18, 31
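The distance and middle-third guidance above can be turned into a simple placement check. A minimal sketch: the 1.3–3 m band comes from the text, while the 90-degree viewing area (so the middle third spans roughly ±15 degrees of center) is an illustrative assumption to tune per headset.

```python
def ui_placement_ok(distance_m: float, horizontal_angle_deg: float) -> bool:
    """Check a UI element against the comfort guidance above:
    1.3-3 m from the viewer, and within the middle third of an
    assumed 90-degree horizontal viewing area (+/- 15 degrees).
    """
    in_depth_range = 1.3 <= distance_m <= 3.0
    in_middle_third = abs(horizontal_angle_deg) <= 15.0
    return in_depth_range and in_middle_third
```

A tool like this is most useful at authoring time, flagging panels that would sit close enough to trigger vergence-accommodation strain or far enough off-center to force eye swiveling.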

7. Create comfortable & sustainable interactions

Comfort should also be considered when creating the interactive elements in a scene. The most frequent gestures and interactions should be easy on the hands and body so users don’t get fatigued or injured. The objects used most should be easy to reach, with suggested distances ranging from 0.75–3.5 meters¹³ to 0.50–20 meters². Objects should also be placed at an appropriate height. Suggestions include:

  • Between desk height & eye level¹²
  • -15 to -50 degrees for reading²
  • 60 degrees up and 40 degrees down³
  • 70 degrees circle of vision¹⁵

Different users should again be considered, as many factors affect what is comfortable and possible: height, sitting vs. standing, room scale, disabilities, flexibility, and so on.

Comfortable head rotation from Mike Alger

See references: 2, 18, 19, 22 — Most used objects should be easy to reach; 6, 18, 19, 24, 25 — Make sure interactions can work on different scales; 3, 13, 15, 22, 28 — Avoid gestures/interactions that could become tiring over time
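The angle suggestions above can be checked with basic trigonometry. A sketch using the -15 to -50 degree reading band cited from reference 2; the helper names and example heights are illustrative.

```python
import math

def vertical_angle_deg(eye_height_m: float, obj_height_m: float,
                       horizontal_dist_m: float) -> float:
    """Vertical angle from eye level to an object, in degrees.
    Negative values mean the object sits below eye level."""
    return math.degrees(
        math.atan2(obj_height_m - eye_height_m, horizontal_dist_m))

def in_reading_zone(angle_deg: float) -> bool:
    """True if the angle falls in the -15 to -50 degree band the
    text cites as comfortable for reading."""
    return -50.0 <= angle_deg <= -15.0
```

For a 1.6 m eye height, a panel at 1.0 m height and 1.0 m away sits about 31 degrees below eye level, inside the band; the same check fails for a seated user unless the heights are re-derived, which is the point of guideline 5a.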

8. Use cues and prompts to help users get started

New users need help learning what they can and cannot do. Many users new to VR do not even think to look around and need some sort of prompt, such as motion or audio, to get them to explore the space¹⁷. Cues such as lighting, sound, eye contact, and imagery can help users discover what they can do and where they should go.

See references: 5, 6, 10, 13, 16, 22, 23, 24, 27, 34

9. Make controls easy to learn and remember

People have limited memory and cannot directly see the controllers while in VR. Especially for users who are new to VR, the number of buttons and gestures needed to interact should be minimized. A good rule of thumb is that people can generally only remember seven chunks of information, plus or minus two¹⁷.

New users may also struggle to learn the controls since they cannot see them directly with the head-mounted display (HMD) on. Some recommendations are to familiarize the user with the controller, draw on their existing knowledge, and show physical tools so they can be found in the VR experience²³. Buttons are also suggested for binary tasks.

See references: 3, 14, 20, 24, 33 — Use familiar controls that the user doesn't need to look at; 16, 20, 22, 24 — Limit the number of buttons/gestures for new users; 22 — Show physical tools and objects so they can be found in the VR; 22 — Use buttons when tasks are binary

10. Develop natural interactions for the hands

Users should be able to use their hands as they would in real life. Controllers are recommended not to require line of sight, so people can use their hands wherever is most comfortable, usually at their sides or in their lap²¹. Occlusion of the hands and objects should also be considered so that users can interact with objects both in front of and behind their virtual hands. One research study recommends, based on how far the representation of the hand sits inside a virtual object, (a) enforcing physics constraints if the penetration is shallow, and (b) occluding the hand if it is deep inside the object²¹.

See references: 15, 21, 22
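The shallow/deep rule from the cited study reduces to a threshold on penetration depth. A sketch; the 2 cm threshold and the mode names are illustrative assumptions, not values from the study.

```python
def hand_render_mode(penetration_depth_m: float,
                     shallow_threshold_m: float = 0.02) -> str:
    """Pick how to render the virtual hand when it enters an object:
    constrain the hand to the surface when penetration is shallow,
    hide (occlude) it when deep. Threshold is an assumed 2 cm.
    """
    if penetration_depth_m <= 0:
        return "free"          # hand is outside the object
    if penetration_depth_m <= shallow_threshold_m:
        return "constrained"   # snap the hand to the object's surface
    return "occluded"          # hide the hand inside the object
```

The renderer would then branch on the returned mode each frame, using the tracked hand pose for "free", a surface-projected pose for "constrained", and no hand mesh for "occluded".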


11. Build on existing knowledge of the natural 3D world

Bullets are shown on the gun in Space Pirate Trainer.

11a. Integrate interface and shortcuts into the 3D world

HUDs and traditional 2D interfaces can feel unnatural, stuck to the user’s vision within VR’s 3D environments¹⁷. Instead, try integrating interface elements into the environment, building on how people interact in the real world. For example, ammo can be shown on the virtual weapon, or users can select tools from a toolbox. Shortcuts can also be represented in ways that build on existing 3D interactions, such as grabbing things out of a backpack or holster.

See references: 3, 6, 14, 16, 18, 20, 22, 25, 32

11b. Use existing knowledge of physical space

Physical objects and attributes already have lots of meaning and affordances. Use them to show how things are used and create the right kind of mood for the experience. Giving clear physical affordances and constraints can help users know what they can and cannot do³⁰. Color, shape, material, and placement can all guide the user to interact with things in the expected way.

See references: 6, 15, 22, 25, 30, 31

12. Provide feedback & consistency

Along with affordances, all elements should also provide feedback when they are interacted with. Feedback can be visual, audio, or haptic¹. This lets users know their action was successful, or helps them understand where they went wrong. All interactions should have a distinct starting and completion state. Interactions should also work consistently across the experience so users can more quickly learn how to accomplish tasks and focus on the content of the experience.

See references: 1, 2, 5, 6, 14, 15, 17, 18, 22, 23, 27, 32, 33, 34

13. Design for precision & speed

13a. Use Gestalt and Design Principles

Principles such as proximity, similarity, and hierarchy can all help users make the correct selection quickly¹³. Items should be scaled appropriately and well spaced so users can hit the intended target without accidentally triggering nearby ones.

Gestalt Principles can be used to help show patterns. Also see Smashing Magazine’s guide.

See references: 1, 12, 13, 14, 15, 20, 23

Fitts’s Law can help us think about precision.

13b. Help users make precise selections

This is still an ongoing debate, but some suggestions to help users make precise selections include:

  • Accepting UI input from only a single element, such as the index finger, laser pointer, or ray¹⁴
  • Displaying a reticle when the user is doing fine targeting (mainly advocated for experiences without controllers)¹
  • Considering a physical panel held in the non-dominant hand for 2D tasks; the non-dominant hand can steady the target and help with aiming²¹

See references: 1, 14, 15, 21, 22
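The Fitts’s Law image caption above points at a useful rule of thumb for sizing and spacing targets: predicted movement time grows with distance and shrinks with target width. A sketch of the standard formulation; the `a` and `b` constants are device- and user-specific, and the values here are purely illustrative.

```python
import math

def fitts_time_ms(distance: float, width: float,
                  a: float = 100.0, b: float = 150.0) -> float:
    """Predicted movement time (ms) to acquire a target, via
    Fitts's law: MT = a + b * log2(2D / W).

    `distance` is the distance to the target and `width` its size,
    in the same units. Larger, closer targets -> faster selection.
    """
    return a + b * math.log2(2 * distance / width)
```

Even without calibrated constants, the relative comparison is useful: halving a target's width costs as much time as doubling its distance, which argues for generous target sizes in VR where pointing is already imprecise.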

13c. Allow for mistakes

Most VR experiences do not have an “undo” button or interaction pattern. So, in addition to the earlier guidelines for preventing mistakes, permanent decisions should be protected with extra safeguards¹⁹: appropriate affordances, warnings, and confirmation steps can all be considered.

See references: 20, 22

14. Keep tools ready, but not distracting

Tools and UI should be clear and easy to use, but implemented so they don’t distract users who don’t need them. Some suggestions:

  • Provide the ability to turn UI elements off²¹
  • Place panels or widgets above the head for the user to pull down when needed²¹
  • Show text only when the user is close enough to it and looking directly at it¹⁴

See references: 2, 3, 14, 15, 21, 22
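The “close enough and looking directly at it” condition for text can be expressed as a distance check plus a gaze-angle check. A sketch with illustrative thresholds; positions are plain 3-element lists and `gaze_dir` is assumed to be a unit vector.

```python
import math

def should_show_text(head_pos, gaze_dir, text_pos,
                     max_dist: float = 3.0,
                     max_angle_deg: float = 15.0) -> bool:
    """Show a text panel only when the user is near it and looking
    roughly at it. The 3 m / 15 degree thresholds are assumptions.
    `gaze_dir` must be a unit vector."""
    to_text = [t - h for h, t in zip(head_pos, text_pos)]
    dist = math.sqrt(sum(c * c for c in to_text))
    if dist == 0 or dist > max_dist:
        return False
    # Angle between gaze direction and the direction to the panel.
    cos_angle = sum(g * c for g, c in zip(gaze_dir, to_text)) / dist
    cos_angle = max(-1.0, min(1.0, cos_angle))  # guard rounding error
    return math.degrees(math.acos(cos_angle)) <= max_angle_deg
```

In practice the visibility change would be eased with a fade (guideline 2a) rather than popping text in and out at the threshold.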


Some less-supported suggestions are:

  • Design for multitasking so users can stay aware of real life through notifications in VR²²
  • Design for arcs, as human hands move in arcs rather than straight lines¹
  • Include safe poses to avoid “the Midas touch,” where everything is interactive¹
  • Design for personal space, but accept social interactions

See references: 6, 15, 22


  1. A new dimension — Designing for Google Cardboard — VR design guidelines. VR design guidelines. Retrieved 2 January 2017, from https://www.google.com/design/spec-vr/designing-for-google-cardboard/
  2. Alger, M. (2015). VR Interface Design Pre-Visualization Methods. Retrieved from https://www.youtube.com/watch?v=id86HeV-Vb8&t=2s
  3. Allen, D. (2015). The Fundamentals of User Experience in Virtual Reality. Block Interval. Retrieved 6 January 2017, from http://www.blockinterval.com/project-updates/2015/10/15/user-experience-in-virtual-reality
  4. Ashworth, J. (2014). Sony: Five guidelines for effective VR design. GamesIndustry.biz. Retrieved 5 January 2017, from http://www.gamesindustry.biz/articles/2014-08-12-sony-five-guidelines-for-effective-vr-design
  5. Benson, J., Olewiler, K., Daniels, J., Knoop, V., & Wirjadi, R. (2017). An Experience Framework for Virtual Reality. Medium. Retrieved 6 January 2017, from https://medium.com/@Punchcut/an-experience-framework-for-virtual-reality-f8b3e16856f7#.pnmjenw89
  6. Benson, J., Olewiler, K., Daniels, J., Knoop, V., & Wirjadi, R. (2016). Design Insights for Virtual Reality UX. Medium. Retrieved 6 January 2017, from https://uxdesign.cc/design-insights-for-virtual-reality-ux-7ae41a0c5a1a#.uvurdy8as
  7. Bergström, I., Kilteni, K., & Slater, M. (2016). First-Person Perspective Virtual Body Posture Influences Stress: A Virtual Reality Body Ownership Study. Plos One. Retrieved 5 January 2017, from http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0148060
  8. Brock, T. (2014). Tips for learning new technology (or anything, for that matter!). Bizjournals.com. Retrieved 2 January 2017, from http://www.bizjournals.com/bizjournals/how-to/technology/2014/08/10-steps-to-learn-new-technology.html
  9. Burdette, M. (2015). The Swayze Effect. Storystudio.oculus.com. Retrieved 2 January 2017, from https://storystudio.oculus.com/en-us/blog/the-swayze-effect/
  10. Carson, E. (2016). 5 tips for giving your colleagues a killer virtual reality demo — TechRepublic. TechRepublic. Retrieved 5 January 2017, from http://www.techrepublic.com/article/5-tips-for-giving-your-colleagues-a-killer-virtual-reality-demo/
  11. Ching, T. (2017). The Concept of Presence in Virtual Reality. Medium. Retrieved 2 January 2017, from https://medium.com/@choongchingteo/the-concept-of-presence-in-virtual-reality-6d4332dc1a9c#.t9c98184n
  12. Chung, T. (2017). Making Sense of Skyboxes in VR Design — AOL Alpha. Medium. Retrieved 7 January 2017, from https://medium.com/aol-alpha/making-sense-of-skyboxes-in-vr-design-3e9f8fe254d3#.8ejx8ovc0
  13. Denis, J. (2015). From product design to virtual reality — Google Design. Medium. Retrieved 3 January 2017, from https://medium.com/google-design/from-product-design-to-virtual-reality-be46fa793e9b#.kdoq0dfjg
  14. Developer Center — Documentation and SDKs | Oculus. (2017). Developer3.oculus.com. Retrieved 6 January 2017, from https://developer3.oculus.com/documentation/intro-vr/latest/concepts/bp_intro/
  15. Explorations in VR. (2017). Retrieved January 20, 2017, from Leap Motion Developer, https://developer.leapmotion.com/explorations#110
  16. Google. (2016). Designing for Daydream — Google I/O 2016. Retrieved from https://www.youtube.com/watch?v=00vzW2-PvvE
  17. Hopkins, C. (2015). Designing For Virtual Reality. Ustwo.com. Retrieved 7 January 2017, from https://ustwo.com/blog/designing-for-virtual-reality-google-cardboard/
  18. Hunter, A. (2016). Get started with VR: user experience design. VRINFLUX. Retrieved 5 January 2017, from http://www.vrinflux.com/the-basics-of-virtual-reality-ux/
  19. Hunter, A. (2015). The user is disabled: solving for physical limitations in VR. VRINFLUX. Retrieved 7 January 2017, from http://vrinflux.com/the-user-is-disabled-solving-for-physical-limitations-in-vr/
  20. Hunter, A. (2017). Reducing cognitive load in VR. Virtual Reality Pop. Retrieved 2 January 2017, from https://virtualrealitypop.com/reducing-cognitive-load-in-vr-d922ef8c6876#.7tbyjs4mg
  21. Introduction to best practices. (2016). Retrieved January 20, 2017, from Oculus, https://developer3.oculus.com/documentation/intro-vr/latest/concepts/bp_intro/
  22. Jerald, J. (2016). The VR Book: Human-Centered Design for Virtual Reality (ACM Books) (1st ed.). Morgan & Claypool Publishers.
  23. Kelly, L. (2017). Your brain on VR: The psychology of virtual reality. Form for Thought. Retrieved 5 January 2017, from http://formforthought.com/psychology-of-virtual-reality-vr-design/
  24. Laatsch, B., Northway, S., Fitterer, D., & Van Welden, D. (2016). Steam Dev Days VR Dev Panel. Retrieved from https://www.youtube.com/watch?v=gJw5GQmEETg
  25. Northway, C. (2016). Menus Suck. Gdcvault.com. Retrieved 6 January 2017, from http://www.gdcvault.com/play/1023668/Menus
  26. Pereira, A., Wachs, J., Park, K., & Rempel, D. (2015). A User-Developed 3-D Hand Gesture Set for Human–Computer Interaction. Human Factors, 57(4), 607–621. http://dx.doi.org/10.1177/0018720814559307
  27. Ravasz, J. (2016). Design Practices in Virtual Reality. uxdesign.cc — User Experience Design. Retrieved 3 January 2017, from https://uxdesign.cc/design-practices-in-virtual-reality-f900f5935826#.97d2rniip
  28. Rempel, D., Camilleri, M., & Lee, D. (2014). The design of hand gestures for human–computer interaction: Lessons from sign language interpreters. International Journal Of Human-Computer Studies, 72(10–11), 728–735. http://dx.doi.org/10.1016/j.ijhcs.2014.05.003
  29. Rose, D. & Gravel, J. Technology And Learning: Meeting Special Student Needs (1st ed.). Wakefield, MA: National Center for Universal Design for Learning. Retrieved from http://www.udlcenter.org/sites/udlcenter.org/files/TechnologyandLearning_1.pdf
  30. Samsung VR Content Production Guidelines, Version 2.0. (2017). Samsungvr.com. Retrieved 6 January 2017, from https://samsungvr.com/portal/content/content_prod_guide
  31. Shanmugam, P. (2015). UX & Virtual Reality — Designing for interfaces without Screens. Uxness.in. Retrieved 6 January 2017, from http://www.uxness.in/2015/08/ux-virtual-reality.html
  32. Staff, C. & Cortes, L. (2016). The UX of VR. Creative Bloq. Retrieved 6 January 2017, from http://www.creativebloq.com/ux/the-user-experience-of-virtual-reality-31619635
  33. Teaching and Learning Laboratory (TLL). RES.TLL-01 Guidelines for Teaching @ MIT and Beyond. Spring 2016. Massachusetts Institute of Technology: MIT OpenCourseWare, https://ocw.mit.edu.
  34. West, T. (2015). UX pointers for VR design. Medium. Retrieved 5 January 2017, from https://medium.com/@timoni/ux-pointers-for-vr-design-dd52b718e19#.di5nj4q4a
  35. Jagnow, Rob (2017). Daydream Labs Locomotion. Retrieved 17 August 2017, from https://www.blog.google/products/google-vr/daydream-labs-locomotion-vr/