Tips for Authoring 3D Art Content for VR

Daniel Rose
Published in GameTextures
Jul 16, 2019

The act of putting on a VR headset is the act of stepping through a portal to a new reality. In this new reality, you are the superhero, through physical action and awareness. You can see and experience new and awe-inspiring things on a regular basis. You can be transported to a meticulous recreation of a real place on Earth and feel that it's real as you move through it. Then, when you are ready, you simply remove the headset and return to our reality.

The amount of effort and knowledge needed to craft these experiences requires its own portal. When you step through it and come out the other side, you emerge in a reality where modern artistic techniques are merged with the optimizations and practices of the past.

Welcome. To VR Development.

While this article is geared towards Unreal developers, Unity has its best practices in a nice tutorial here: https://learn.unity.com/tutorial/vr-best-practice

VR is an interesting platform for developers of all types. Game developers have an interest in VR because it opens up new ways to immerse players in the game-playing experience. Simulation developers now have a more realistic way to insert people into combat, search & rescue, and surgical simulations without needing to build out entire 'fake' training centers. Therapists have a new tool to help patients confront fears or find a meditative space without any physical danger. Advertising agencies can create installations that give users a curated, specific product experience, or a simple phone app that requires nothing more than a headset adapter and 5 minutes of your time.

Mass adoption or not, VR is here to stay this time.

Developing art for VR is a bit different from developing for your traditional PC or console platform. There are a number of additional factors to take into account due to the unique mixture of how users interact in VR and the ways in which VR scenes are rendered. Game engines and their tools can mitigate some of the issues through clever tricks and optimizations (like instanced stereo rendering), but those adjustments alone are not enough to truly deliver a proper VR experience. A good experience is perhaps more necessary in VR than in traditional video games: while motion sickness can happen in games, it WILL happen in a poorly optimized VR experience running at sub-optimal frame rates. All of this means that artists are required to have a technical understanding of performance metrics at all times during VR development, as well as the general knowledge of how to develop extremely performant and readable art.

I recently took a virtual tour of the USS Arizona in Hawaii, and the abysmal framerate was all I was thinking about, not the ship. It briefly induced motion sickness, and I don't normally get motion sick.

For example, with VR, you can't really 'decimate' a high-poly model a few times and rely on LODs to do everything else. You have to take care in how everything is made.

We at GameTextures are in a unique position to share some of the knowledge our team has when it comes to VR development. Many of our customers use our materials in VR games, and we get questions from time to time about adapting our Unity and Unreal toolkits (ToolKit and SHADE, respectively) to VR workflows. Instead of re-skinning ToolKit and SHADE into VR versions (which would divert bandwidth and manpower from our main goal of making amazing materials for all), we have opted to share some best practices for authoring art content for VR platforms, along with how our toolkits can help.

Here is the short version: think of VR as developing for mobile. PC headsets can handle a few more features (depending on user hardware), but everything else needs to be treated as a mobile platform, including the shiny new Oculus Quest.

Art for VR: Terms

VR development, especially as part of a distributed team, requires the artist to have a pretty solid understanding of the performance and technical side of the art they are making. Before I start delving into tips and tricks, I want to share a short list of terms you'll see often as they relate to VR.

  • Poly/Tri Count: The number of triangles in a frame/scene.
  • Draw Calls: The number of individual objects being drawn per frame.
  • Frame Rate: The rate at which new frames are drawn to the screen, measured in frames per second for most gamers and in milliseconds for developers. VR works best at 90 frames per second, or about 11.1 milliseconds of render time per frame (see the budget calculation after this list).
  • Forward Rendering: A type of rendering done on the GPU that, in short, shades each object in relation to each light affecting it as the geometry is drawn.
  • Deferred Rendering: A type of rendering done on the GPU that, in short, renders scene attributes out to an image buffer (the G-buffer), which is then used to compile the completed, lit scene.
  • Dynamic Lighting/Shadows: Lighting and shadows that are rendered and updated in real time through a variety of means.
  • Baked Lighting/Shadows: Lighting and shadows that are pre-computed and saved to a 'light map'.
  • Light Map: A texture or set of textures that store lighting data.
  • LOD: A simplified version of a model in a game. LODs can be generated many different ways (manually, automatically in your DCC application, with third-party software, and in some cases in-engine).
  • HLOD: Hierarchical LOD model. HLODs work like LODs, but on a larger scale. LODs work on a micro level (per model); HLODs combine multiple models into a single model (which may then have its own LODs). The HLOD is viewed from a distance and swapped out for the individual models as players get closer. This works both to reduce tri counts and to reduce draw calls.
  • Impostors: Super-simplified models with all of their material and lighting information baked down into a single texture. They are viewed from (often extreme) distances and are used to reduce triangle count, material and mesh draw calls, and lighting calculations. (A sketch of the distance-based swapping behind LODs, HLODs, and impostors also follows this list.)
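
To make the frame-budget math concrete, here's a quick back-of-the-envelope calculation in plain C++ (nothing engine-specific; the refresh rates are the typical 2019 figures for these headsets, so double-check your target):

```cpp
#include <cstdio>

int main()
{
    // Per-frame render budget in milliseconds = 1000 ms / target refresh rate.
    // Refresh rates below are typical 2019 values; verify for your headset.
    const struct Target { const char* headset; double hz; } targets[] = {
        { "Oculus Quest",    72.0 },
        { "Rift CV1 / Vive", 90.0 },
    };

    for (const Target& t : targets)
        std::printf("%-16s %5.1f Hz -> %.2f ms per frame\n",
                    t.headset, t.hz, 1000.0 / t.hz);
    return 0;
}
```

At 90 Hz that works out to roughly 11.1 ms per frame, and compositor and engine overhead eat into that budget before a single triangle of your art is drawn.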
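Since LODs, HLODs, and impostors all boil down to swapping detail with distance, here's a minimal, engine-agnostic sketch of that idea. The thresholds are made up, and real engines usually select by projected screen size rather than raw distance, but the swapping principle is the same:

```cpp
#include <cstddef>

// One entry per LOD, ordered nearest to farthest.
struct LODLevel
{
    float maxDistance;   // use this LOD while the camera is closer than this
    int   triangleCount; // cost of this LOD, useful for budgeting
};

// Returns the index of the LOD to render at a given camera distance.
int SelectLOD(const LODLevel* lods, std::size_t count, float distanceToCamera)
{
    for (std::size_t i = 0; i < count; ++i)
        if (distanceToCamera < lods[i].maxDistance)
            return static_cast<int>(i);

    // Beyond the last threshold: ideally an HLOD or impostor takes over.
    return static_cast<int>(count) - 1;
}
```

With bands at, say, 10 m, 30 m, and 100 m, a prop 40 m away renders its third LOD, and past the last band you would ideally hand off to an HLOD or impostor.
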
Our good friends at Substance did an interview with Pierre Bosset in 2017 about Robo Recall's Substance Painter VR workflows. Check it out! https://www.substance3d.com/blog/texturing-epic-games-robo-recall-substance-painter-vr-workflows

Authoring Tips

  1. Be familiar with how your game engine batches draw calls and build with that in mind. Game engines vary in how they handle draw calls. In older versions of Unity, it was often suggested that objects be separated by material; this allowed the engine to handle object batching with less overhead. Unreal Engine, prior to 4.22, batched objects based on individual mesh parts and not based on materials. This meant that when I was using the same 4 materials on 60 different meshes in a scene (thinking this meant I would have only 4 draw calls), I still ended up with 60 different draw calls. A lot of this may also come down to the renderer type (forward or deferred).
  2. With #1 in mind, combine objects (if batched based on meshes) that are close together to reduce mesh draw calls. Mesh draw call numbers aren't ever exact, but you want to keep the number under 1000 (I've often seen 600 cited as ideal) for most high-end VR platforms. When draw calls hit high numbers, CPU overhead becomes a problem as the CPU tries to tell the GPU what to draw. Since we target such high frame rates in VR (usually 90 frames per second), we need every spare millisecond we can get. (A quick draw-call accounting sketch follows this list.)
  3. Materials should be as simple as possible. Materials in AAA console games today are often very complex mixes of images, generated patterns, and some degree of procedural inputs. These materials may be blended with other materials at runtime through the use of masks to create the final shaded result seen in game. When blended this way, artists and designers have an incredible amount of control over the look at almost any point in the game, but it comes at a performance cost. In Unreal Engine, this is a (grossly simplified) description of the layered material system, which was born out of Paragon's material workflow. It works great and allows for awesome, highly customizable materials. For VR, though, this can be an expensive proposition. In order to reduce the number of materials drawn on an object at once, it may be better to take this blending process 'offline', as it were, and move it to a tool like Substance Painter. With tools like Substance Painter, it's now easier than ever to put all the various materials and levels of grime you need into a single texture set for an object. This doesn't mean tiling materials and trim sets shouldn't be used; large-scale environment assets rely on these types of maps! It really means you should be very choosy when deciding between advanced materials and good, old-fashioned unique texture sets. One size doesn't fit all, especially in VR.
  4. Efficient models with multiple LODs are your friend. On the whole, modeling for VR isn't terribly different from creating props or assets for any other game. The main thing to keep an eye on is that your asset, depending on its usage, should hold up when seen from all angles. Unlike a third-person or first-person game, where it's often a good idea to limit the amount of interactivity players have with the world, VR titles thrive when presenting a reality that allows players to pick up objects or hold guns and view them as if they were real objects. After all, it's a new reality! We want to experience it in as human a way as possible. This means artists should keep an eye on low-poly AO bakes and be sure they don't have shading in areas that don't make sense when an object is held or examined. From a performance standpoint, LOD models have made a huge comeback in the PS4/Xbox One generation, and they are doubly important for VR. Multiple LOD models per asset are almost always recommended to keep performance in check, and HLOD or impostor models can further improve performance by reducing triangles drawn on screen and, more critically, draw calls.
  5. Bake all the lighting you can. Lighting and shadows in real time are handled both dynamically (casting shadows at runtime) and statically (lights and shadows are pre-baked to a texture called a light map). Both types of lighting often work together to create the final lighting solution you see in a video game, depending on the type of game and the performance target. A great example would be a game where a flashlight casts a very detailed and defined shadow, while the rest of the environment has softer, immobile light and shadow detail. Generally speaking, dynamic lights aren't terribly expensive on their own; they become expensive once shadow casting is enabled. Uncharted: Golden Abyss is a handheld game, but it made use of dynamic lighting in part because the team used smart optimizations and clever tricks to get real-time shadows to work well. Additionally, in Unreal Engine 4.22, ray-traced shadows from a sun light may end up CHEAPER than the traditional cascaded shadow maps used for dynamic shadows. So, what does this mean for VR? VR, as mentioned before, should be looked at more like a mobile platform. Bake all the lighting you can, regardless of platform. With proper optimizations (shadow proxies, limited objects casting shadows at once, a small shadow-casting range), some real-time light in VR can work on high-end devices. Overall, though, plan to bake the majority, if not all, of your lighting. (A light-mobility sketch follows this list.)
  6. Use the tools that your game engine of choice gives you. One of the easiest optimizations to make is to simply read the documentation of your tools and follow their best practices. It's a simple step that is easy to overlook. For instance, without reading the documentation, many would forget to swap Unreal Engine over to its forward rendering path. In Unreal, forward rendering is faster for the use cases that VR represents and is thus highly suggested (the config sketch after this list shows the relevant settings). Unity has its own legacy forward renderer, and as of 2019.1 the Lightweight Render Pipeline is ready for production use, bringing a number of additional enhancements not seen in the traditional forward renderer.
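
To put the numbers in tips 1 and 2 in concrete terms, here's a tiny, engine-agnostic sketch of draw-call accounting. The scene contents and the one-call-per-material-section model are illustrative assumptions; real engines batch and instance in their own ways:

```cpp
#include <cstdio>

struct MeshGroup
{
    const char* name;
    int instances;        // copies of this mesh in view
    int materialSections; // roughly one draw call per material slot
};

int main()
{
    // Illustrative scene contents.
    const MeshGroup scene[] = {
        { "rock",     60, 1 },
        { "building",  8, 3 },
        { "prop",     40, 2 },
    };

    int callsPerEye = 0;
    for (const MeshGroup& g : scene)
        callsPerEye += g.instances * g.materialSections;

    // Without instanced stereo, the whole scene is submitted once per eye.
    std::printf("per eye: %d, both eyes: %d (target: under ~600-1000)\n",
                callsPerEye, callsPerEye * 2);
    return 0;
}
```

Merging those 60 rocks into a handful of combined meshes collapses dozens of calls into a few, which is exactly what tip #2 is after.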
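For tip 5, the Unreal-side knob is light mobility. Artists normally set this on the light in the editor's Details panel; as a hedged sketch, here's what the same choice looks like if you place lights from C++ in UE4 (the actor class itself is illustrative):

```cpp
// BakedLightActor.h -- a minimal UE4 sketch: a point light marked Static so
// Lightmass pre-bakes its lighting and shadows into lightmaps at build time.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/PointLightComponent.h"
#include "BakedLightActor.generated.h"

UCLASS()
class ABakedLightActor : public AActor
{
    GENERATED_BODY()

public:
    ABakedLightActor()
    {
        PointLight = CreateDefaultSubobject<UPointLightComponent>(TEXT("PointLight"));
        RootComponent = PointLight;

        // Static: the light contributes only to baked lightmaps and costs
        // essentially nothing at runtime.
        PointLight->SetMobility(EComponentMobility::Static);

        // If a light truly must be dynamic, disabling shadow casting keeps
        // it far cheaper -- the shadows are where the cost lives.
        // PointLight->SetCastShadows(false);
    }

    UPROPERTY(VisibleAnywhere, Category = "Lighting")
    UPointLightComponent* PointLight;
};
```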
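And for tip 6, switching Unreal to the forward renderer is a project setting. It lives in DefaultEngine.ini (or Project Settings > Rendering); these are the settings we mean, though you should confirm them against the docs for your engine version:

```ini
[/Script/Engine.RendererSettings]
; Use the forward shading path (faster for typical VR workloads).
r.ForwardShading=True
; Draw both eyes in one instanced pass instead of rendering the scene twice.
vr.InstancedStereo=True
; 4x MSAA; the forward renderer supports it (the deferred path does not).
r.MSAACount=4
```

Changing r.ForwardShading requires an editor restart and a shader recompile. While you're profiling, the stat unit and stat RHI console commands are the quickest ways to watch frame time and draw-call counts.
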
(Video) This talk features some amazing tips for creating art and working within the performance limitations of VR (6:44 on).
(Video) This one covers more of the optimization side of VR work (15:00 on).

GameTextures Materials and VR

Now that we’ve gone over some of our tips on developing art content for VR, we wanted to take a moment and share how you can use our materials to build your VR experience.

Our materials always have the option to download bitmap files (or you can output them yourself from Substance). These files can be used in any VR title like any other texture. We suggest you remain cognizant of the memory limitations of your chosen VR platform and use map sizes that make sense within those limits.

Many of our materials are best used as base materials. In traditional projects, this means that advanced shading and vertex blending between our materials in your game engine will work great. In VR, this isn't the most performant approach. While there will always be exceptions (terrain and large environment assets are common ones), we suggest you take advanced materials and blending out of the engine and into a program like Substance Painter, Alchemist, or even Photoshop, and build your new material setup there. Then bring the final bitmaps over to your game.

GameTextures has two toolkits available for Unreal Engine and Unity: SHADE and ToolKit, respectively. SHADE supports VR out of the box, although we suggest confirming that High Quality Reflections and Planar Reflections are checked on in the Forward Rendering section of our Master Material. SHADE offers support for Tessellation and Parallax Occlusion Mapping. For VR projects, Tessellation should be used extremely rarely, if at all, due to its performance cost. Parallax Occlusion Mapping often reads poorly in VR and is also fairly performance-intensive on mid-range PCs, so we suggest avoiding POM in VR projects altogether.

ToolKit, on the other hand, is in a unique spot. The team has been busy getting the latest version of our website built while supporting current customers at the same time. Unfortunately, we've been unable to test ToolKit in the production-ready version of Unity's Lightweight Render Pipeline. We believe it *should* work, but because we haven't been able to test it properly, we suggest using a more traditional approach (standard shaders) for VR in Unity for now.

VR is an exciting and unique opportunity for game developers, and developers in general, to get in on the ground floor of a technology that many in the world still have not experienced. VR obviously does great things for games, but it has also shown its worth for trade show demos, training simulations, and mental health treatments. More and more large companies and research universities are looking at VR (and AR) as a viable solution to numerous problems at the intersection of cost and immersion. Those teams need VR-focused developers who can create performance-focused art and designs. Hopefully our tips can help you take the next step towards developing gorgeous, state-of-the-art VR projects.

