Using decals to drive object shaders and post-processing.
Introduction
For our final year student project at the CNAM Enjmin, we embarked on an exciting venture: an art gallery experience where paintings come to life in the real world. Our team was composed of motivated people, each contributing their unique experience to bring this vision to fruition:
- Rachel Dufossé: Art Director, blending traditional and digital arts to provide a compelling visual direction.
- Camille Huynh: Technical Artist, with experience in reproducing paint effects to support the project’s technical aspects.
- Antoine Rey: Game Designer, infusing art history knowledge into the experience.
- Lucile Thierry: Game Designer, adding creative insights to the project.
- Nino Fouray: Gameplay Programmer, responsible for the immersive mechanics.
- Gaspard Broussaud: Sound Designer, crafting an audio landscape to complement the visuals.
- Christopher Norton: UX/UR Specialist, ensuring a seamless user experience.
- Téo Gaillard: Producer, overseeing development to meet deadlines effectively.
- And I, Raphael Horion: Graphics Programmer / Technical Artist
With Camille and Rachel, we started listing the effects we might want in our project:
- Oil Paints
- Pencil Sketch
- Fluid effect (dripping)
- Perspective shift
- Distortion
- Ink
Following discussions with the game designers, it became evident that their vision centered on giving players precise control over the various effects. However, we faced a significant challenge during implementation: achieving these effects required a combination of object shaders and post-processing shaders, and integrating those shaders seamlessly into the gameplay proved to be a complex and demanding task.
Unreal VS Unity
Initially, for learning purposes, we decided to develop our project using Unreal Engine, even though most of us had a background in Unity. To ensure a smooth transition, we began with a discovery phase, exploring Unreal Engine’s capabilities.
During this phase, I stumbled upon an exciting feature in Unreal Engine — the decal buffer accessible in the shader editor. I was intrigued by its potential and saw an opportunity to use decals for creating masks to enhance post-processing effects. However, despite our efforts, we encountered a roadblock in rendering a decal solely in the decal buffer without affecting the output color buffer. We realized that modifying the Unreal Render Pipeline required recompiling the engine, which presented a significant challenge.
Given the complexity of this task and the learning curve associated with customizing Unreal Engine, we decided to return to our comfort zone and switched back to Unity. There, we started a new project with a custom render pipeline based on the High Definition Render Pipeline (HDRP), which we customized along the way. This allowed us to tailor the rendering process to our specific needs.
Implementation
The first change we made to the render pipeline was altering the Decal render pass by adding a copy-buffer instruction at the end of it. The copy destination was a custom Render Texture we created in the project.
This Render Texture was getting a copy of our Decal Buffer every frame.
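A hedged sketch of that copy step, as it could look inside a modified pipeline (the class, method, and texture names below are illustrative, not HDRP's):

```csharp
// Hypothetical sketch: right after HDRP's decal pass has run, copy the
// decal buffer into a persistent RenderTexture that object and
// post-process shaders can sample later in the frame.
using UnityEngine;
using UnityEngine.Rendering;

public static class DecalBufferCopy
{
    // Assigned from the project: the Render Texture asset our shaders sample.
    public static RenderTexture decalCopy;

    // Called from the modified render pipeline, at the end of the decal pass.
    public static void CopyDecalBuffer(CommandBuffer cmd, RenderTargetIdentifier decalBuffer)
    {
        // Blit keeps this working even if formats or sizes differ slightly;
        // CommandBuffer.CopyTexture would be cheaper when they match exactly.
        cmd.Blit(decalBuffer, decalCopy);

        // Expose the copy to every shader through a global property.
        cmd.SetGlobalTexture("_DDES_DecalBuffer", decalCopy);
    }
}
```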
By sampling the decal buffer using screen-space coordinates in our shader, we could extract the necessary information without displaying the decal directly on the screen. This approach differed from stencil masks in that it supported various kinds of data:
- Textures
- UV
- Procedurally generated patterns
- Color
And much more, all of it controlled by an object in world space!
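The sampling step itself is only a few lines of HLSL. A minimal sketch, assuming the decal copy is bound as `_DDES_DecalBuffer` and using HDRP's `_ScreenSize` constant (the channel layout in the comment is our own convention, not a Unity one):

```hlsl
// Sample the copied decal buffer at this pixel's screen position.
TEXTURE2D(_DDES_DecalBuffer);
SAMPLER(sampler_DDES_DecalBuffer);

float4 SampleDDESMask(float4 positionCS)
{
    // positionCS.xy is the pixel coordinate; _ScreenSize.zw = 1 / resolution.
    float2 screenUV = positionCS.xy * _ScreenSize.zw;

    // Example layout: RG = decal UVs, B = reference value, A = mask intensity.
    return SAMPLE_TEXTURE2D(_DDES_DecalBuffer, sampler_DDES_DecalBuffer, screenUV);
}
```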
In order to tweak those effects more easily and efficiently, even for team members not well-versed in coding, we opted to use ShaderGraph.
ShaderGraph is a node-based shader editor available in Unity. The nodal aspect makes it easier to see what we are doing step by step, and it also supports a lot of Unity-specific features, like Keywords.
We used Unity's Decal Layers to prevent our effect decals from being visible in the main view:
Example with an object shader:
Example with fullscreen post-processing:
As always, the idea was simple but the execution was not: our implementation was "working", but we ran into a lot of bugs and problems.
The problems
Depth flickering
One of the primary issues we faced was depth flickering. When the object shader alters the alpha or vertex position, it can inadvertently modify the depth information. Consequently, after the decal pass, this creates an offset between the DecalBuffer and the actual scene, resulting in unwanted flickering:
To address the depth flickering issue, we implemented a solution that involved creating two new render passes: DDES_Depth and DDES_Decal. The goal was to ensure that the decal pass does not use shaders that include DDES features in their calculations, thus preventing any interference that could lead to the flickering problem.
To create those new passes, we simply copied the standard Depth Prepass and Decal Prepass and altered them so they would use our Render Texture and render only objects with DDES features. In this example the process is simplified; in reality we also needed to copy a lot of intermediary functions to create a divergence between the real Decal pass and the DDES Decal pass.
As ShaderGraph does not provide direct support for creating or modifying shader passes, we employed the “keyword features” approach to achieve our desired outcome. This method allowed us to enable or disable specific features in the shaders based on the keywords used, effectively controlling which calculations and effects are applied during the DDES_Depth and DDES_Decal render passes.
In this case the keyword means "Are we rendering a DDES pass?", and if true we must NOT apply the effects.
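In raw shader code, the keyword guard boils down to something like this (the keyword name and effect function are illustrative):

```hlsl
// "Are we rendering a DDES pass?" — when the DDES depth/decal passes
// render an object, the effect itself must be skipped so the depth it
// writes matches the untouched scene.
#pragma shader_feature _DDES_PASS

float3 ApplyDDESEffect(float3 baseColor, float4 mask)
{
#ifdef _DDES_PASS
    return baseColor;                     // DDES pass: clean depth, no effect
#else
    return lerp(baseColor, mask.rgb, mask.a);  // main pass: apply the effect
#endif
}
```

In ShaderGraph the same branch is expressed with a Keyword node, so non-programmers on the team could wire effects around it without touching HLSL.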
Float precision
The last implementation exposed a potential issue related to floating-point precision, which can lead to artifacts in the rendering. Floating-point precision problems occur when calculations involve decimal numbers with limited accuracy, resulting in slight errors that accumulate during complex operations.
To solve the floating-point precision issue, we created a "BetterScreenPosition" subgraph to transform world-space positions accurately. This subgraph takes any world position and, using the transformation matrices, returns the corresponding pixel position.
More details here : http://www.codinglabs.net/article_world_view_projection_matrix.aspx
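The math behind such a subgraph is the classic world → view → clip → NDC → pixel chain. A small self-contained sketch (matrix conventions here follow the linked article, OpenGL-style; the function names are ours):

```python
import math

def mat_vec(m, v):
    """Multiply a 4x4 matrix (row-major, list of rows) by a 4-vector."""
    return [sum(m[r][c] * v[c] for c in range(4)) for r in range(4)]

def perspective(fov_y_deg, aspect, near, far):
    """Standard OpenGL-style perspective projection matrix."""
    f = 1.0 / math.tan(math.radians(fov_y_deg) / 2.0)
    return [
        [f / aspect, 0.0, 0.0, 0.0],
        [0.0, f, 0.0, 0.0],
        [0.0, 0.0, (far + near) / (near - far), (2.0 * far * near) / (near - far)],
        [0.0, 0.0, -1.0, 0.0],
    ]

def world_to_pixel(world_pos, view, proj, width, height):
    """Transform a world-space position into a pixel coordinate."""
    clip = mat_vec(proj, mat_vec(view, [*world_pos, 1.0]))
    ndc = [clip[i] / clip[3] for i in range(3)]   # perspective divide
    # Remap NDC [-1, 1] to pixel coordinates [0, size].
    return ((ndc[0] * 0.5 + 0.5) * width,
            (ndc[1] * 0.5 + 0.5) * height)

identity = [[1.0 if r == c else 0.0 for c in range(4)] for r in range(4)]
proj = perspective(60.0, 16 / 9, 0.1, 100.0)
# A point straight ahead of the camera lands at the screen centre.
print(world_to_pixel((0.0, 0.0, -5.0), identity, proj, 1920, 1080))
```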
Additionally, we reduced the object scale slightly to avoid errors at mask edges.
UV Data Transfer
Our system’s strength lies in transmitting complex data, like UVs. With this capability, we can create holes between different renderers using decal UVs, enabling seamless blending of meshes.
But if we use the fade option of the decal, we create a distortion in the UVs.
To avoid the distortion, we use the blue channel to store a reference value, then divide the received UV by that reference value to recover both the correct UV and the decal fade:
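A toy sketch of the trick (values and channel layout are illustrative): the decal fade multiplies every channel it writes, so storing a constant 1.0 in blue lets us divide the fade back out on the reading side.

```python
def write_decal(u, v, fade):
    """What ends up in the buffer after the decal fade is applied:
    every channel is scaled by the fade factor."""
    return (u * fade, v * fade, 1.0 * fade)

def read_decal(r, g, b):
    """Recover the undistorted UV and the fade amount."""
    if b == 0.0:
        return (0.0, 0.0), 0.0        # fully faded: no decal data here
    return (r / b, g / b), b

texel = write_decal(0.25, 0.75, 0.5)  # fade halves the stored UV...
uv, fade = read_decal(*texel)
print(uv, fade)                        # ...but dividing by blue restores it
```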
Shadowing
As you may have seen, the current version of the DDES does not support shadow mapping:
It would require additional passes to send the correct information to the RenderShadowMaps pass.
Path for improvements
Right now the DDES is just an experiment made for a student project. We are proud of it and used it in our game, but the idea has a lot of potential, and there is still plenty of work to do.
RTHandles instead of Render Texture
In scriptable render pipelines, Unity uses RTHandles: an abstraction, like a pointer to a texture, whose memory Unity manages for us. For debugging and ease of development, we used Render Textures instead of RTHandles, which means we lose a lot of performance by copying textures. We should have used RTHandles and made them accessible in shaders through global properties.
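A hedged sketch of what that improvement could look like (names and format are our assumptions; `RTHandles.Alloc` with a scale factor lets Unity resize the handle with the camera, removing the need for manual copies between mismatched sizes):

```csharp
// Hypothetical sketch: allocate the decal copy as an RTHandle instead of
// a Render Texture asset, and expose it to shaders as a global property.
using UnityEngine;
using UnityEngine.Experimental.Rendering;
using UnityEngine.Rendering;

public static class DecalBufferHandle
{
    static RTHandle s_DecalCopy;

    public static RTHandle GetOrAlloc()
    {
        // Vector2.one scales the handle with the current camera resolution,
        // so Unity handles resizing and memory for us.
        s_DecalCopy ??= RTHandles.Alloc(Vector2.one,
            colorFormat: GraphicsFormat.R16G16B16A16_SFloat,
            name: "DDES Decal Copy");
        return s_DecalCopy;
    }

    public static void Bind(CommandBuffer cmd)
    {
        // Make the handle available to every shader via a global property.
        cmd.SetGlobalTexture("_DDES_DecalBuffer", GetOrAlloc());
    }
}
```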
Demonstration
Canyon demo :
XRay demo:
Thank you for reading this article! If you have any questions or suggestions, feel free to contact me (I'm on Twitter and LinkedIn).
The sources of this project will not be made available: it was developed as a student project, and for your own good it should not end up in any other project as-is. I hope the article is rich enough for you to create your own version; if not, contact me!
I spent a lot of time working on this, so if you do make your own implementation, be kind and reference this work.
Thanks
First of all, huge thanks to my “Lost In Depiction” team for the project and the support. It was a great experience, all thanks to the beautiful people working around me.
Thanks to Rachel Dufossé for her awesome art direction and her colorful mind, always giving us new challenges that were new learning experiences.
Special thanks to Camille, with whom I fought against Unity's render system for days, and who also brought beautiful effects to use with the DDES.
Last, thanks to the people who helped me write and review this article:
- Aurelien Agnès
- Alex Morisse
- Gabriel Paumelle
- Taslim Guerroumi
- Camille Huynh
- Théophile Lauseig
Useful links
- http://www.codinglabs.net/article_world_view_projection_matrix.aspx
- https://catlikecoding.com/unity/tutorials/scriptable-render-pipeline/
- https://docs.unity3d.com/Packages/com.unity.render-pipelines.core@16.0/manual/index.html
- https://docs.unity3d.com/Packages/com.unity.render-pipelines.high-definition@12.1/manual/Custom-Pass-Creating.html