Using decals to drive object shaders and post-processing.

Raphael Horion
8 min read · Aug 2, 2023


Possible usage of the system presented in this article.

Introduction

For our final year student project at the CNAM Enjmin, we embarked on an exciting venture: an art gallery experience where paintings come to life in the real world. Our team was composed of motivated people, each contributing their unique experience to bring this vision to fruition:

  • Rachel Dufossé: Art Director, blending traditional and digital arts to provide a compelling visual direction.
  • Camille Huynh: Technical Artist, with experience in reproducing paint effects to support the project’s technical aspects.
  • Antoine Rey: Game Designer, infusing art history knowledge into the experience.
  • Lucile Thierry: Game Designer, adding creative insights to the project.
  • Nino Fouray: Gameplay Programmer, responsible for the immersive mechanics.
  • Gaspard Broussaud: Sound Designer, crafting an audio landscape to complement the visuals.
  • Christopher Norton: UX/UR Specialist, ensuring a seamless user experience.
  • Téo Gaillard: Producer, overseeing development to meet deadlines effectively.

And I, Raphael Horion: Graphics Programmer / Technical Artist.

With Camille and Rachel, we started listing the effects we might want in our project:

  • Oil Paints
  • Pencil Sketch
  • Fluid effect (dripping)
  • Perspective shift
  • Distortion
  • Ink

Reference images used for the art direction.

Following discussions with the game designers, it became evident that their vision centered on giving players precise control over the various effects. However, we faced a significant challenge during implementation: achieving these effects required a combination of object shaders and post-processing shaders, and integrating those shaders seamlessly into the gameplay proved to be a complex and demanding task.

Unreal vs. Unity

Initially, for learning purposes, we decided to develop our project using Unreal Engine, even though most of us had a background in Unity. To ensure a smooth transition, we began with a discovery phase, exploring Unreal Engine’s capabilities.

During this phase, I stumbled upon an exciting feature in Unreal Engine: the decal buffer is accessible in the shader editor. I was intrigued by its potential and saw an opportunity to use decals to create masks for post-processing effects. However, despite our efforts, we hit a roadblock: we could not render a decal solely into the decal buffer without affecting the output color buffer. We realized that modifying the Unreal render pipeline would require recompiling the engine, which presented a significant challenge.

Given the complexity of this task and the learning curve associated with customizing Unreal Engine, we decided to return to our comfort zone and switched back to Unity. There, we started a new project with a custom render pipeline based on the High Definition Render Pipeline (HDRP), which we customized along the way. This allowed us to tailor the rendering process to our specific needs.

Implementation

The first change we made to the render pipeline was to alter the Decal render pass by adding a copy instruction at the end of it. The copy destination was a custom Render Texture we created in the project.

This Render Texture received a copy of our Decal Buffer every frame.

Altered High Definition render pipeline.
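
Conceptually, the modified frame now runs in three steps: the decal pass writes its buffer, our added instruction copies that buffer into the persistent Render Texture, and later passes sample the copy. A minimal Python sketch of this ordering (pure pseudocode with made-up names, not actual HDRP code):

```python
# Toy timeline of the altered frame. Python stands in for the C# render
# pipeline code; every name here is illustrative, not Unity API.

frame_log = []

def decal_pass(decal_buffer):
    """The stock decal pass: decals are rasterized into the buffer."""
    decal_buffer["mask"] = 1.0
    frame_log.append("decal pass")

def copy_to_render_texture(src, dst):
    """Our added instruction: snapshot the decal buffer for later passes."""
    dst.update(src)
    frame_log.append("copy to RT")

def object_pass(render_texture):
    """Later passes can now read the decal data from the copy."""
    frame_log.append(f"object pass reads mask={render_texture['mask']}")

decal_buffer, ddes_rt = {}, {}
decal_pass(decal_buffer)
copy_to_render_texture(decal_buffer, ddes_rt)
object_pass(ddes_rt)
print(frame_log)
# ['decal pass', 'copy to RT', 'object pass reads mask=1.0']
```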

By sampling the decal buffer using screen-space coordinates in our shader, we could extract the necessary information without displaying the decal directly on the screen. This approach differed from stencil masks in that it supported various elements:

  • Textures
  • UV
  • Procedurally generated patterns
  • Color

And much more!

All of this controlled by an object in world space!
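
To make the idea concrete, here is a small Python sketch (standing in for shader code; every name is made up, none of this is Unity API) of sampling the copied decal buffer at the fragment's screen position and using the value to mask a color-inverting post-process:

```python
# Sketch of screen-space decal-buffer sampling driving a post-process.
# Python stands in for HLSL; all names are illustrative.

WIDTH, HEIGHT = 4, 4

# The copied decal buffer: 0.0 = no decal, 1.0 = decal fully present.
decal_buffer = [[0.0] * WIDTH for _ in range(HEIGHT)]
decal_buffer[1][2] = 1.0  # a decal covers this pixel

def sample_decal(screen_uv):
    """Sample the buffer with normalized screen coordinates in [0, 1]."""
    x = min(int(screen_uv[0] * WIDTH), WIDTH - 1)
    y = min(int(screen_uv[1] * HEIGHT), HEIGHT - 1)
    return decal_buffer[y][x]

def lerp(a, b, t):
    return a + (b - a) * t

def post_process(color, screen_uv):
    """Blend toward the inverted color wherever the decal mask is set."""
    mask = sample_decal(screen_uv)
    return tuple(lerp(c, 1.0 - c, mask) for c in color)

print(post_process((0.8, 0.2, 0.4), (0.1, 0.1)))  # outside the decal: unchanged
print(post_process((0.8, 0.2, 0.4), (0.6, 0.4)))  # inside the decal: inverted
```

The same mask value can just as well drive alpha clipping or vertex displacement in an object shader instead of a fullscreen pass.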

To tweak those effects more easily and efficiently, even for team members not well-versed in coding, we opted to use ShaderGraph.

ShaderGraph is a node-based shader editor available in Unity. Its visual approach makes it easier to see what we are doing step by step, and it supports many Unity-specific features, like Keywords.

We used Unity’s Decal Layers to prevent our effect decals from being visible in the main view:

Example of how we used Decal Layers to fine-tune decal visibility.
Left: without Decal Layers. Right: using Decal Layers.

Example with an object shader:

Unity HDRP lit shadergraph with grid alpha clipping.
Result in game view with a preview of the decal buffer.

Example with fullscreen post-processing:

Unity fullscreen color inverter shadergraph.
Result in game view with a preview of the decal buffer.

As always, when the idea is simple the execution is not: our implementation was “working”, but we had a lot of bugs and problems.

The problems

Depth flickering

One of the primary issues we faced was depth flickering. When the object shader alters the alpha or vertex position, it can inadvertently modify the depth information. Consequently, after the decal pass, this creates an offset between the DecalBuffer and the actual scene, resulting in unwanted flickering:

Depth flickering in game view with a preview of the decal buffer.

To address the depth flickering issue, we implemented a solution that involved creating two new render passes: DDES_Depth and DDES_Decal. The goal was to ensure that the decal pass does not use shaders that include DDES features in their calculations, thus preventing any interference that could lead to the flickering problem.

To create those new passes, we simply copied the standard Depth Prepass and Decal Prepass and altered them so they would use our Render Texture and render only objects with DDES features. The process described here is simplified; in reality, we also needed to copy a lot of intermediary functions to create a divergence between the real Decal pass and the DDES Decal pass.
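
Schematically, the duplicated passes differ from the originals only in the render target they write to and the object filter they apply. A toy Python sketch (names are illustrative, not actual HDRP code):

```python
# Toy model of the pass duplication: same rendering logic, but a
# different target and a filter that keeps only DDES objects.
# Names are illustrative, not actual HDRP code.

def render_pass(objects, target, only_ddes):
    """Render every (name, has_ddes) object that passes the filter."""
    rendered = [name for name, has_ddes in objects
                if has_ddes or not only_ddes]
    return target, rendered

scene = [("wall", False), ("painting", True), ("frame", True)]

# Standard prepass: every object, into the usual buffers.
print(render_pass(scene, "DecalBuffer", only_ddes=False))
# ('DecalBuffer', ['wall', 'painting', 'frame'])

# DDES pass: DDES objects only, into our custom Render Texture.
print(render_pass(scene, "DDES_RenderTexture", only_ddes=True))
# ('DDES_RenderTexture', ['painting', 'frame'])
```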

As ShaderGraph does not provide direct support for creating or modifying shader passes, we employed the “keyword features” approach to achieve our desired outcome. This method allowed us to enable or disable specific features in the shaders based on the keywords used, effectively controlling which calculations and effects are applied during the DDES_Depth and DDES_Decal render passes.

In this case the keyword means “Are we rendering a DDES pass?”, and if it is true we must NOT apply the effects.

Shadergraph example of the use of the keyword.
Fixed depth flickering in game view with a preview of the decal buffer.
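
The gate can be sketched like this: during the DDES passes the keyword is true, so the displacement is skipped and the depth written there matches the undisplaced geometry (Python standing in for the ShaderGraph branch; names are illustrative):

```python
# Sketch of the keyword branch: effects are disabled while rendering
# the DDES passes so their depth stays consistent with the scene.
# Python stands in for ShaderGraph; names are illustrative.

def displaced_vertex(position, displacement, is_ddes_pass):
    """Apply vertex displacement only outside the DDES passes."""
    if is_ddes_pass:
        return position  # keyword true: do NOT apply the effect
    return tuple(p + d for p, d in zip(position, displacement))

vertex = (1.0, 2.0, 3.0)
wobble = (0.0, 0.5, 0.0)

print(displaced_vertex(vertex, wobble, is_ddes_pass=True))   # (1.0, 2.0, 3.0)
print(displaced_vertex(vertex, wobble, is_ddes_pass=False))  # (1.0, 2.5, 3.0)
```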

Float precision

This last implementation exposed a potential issue related to floating-point precision, which can lead to rendering artifacts. Floating-point precision problems occur when calculations involve decimal numbers with limited accuracy, resulting in slight errors that accumulate during complex operations.

Example of not enough precision when doing vertex displacement.

To solve the floating-point precision issue, we created a “BetterScreenPosition” subgraph to transform world-space positions accurately. This subgraph takes any world position and, using the transformation matrices, returns the corresponding position in pixels.
More details here: http://www.codinglabs.net/article_world_view_projection_matrix.aspx

“BetterScreenPosition” subgraph
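
The math behind such a subgraph can be sketched as follows (a Python illustration under assumed conventions; the matrix and values are invented for the example, not taken from the project): the view-projection matrix takes the world position to clip space, the perspective divide yields normalized device coordinates in [-1, 1], and a remap gives pixels.

```python
# World position -> pixel position, step by step.
# Conventions and values are assumptions for the sake of the example.

def mat_mul_vec(m, v):
    """Multiply a 4x4 matrix (list of rows) by a 4-vector."""
    return [sum(m[r][c] * v[c] for c in range(4)) for r in range(4)]

def world_to_pixel(view_proj, world_pos, width, height):
    clip = mat_mul_vec(view_proj, list(world_pos) + [1.0])
    ndc_x = clip[0] / clip[3]  # perspective divide -> [-1, 1]
    ndc_y = clip[1] / clip[3]
    return ((ndc_x * 0.5 + 0.5) * width, (ndc_y * 0.5 + 0.5) * height)

# Identity view-projection just for the example: the origin lands at the
# center of a 1920x1080 frame.
identity = [[1.0 if r == c else 0.0 for c in range(4)] for r in range(4)]
print(world_to_pixel(identity, (0.0, 0.0, 0.0), 1920, 1080))   # (960.0, 540.0)
print(world_to_pixel(identity, (0.5, -0.5, 0.0), 1920, 1080))  # (1440.0, 270.0)
```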

Additionally, we reduced the object scale slightly to avoid errors at mask edges.

Reducing the object position by a small factor, then getting the corresponding screen position using “BetterScreenPosition”.

UV Data Transfer

Our system’s strength lies in transmitting complex data, like UVs. With this capability, we can create holes between different renderers using decal UVs, enabling seamless blending of meshes.

Example of a hole from a texture across two distinct meshes.

But if we use the decal’s fade option, we introduce a distortion in the UVs.

Example of UV distortion when using the fade option on the decal projector.

To avoid the distortion, we use the blue channel to store a reference value, then divide the received UV by that reference value to recover the correct UV and the decal fade:

Example of shader with UV correction.
Result in game view
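
Numerically, the trick works because the fade multiplies every channel the decal writes, so a constant 1.0 stored in the blue channel carries the fade factor out with it. A small Python sketch of the round trip (illustrative, standing in for the shader):

```python
# UV-through-decal round trip: fade scales all channels, and dividing
# by the blue reference recovers both the UV and the fade.
# Plain Python stands in for the shader; names are illustrative.

def decal_output(uv, fade):
    """What the decal writes: R = u, G = v, B = 1.0, all scaled by fade."""
    return (uv[0] * fade, uv[1] * fade, 1.0 * fade)

def corrected_uv_and_fade(sample):
    """Divide by the blue reference to undo the fade on the UV."""
    r, g, b = sample
    return (r / b, g / b), b

sample = decal_output((0.25, 0.75), fade=0.5)
print(sample)                         # (0.125, 0.375, 0.5)
print(corrected_uv_and_fade(sample))  # ((0.25, 0.75), 0.5)
```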

Shadowing

As you may have seen, the current version of the DDES does not support shadow mapping:

Illustration of the shadow mapping issue.

Supporting it would require additional passes to send the correct information to the RenderShadowMaps pass.

Path for improvements

Right now the DDES is just an experiment made for a student project. We are proud of it and used it in our game, but the idea has a lot of potential, and there is still plenty of work to do.

RTHandles instead of Render Texture

In Scriptable Render Pipelines, Unity uses RTHandles: an abstraction, similar to a pointer to a texture, whose memory Unity manages for us. For debugging and ease of development, we used Render Textures instead of RTHandles, which means we lose a lot of performance by copying textures. We should have used RTHandles and made them accessible in shaders through global properties.

Demonstration

Canyon demo :

Demonstration of the DDES in a Canyon context. The DDES is used to apply a post-processing outline on an object and to modify the alpha of two distinct objects.

XRay demo:

Demonstration of the DDES with three XRay machines, showing the core features of the system: post-processing, alpha, vertex displacement.

Thank you for reading this article! If you have any questions or suggestions, feel free to contact me (I’m on Twitter and LinkedIn).

The sources of this project will not be made available: it was developed for a student project, and for your own good it should not end up in any other project. I hope the article is rich enough for you to create your own version; if not, contact me!

I spent a lot of time working on this, so if you do make your own implementation, be kind and reference this work.

Thanks

First of all, huge thanks to my “Lost In Depiction” team for the project and the support. It was a great experience, all thanks to the beautiful people working around me.

Thanks to Rachel Dufossé for her awesome art direction and her colorful mind, always giving us new challenges that were new learning experiences.

Special thanks to Camille, with whom I fought against Unity’s render system for days, and who also brought beautiful effects to use with the DDES.

Lastly, thanks to the people who helped me write this article and reviewed it:

  • Aurelien Agnès
  • Alex Morisse
  • Gabriel Paumelle
  • Taslim Guerroumi
  • Camille Huynh
  • Théophile Lauseig


Raphael Horion

Freelance Technical Artist and Graphics Programmer @ Albyon