Part 1: Setting up our Post-Processing

Erikkubiak
13 min read · Dec 14, 2022


In this series of articles, we focus on how to reproduce a medieval engraving post-processing effect in Unity using URP and a custom Render Feature, some C# code, some shader code and a lot of magic 👨‍🎨

Part 1: Setting up our Post-Processing

In this first part, we will set up the broad lines of the post-process: writing the CPU logic in C# and configuring Unity and URP for it. This will give us a strong, solid base for the other parts. After that, we will only need to modify the shader.

Thus, we will go through multiple steps with, I hope, great explanations 😉
- Introducing the concept of Scriptable Renderer Features & Passes to request rendering actions from C#
- Configuring Unity to use it
- Allocating textures on the GPU
- Applying computation on the textures and “modifying” the camera color
- Writing the structure of our amazing shader

1. Installing URP

First of all, we will need URP in our project to extend the rendering pipeline. If you created the project with the URP template, you are all set. Otherwise, I recommend this official Unity video, which explains the setup really well.

2. Creating a new Rendering Feature

First of all, what is a Renderer Feature (from the URP documentation):

A Renderer Feature is an asset that lets you add extra Render passes to a URP Renderer and configure their behavior.

As stated above, it is a really powerful mechanism, as you can fully customize how the rendering is done and extend it while keeping top performance. Here, you can find an amazing template from Alexander Ameye. But let's write our own and understand what's going on 💪 It is just a standard C# script like this:

using System;
using UnityEngine;
using UnityEngine.Rendering.Universal;

public class GravurePPRenderFeature : ScriptableRendererFeature
{
    [Serializable]
    public class PassSettings
    {
        // ...
    }

    [SerializeField] private PassSettings m_Settings = new();

    private GravurePPPass m_Pass;

    public override void Create()
    {
        m_Pass = new GravurePPPass(m_Settings, m_Pass);
    }

    public override void AddRenderPasses(ScriptableRenderer renderer, ref RenderingData renderingData)
    {
        renderer.EnqueuePass(m_Pass);
    }
}

We will now explain each part:

public class GravurePPRenderFeature : ScriptableRendererFeature

Here we just define our new class, GravurePPRenderFeature, which extends ScriptableRendererFeature. The class name is the display name shown in the URP settings, which we will need later. The inheritance is what lets Unity use it as a new renderer feature.

[Serializable]
public class PassSettings
{
    // ...
}

Here, we define the feature settings. They could live in another file, but I like to keep them inside the feature so changes stay quick. For now, the class will remain empty; it will be filled in later parts. Two important points here:

  • The Serializable attribute allows Unity and C# to serialize our class, and thus to save and display it.
  • The settings are a class and not a struct because they will be shared between the feature and its pass, so we need them to be passed by reference, which a class is and a struct is not. A hypothetical example of a filled-in settings class follows.

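For illustration only, it might eventually look something like this (the fields below are hypothetical placeholders, not the ones we will actually add):

[Serializable]
public class PassSettings
{
    // Hypothetical example fields, just to show the idea
    [Range(0f, 1f)] public float OutlineThreshold = 0.5f;
    public Color InkColor = Color.black;
}
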
private GravurePPPass m_Pass;

public override void Create()
{
    m_Pass = new GravurePPPass(m_Settings, m_Pass);
}

Here, we declare our pass and create it with the settings and the previous pass, in case we want to clone elements from it (we will describe the pass later).

public override void AddRenderPasses(ScriptableRenderer renderer, ref RenderingData renderingData)
{
    renderer.EnqueuePass(m_Pass);
}

This ScriptableRendererFeature method is called automatically and lets us enqueue all the passes our feature needs; in this case, only our engraving post-processing pass.
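It is also the natural place to skip work when a pass is not relevant. In this project we filter inside Execute instead, but as a sketch, a variation could filter here:

public override void AddRenderPasses(ScriptableRenderer renderer, ref RenderingData renderingData)
{
    // Variation (not used in this project): only enqueue for Game cameras
    if (renderingData.cameraData.cameraType == CameraType.Game)
    {
        renderer.EnqueuePass(m_Pass);
    }
}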

We have now finished our Render Feature. As you can see it is pretty straightforward:

  • A settings class
  • A pass created in the constructor
  • Enqueuing a pass in AddRenderPasses

Let’s now see the core of the effect 💪

3. Creating the Pass

Now we will create the pass we looked at above. Again, it is just a standard C# script and you can find a simple template below:

using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

public class GravurePPPass : ScriptableRenderPass
{
    private readonly GravurePPRenderFeature.PassSettings m_Settings;

    public GravurePPPass(GravurePPRenderFeature.PassSettings settings, GravurePPPass _clone)
    {
        m_Settings = settings;

        renderPassEvent = RenderPassEvent.BeforeRenderingOpaques;
        // ...
    }

    public override void Configure(CommandBuffer cmd, RenderTextureDescriptor cameraTextureDescriptor)
    {
        base.Configure(cmd, cameraTextureDescriptor);
        // ...
    }

    public override void OnCameraSetup(CommandBuffer cmd, ref RenderingData renderingData)
    {
        base.OnCameraSetup(cmd, ref renderingData);
        // ...
    }

    public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
    {
        // ...
    }

    public override void OnCameraCleanup(CommandBuffer cmd)
    {
        base.OnCameraCleanup(cmd);
        // ...
    }
}

It is a bit more complicated with multiple functions, so we will explain each part:

public class GravurePPPass : ScriptableRenderPass

We create our pass called GravurePPPass that inherits from… 🥁 … ScriptableRenderPass. Yeah we could have expected that.

public GravurePPPass(GravurePPRenderFeature.PassSettings settings, GravurePPPass _clone)
{
    m_Settings = settings;

    renderPassEvent = RenderPassEvent.BeforeRenderingOpaques;
    // ...
}

Here we have the base constructor:

  • First, we store the feature settings in a field so we can reuse them later.
  • Then, we say when the pass should be executed. Here I ask Unity to execute the pass just before rendering the opaques (a few alternative events are shown below).

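For reference, the RenderPassEvent enum offers many other injection points. A few common alternatives, not used in this project:

// Some other common injection points, for reference:
renderPassEvent = RenderPassEvent.AfterRenderingOpaques;         // after opaque geometry
renderPassEvent = RenderPassEvent.BeforeRenderingPostProcessing; // before URP's built-in post-processing
renderPassEvent = RenderPassEvent.AfterRenderingPostProcessing;  // after URP's built-in post-processing
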
public override void Configure(CommandBuffer cmd, RenderTextureDescriptor cameraTextureDescriptor)
{
    base.Configure(cmd, cameraTextureDescriptor);
    // ...
}

The Configure function is called right before the pass is rendered. It is often used to allocate some temporary textures.

Information: As the first instruction, we call the same function on base, the C# equivalent of super in other languages, so the parent class logic still runs.

public override void OnCameraSetup(CommandBuffer cmd, ref RenderingData renderingData)
{
    base.OnCameraSetup(cmd, ref renderingData);
    // ...
}

This function is called right before the pass is rendered with a Camera, so it is almost the same as Configure, but as you can see, the parameters are different. Here we typically adapt the pass to the camera, configure the clearing, or request extra data such as the normal buffer.
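As a sketch of what that can look like, here are two hypothetical examples (neither is needed for this project):

public override void OnCameraSetup(CommandBuffer cmd, ref RenderingData renderingData)
{
    base.OnCameraSetup(cmd, ref renderingData);

    // Hypothetical examples of what a pass might request here:
    ConfigureInput(ScriptableRenderPassInput.Normal); // ask URP to generate the camera normals texture
    ConfigureClear(ClearFlag.Color, Color.clear);     // choose how the render target is cleared
}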

public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
{
    // ...
}

Execute is the core of our pass. In it, we can give commands to the rendering pipeline to render custom objects, process an image, etc.

public override void OnCameraCleanup(CommandBuffer cmd)
{
    base.OnCameraCleanup(cmd);
    // ...
}

This method is called just after rendering and is useful to release any memory we allocated.

Now that you understand a bit better how Renderer Features and Passes work, let's translate our schema to code.

4. Setting up our empty post-processing

Let’s grab our schema from the introduction:

As you can see, we do almost the same thing at every step:

  • Taking an image as the source
  • Applying a process to it
  • Recycling the result as the input of another step

So we need an operation that takes an image, does some computation and stores the result in another texture. Let's get into the next part 😉

4.1 Blit? What is it?

Copies source texture into destination render texture with a shader.

This is the definition from the old Graphics documentation of Unity. It describes perfectly what we want: taking a texture, applying some computation to it (here, a shader) and storing the result in another texture. So where and how do we do a Blit? In the Execute function we saw earlier, as it is part of the rendering process! Here is an example usage:

Blit(cmd, source, destination, material, passIndex); 

Let’s dive into the parameters:

  • cmd is the command buffer, which we will see in the next part.
  • source is the image that will be sampled in the shader.
  • destination is the image used to store the result.
  • material is a standard Unity material, so essentially a shader executed on a texture. It is optional: if it is null or not given, a simple copy is done.
  • passIndex is the pass of the material to execute. It is also optional: if it is not provided, it defaults to 0, the first pass of the material (the underlying CommandBuffer.Blit treats -1 as "draw all passes"). The two typical forms are shown right below.
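To make that concrete, here are the two typical forms (illustrative only; source, destination and material stand for your own variables):

// A plain copy: source is copied into destination as-is
Blit(cmd, source, destination);

// With a material: the shader's pass 0 processes source into destination
Blit(cmd, source, destination, material, 0);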

If you want more information about Blit, I found this while writing the article, and it gives some history behind the name.

4.2 Using the Command Buffer

If you remember, we need a Command Buffer as the first parameter of the Blit. But before creating it, let's define it: a command buffer is just a list of commands, used to enqueue instructions that will be sent to the GPU for execution. But how do we declare it in our Execute method?

public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
{
    CommandBuffer cmd = CommandBufferPool.Get();

    cmd.Clear();

    // Enqueuing instructions

    context.ExecuteCommandBuffer(cmd);

    cmd.Clear();
    CommandBufferPool.Release(cmd);
}

Here you have all the instructions you need to have a working Command Buffer:

  • First, we get the command buffer from a pool to keep the best performance. If you are not familiar with pooling, you can check this tutorial that does it for GameObjects in Unity.
  • As we use a pooled object, we make sure it is in a valid state by removing any commands that may already be in it. It is not mandatory, but it can save you some debugging time 😉
  • After that, you can add all your instructions to it.
  • We ask the rendering context to execute it on the GPU.
  • Finally, we clean up our mess by clearing the command buffer (for other renderer features) and releasing it back to the pool.

4.3 Allocating temporary textures

Now you know how to get a command buffer and use a Blit on textures… except we don't know how to allocate the textures yet, so let's see how to do that 💪 First, we will do it with a single texture, but we will need three different ones in the end.

First of all, let’s declare the variables representing one texture:

private int m_BufferId = Shader.PropertyToID("_BufferName");
private RenderTargetIdentifier m_BufferRT;

  • First, we have the ID, declared using the Shader.PropertyToID hashing function. The parameter matters, as it is the name you would use to access the texture directly in the shader if needed. I also advise starting it with '_' to follow Unity's shader naming conventions.
  • We also need a RenderTargetIdentifier that we will pass to the Blit method.

Next step is to allocate it in the Configure method:

public override void Configure(CommandBuffer cmd, RenderTextureDescriptor cameraTextureDescriptor)
{
    base.Configure(cmd, cameraTextureDescriptor);

    cmd.GetTemporaryRT(m_BufferId, cameraTextureDescriptor, FilterMode.Bilinear);
    m_BufferRT = new RenderTargetIdentifier(m_BufferId);
}

  • We ask the command buffer to allocate the texture, passing its "name" (the ID), the descriptor and finally the FilterMode (Point, Bilinear or Trilinear).
  • Then we link our RenderTargetIdentifier to it.

The descriptor simply holds all the properties of the texture, such as its size, format, etc. Here we use the default camera texture descriptor, but feel free to look up the other possibilities.
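For instance, a hypothetical variation could copy and tweak the camera descriptor before allocating (not used in this project):

// Hypothetical example: tweaking a copy of the camera descriptor before allocating
RenderTextureDescriptor descriptor = cameraTextureDescriptor; // struct, so this is a copy
descriptor.width /= 2;                                        // half resolution
descriptor.height /= 2;
descriptor.depthBufferBits = 0;                               // no depth needed for a color buffer
cmd.GetTemporaryRT(m_BufferId, descriptor, FilterMode.Bilinear);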

Now we can use the texture, but any memory we allocate must also be released, so let's do it in OnCameraCleanup:

public override void OnCameraCleanup(CommandBuffer cmd)
{
    base.OnCameraCleanup(cmd);
    cmd.ReleaseTemporaryRT(m_BufferId);
}

As said before, we will need three textures in this project: one to store the object outline result, one for the shadow outline result, and a final one that is just temporary. We need the latter because we can't read from and write to the same texture on the GPU. Here are the variable declarations; the allocation and release code adapted for the three textures is shown right after them:

private int m_ObjectOutlineBuffer = Shader.PropertyToID("_ObjectOutlineBuffer");
private RenderTargetIdentifier m_ObjectOutlineRT;

private int m_ShadowOutlineBuffer = Shader.PropertyToID("_ShadowOutlineBuffer");
private RenderTargetIdentifier m_ShadowOutlineRT;

private int m_TempCameraBuffer = Shader.PropertyToID("_TemporaryCameraBuffer");
private RenderTargetIdentifier m_TempCameraRT;
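And to save you the adaptation, here is what the allocation and release look like with the three buffers, a direct extension of the snippets above:

public override void Configure(CommandBuffer cmd, RenderTextureDescriptor cameraTextureDescriptor)
{
    base.Configure(cmd, cameraTextureDescriptor);

    cmd.GetTemporaryRT(m_ObjectOutlineBuffer, cameraTextureDescriptor, FilterMode.Bilinear);
    m_ObjectOutlineRT = new RenderTargetIdentifier(m_ObjectOutlineBuffer);

    cmd.GetTemporaryRT(m_ShadowOutlineBuffer, cameraTextureDescriptor, FilterMode.Bilinear);
    m_ShadowOutlineRT = new RenderTargetIdentifier(m_ShadowOutlineBuffer);

    cmd.GetTemporaryRT(m_TempCameraBuffer, cameraTextureDescriptor, FilterMode.Bilinear);
    m_TempCameraRT = new RenderTargetIdentifier(m_TempCameraBuffer);
}

public override void OnCameraCleanup(CommandBuffer cmd)
{
    base.OnCameraCleanup(cmd);

    cmd.ReleaseTemporaryRT(m_ObjectOutlineBuffer);
    cmd.ReleaseTemporaryRT(m_ShadowOutlineBuffer);
    cmd.ReleaseTemporaryRT(m_TempCameraBuffer);
}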

4.4 Setting up all the steps

Now we have all the tools to execute the different steps so let’s do this in the Execute method:

public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
{
    // 1.
    if (renderingData.cameraData.cameraType != CameraType.Game)
    {
        return;
    }

    // 2.
    var target = renderingData.cameraData.renderer.cameraColorTarget;

    // 3.
    CommandBuffer cmd = CommandBufferPool.Get();
    cmd.Clear();

    // 4. Object Outline
    Blit(cmd, target, m_TempCameraRT, m_DrawMaterial, 0);
    Blit(cmd, m_TempCameraRT, m_ObjectOutlineRT, m_DrawMaterial, 2);

    // 5. Shadow Outline
    Blit(cmd, target, m_TempCameraRT, m_DrawMaterial, 1);
    Blit(cmd, m_TempCameraRT, m_ShadowOutlineRT, m_DrawMaterial, 2);

    // 6. Combining
    Blit(cmd, m_ObjectOutlineRT, m_TempCameraRT, m_DrawMaterial, 3);

    // 7. Blur
    Blit(cmd, m_TempCameraRT, m_ObjectOutlineRT, m_DrawMaterial, 4);

    // 8. Coloring
    Blit(cmd, m_ObjectOutlineRT, m_TempCameraRT, m_DrawMaterial, 5);

    // 9. Writing back to camera color
    Blit(cmd, m_TempCameraRT, target);

    // 10.
    context.ExecuteCommandBuffer(cmd);
    cmd.Clear();
    CommandBufferPool.Release(cmd);
}

Here we have all the steps to make the post-process work on the CPU side. It is a bit dense, so let's break it down:

  1. First, we check that the camera being rendered is a game camera and not the Scene view camera. You could let the effect run in the Scene view, but navigating it with our post-process applied would be fairly horrible. This check could also be wrapped in a conditional using the UNITY_EDITOR constant (see the sketch after this list). For now, the post-processing will only show up in the Game view, not the Scene view.
  2. We store the camera color identifier to access it easily later.
  3. As stated in a previous part, we get a command buffer and clear it.
  4. We compute the Object Outline into the temporary buffer. When writing back to our object outline buffer, we execute the deformation pass.
  5. Here we do the same, but for the Shadow Outline.
  6. After that, we combine both outlines into one texture.
  7. Now we reuse our Object Outline buffer as the target of the blurring.
  8. We color the final texture depending on the outline.
  9. We write the result back to the camera color saved in step 2.
  10. Finally, we ask the GPU to execute everything, just before clearing and releasing the command buffer.
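As mentioned in step 1, the camera check could be wrapped in an editor-only conditional. A minimal sketch:

// Sketch: the same guard from step 1, compiled only in the editor
#if UNITY_EDITOR
if (renderingData.cameraData.cameraType != CameraType.Game)
{
    return;
}
#endif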

Now you have everything to make it work, but only on the CPU side. Remember, we said that we need a shader to process our textures. So let's do this 💪

4.5 Writing our shader structure

So we just said we need a shader, so let's create a post-processing shader, also called an Image Effect Shader, via Assets > Create > Shader > Image Effect Shader.

Important: You should put the shader in a Resources folder. Why? It forces Unity to include it in the build. That's important because we won't reference it directly from any asset in our project, so Unity would otherwise strip it from the build, leading to an error when running in production.

You should have this as default:

Shader "Hidden/Test"
{
Properties
{
_MainTex ("Texture", 2D) = "white" {}
}
SubShader
{
// No culling or depth
Cull Off ZWrite Off ZTest Always

Pass
{
CGPROGRAM
#pragma vertex vert
#pragma fragment frag

#include "UnityCG.cginc"

struct appdata
{
float4 vertex : POSITION;
float2 uv : TEXCOORD0;
};

struct v2f
{
float2 uv : TEXCOORD0;
float4 vertex : SV_POSITION;
};

v2f vert (appdata v)
{
v2f o;
o.vertex = UnityObjectToClipPos(v.vertex);
o.uv = v.uv;
return o;
}

sampler2D _MainTex;

fixed4 frag (v2f i) : SV_Target
{
fixed4 col = tex2D(_MainTex, i.uv);
// just invert the colors
col.rgb = 1 - col.rgb;
return col;
}
ENDCG
}
}
}

I won’t go into details about shaders but the important things to remember are:

  • On the first line, Shader "Hidden/Test", the string following the Shader keyword is really important, as we will use it to reference the shader from our render pass. Here I called my shader "Test", so Unity generated it as "Hidden/Test".
  • We need a texture property, as it will represent our source image for the Blit.
  • For now, we have one Pass that reads the color and inverts it.

Information: The Hidden prefix in the name is pretty useful, as Unity won't show this shader when you try to assign it to a Material.

Now let's modify the Pass so it just reads the texture and returns it. It will not change anything visually, but we are just in a setup phase after all 😉

Pass
{
    Name "My Pass Name"

    CGPROGRAM
    #pragma vertex vert
    #pragma fragment frag

    #include "UnityCG.cginc"

    struct appdata
    {
        float4 vertex : POSITION;
        float2 uv : TEXCOORD0;
    };

    struct v2f
    {
        float2 uv : TEXCOORD0;
        float4 vertex : SV_POSITION;
    };

    v2f vert (appdata v)
    {
        v2f o;
        o.vertex = UnityObjectToClipPos(v.vertex);
        o.uv = v.uv;
        return o;
    }

    sampler2D _MainTex;

    fixed4 frag (v2f i) : SV_Target
    {
        fixed4 col = tex2D(_MainTex, i.uv);
        return col;
    }
    ENDCG
}

I changed two things there:

  • I removed the color inversion, so we just read the texture and return it.
  • I added a Name to the Pass; it is really important, as it shows up in the Frame Debugger and makes debugging much easier.

Now let's duplicate this Pass six times, as we have six different processes to apply to our images. We will also rename the passes, in this order (see the index mapping after the list):

  • Object Outline
  • Shadow Outline
  • Deformation
  • Combination
  • Blur
  • Coloring
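The order matters, because the Blit calls in Execute reference the shader passes by index. With the ordering above, the mapping is:

// Shader pass index -> pass name, as used by the Blits in Execute
// 0: Object Outline
// 1: Shadow Outline
// 2: Deformation
// 3: Combination
// 4: Blur
// 5: Coloring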

Tada 🎉 the shader is complete, we just need to use it in our Pass.

4.6 Let's use our shader in the Post-Processing

Using our shader in the pass is pretty simple. First, we declare a material field:

private readonly Material m_DrawMaterial;

Then we can allocate it in the pass Constructor:

m_DrawMaterial = CoreUtils.CreateEngineMaterial("Hidden/EngravingPP");

The function parameter is just the name that follows the Shader keyword. With the previous example it would be "Hidden/Test", but I use a proper name now to keep the project organized and comprehensible.

Now you can directly use it in the Blits.
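One last detail worth knowing: materials created with CoreUtils.CreateEngineMaterial are not destroyed automatically. A minimal sketch of cleaning up, assuming a hypothetical Dispose helper on the pass:

// In GravurePPRenderFeature: ScriptableRendererFeature exposes a Dispose(bool) override
protected override void Dispose(bool disposing)
{
    m_Pass?.Dispose();
    base.Dispose(disposing);
}

// In GravurePPPass: a hypothetical helper, not part of ScriptableRenderPass
public void Dispose()
{
    CoreUtils.Destroy(m_DrawMaterial);
}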

5. Adding our Render Feature to the RP

Adding our new render feature to the Unity Rendering Pipeline is pretty simple. First, locate the URP settings:

  • If you created the project from the template, the files are in Assets/Settings. Usually there are several of them, one for each render quality preset.
  • If you created the files yourself, just find them again. I personally put them in a Settings/Rendering folder.

Tip: If you are in doubt about which one the project uses, go to Project Settings -> Graphics; it is the first field.

Normally, there are two types of files:

  • The Universal Render Pipeline Asset, which is the main asset. It defines the global pipeline.
  • The Universal Renderer Data, which will contain our custom Renderer Feature. The first asset may reference multiple Universal Renderer Data objects, and you can pick which one each Camera uses.

Now that you know the two main files, let's grab our Universal Renderer Data. A fresh new one should look like this:

Universal Renderer Data Inspector

To add our new Renderer Feature, just click Add Renderer Feature at the bottom and choose Gravure PP Render Feature, and tada 🎉 that's it! It is now calling our Execute method, which you can check with a simple Debug.Log.

Adding the Feature to the Universal Renderer Data

Conclusion

Normally, you should now see the amazing result of… nothing being changed 😅 Yeah I know, it can be a bit frustrating to see no difference, but the complete pass is set up, so we will only need to change the shaders in the next parts 💪 I hope you learned some things, and I will see you in the next part, about Object Outlines 😄

You can follow me for more 😍

Thanks

I would like to thank the people who helped me write this article by reviewing it or providing any help:
