[Unity] Always Be Linear: Shader-Based Gamma Correction
If you have ever developed a 3D mobile game with Unity, you have probably had a hard time choosing between enabling the “Auto Graphics API” option, to support every device on the market, and the Linear color space, which gives better-looking results but is not supported by every graphics API (e.g. OpenGL ES 2). If you are developing a 2D game, or you don’t care about devices that only support OpenGL ES 2, this is not a problem; but if you want your game to run on every mobile device, you have to sacrifice the better look and choose the Gamma color space. The good news is that you can still get the same result as linear rendering in the Gamma color space by applying gamma correction in shaders.
In this article, I’ll show you how to edit the shaders of the URP template to apply gamma correction. As a result, you will be able to switch between the Linear and Gamma color spaces without changing anything in your scene or art assets, while still getting a linear rendering result.
What Is Gamma?
Well, there are a lot of details here, but I’ll try to keep it simple. It all started with Cathode Ray Tube (CRT) screens: on a CRT screen, the emitted light intensity is not linearly proportional to the stored pixel value. Let’s take an example. Say we have four gray pixels on the screen with the values 0, 0.25, 0.5, and 1. You would expect the intensities of the light coming from those four pixels to be in the same ratio, but in reality, if you measure them, you get 0, 0.047, 0.218, and 1. Why? Simply because a typical CRT screen has a gamma value of 2.2, so the final displayed intensity of any pixel equals the stored pixel value raised to the power of 2.2.
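These numbers are easy to verify yourself; here is a minimal Python sketch of the display response (2.2 is the nominal CRT gamma assumed throughout this article):

```python
# Simulate a CRT-style display with gamma 2.2: the emitted light intensity
# is the stored pixel value raised to the power of the display gamma.
GAMMA = 2.2

def display_intensity(pixel_value, gamma=GAMMA):
    """Light intensity a gamma-2.2 display emits for a stored pixel value."""
    return pixel_value ** gamma

for v in (0.0, 0.25, 0.5, 1.0):
    print(f"stored {v:4} -> displayed {display_intensity(v):.3f}")
# stored 0.25 displays as ~0.047, stored 0.5 as ~0.218
```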
This leads to perceiving different colors than the actual pixel values. Take the following figure, for example: the gradient levels on the left are the actual gray values of the image on disk, but the screen will display it like the image on the right.
The solution the pioneers came up with is to apply the inverse gamma (1/2.2 ≈ 0.45) to images before saving them to disk. When we show those images on a screen, the screen applies its gamma of 2.2, so we see the intended colors, since (x^0.45)^2.2 = x.
With that being said, you should now expect every image saved on your disk (whether captured by a camera, scanned by a scanner, or created in software) to be gamma-corrected: the values saved on disk are different from what you see on screen. Whenever you take a photo, the camera gamma-corrects the pixel colors by raising them to the power of 0.45 before saving the file, and when you display that photo, the screen applies its gamma of 2.2, so you get back the actual colors the camera captured.
So everything should be totally fine now. Why, then, should we care about gamma and gamma correction?
Well, if you are developing a 2D game (with no lighting or fancy sprite color processing), you shouldn’t face any issues: the sprite images are saved in gamma space (encoded with an exponent of 0.45), and after they are rendered to the screen a gamma of 2.2 is applied, so you see the expected colors.
But if you are working on a 3D game with lighting, you have a big problem: since pixel colors are in gamma space, they don’t reflect the actual linear colors, and when we multiply a pixel color by a light color we get the wrong result. To clarify, let’s take an example with numbers:
- Imagine we have a gray pixel with a value of 0.1.
- We have a white light, perpendicular to that pixel, with an intensity of 2.
- We should expect the final pixel color (gray level) to equal 0.2.
- But if the image is saved in gamma space (gamma-corrected), then the pixel value saved on disk is (0.1)^0.45 = 0.355.
- When we multiply this pixel by the light, we get 0.355 × 2 = 0.71.
- When the display applies its gamma of 2.2 to the final rendered result, we get (0.71)^2.2 = 0.47.
- This final color (0.47) is much brighter than the expected value (0.2).
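The whole worked example can be reproduced in a few lines of Python (using the article’s rounded encoding exponent of 0.45):

```python
# Reproduce the worked example: lighting a gamma-encoded pixel breaks linearity.
ENCODE_GAMMA = 0.45   # rounded 1/2.2, as used in the example above
DISPLAY_GAMMA = 2.2

linear_gray = 0.1
light_intensity = 2.0

# What we want: lighting applied to the true (linear) pixel value.
expected = linear_gray * light_intensity        # 0.2

# What happens when the texture is stored gamma-corrected and the light
# is multiplied in without undoing that correction first:
stored = linear_gray ** ENCODE_GAMMA            # ~0.355, the value on disk
lit = stored * light_intensity                  # ~0.71
on_screen = lit ** DISPLAY_GAMMA                # ~0.47, washed out vs 0.2

print(f"expected {expected:.3f}, got {on_screen:.3f}")
```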
In other words, if the image is in gamma space (gamma-corrected), the linearity of the lighting calculations is broken and the final colors look washed out, especially in brighter areas (i.e. areas that receive more light or have bright textures/colors). See the comparison below.
To maintain linearity we need to follow the linear pipeline: we invert the gamma correction of the loaded texture by applying a gamma of 2.2, then do all the lighting calculations (shading), and once we have the final pixel color we apply gamma correction again with an exponent of 0.45 to negate the screen’s gamma.
So, how can we do that? Well, the graphics APIs (e.g. OpenGL) have native features that let your application do this on the GPU without writing the logic yourself. In OpenGL, we can load a texture as GL_SRGB, which tells OpenGL to convert the pixel colors to linear color space (remove the gamma correction). And if we enable GL_FRAMEBUFFER_SRGB, OpenGL will automatically apply gamma correction after each fragment shader run to all subsequent framebuffers, including the default framebuffer, so all final pixels end up gamma-corrected.
That sounds easy, but what about Unity? Well, if you set the color space to Linear, Unity will use these features to convert the rendering to linear color space, and you don’t have to do anything else.
But if you choose Gamma as your color space, you have to apply the fix yourself, which I’ll show you how to do in the coming sections.
In this section we will explore how to apply gamma correction in shaders, using the URP template.
We are going to edit some shader files of the URP package, which means our changes may get overwritten if Unity re-imports the package. That’s why it’s better to copy the Universal Render Pipeline package folder from the package cache and paste it into your Assets folder; you’ll have to change a bunch of hard-coded paths, and you’ll have to do the same with the Shader Graph package as well.
To follow along, you can simply get the repo from GitHub (link in the resources section at the end of the article) and check out the initial commit; you will get a clean project with the URP and Shader Graph packages under the Assets folder.
We need to modify the URP shaders to apply gamma correction, but we can’t do that by just modifying the material textures (maps). We have to take three main steps:
1. Fixing maps & colors
2. Fixing GI
3. Fixing Light Colors
In all of these steps we will use preprocessor directives to make sure our code runs in gamma space only, so we can switch back and forth between color spaces in Unity without changing anything in the shaders or scenes.
Fixing Maps & Colors
This is the most straightforward step. As we know, all textures are saved in gamma space, and since we are not working in linear space, Unity will not remove the gamma correction for us, so we should remove it from all sampled textures in the shader before moving on to the lighting calculation stage. In the URP Lit shader those textures are the base map, metallic map, and occlusion map; the normal map shouldn’t be touched, as it was never gamma-corrected.
We also need to convert the material colors (albedo and emission) to linear space, because when we work in gamma space, Unity sends all colors in gamma space as well.
All those corrections can be done by editing one function InitializeStandardLitSurfaceData in LitInputPass.hlsl, so the new version of the function should look like this:
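A sketch of what the edited function can look like. This is not a drop-in replacement: the exact body of InitializeStandardLitSurfaceData differs between URP versions, and some SurfaceData fields are omitted for brevity. UNITY_COLORSPACE_GAMMA is the keyword Unity defines when the project uses the Gamma color space.

```hlsl
inline void InitializeStandardLitSurfaceData(float2 uv, out SurfaceData outSurfaceData)
{
    half4 albedoAlpha = SampleAlbedoAlpha(uv, TEXTURE2D_ARGS(_BaseMap, sampler_BaseMap));
    half4 specGloss   = SampleMetallicSpecGloss(uv, albedoAlpha.a);
    half  occlusion   = SampleOcclusion(uv);
    half3 baseColor   = _BaseColor.rgb;
    half3 emission    = SampleEmission(uv, _EmissionColor.rgb,
                                       TEXTURE2D_ARGS(_EmissionMap, sampler_EmissionMap));

#ifdef UNITY_COLORSPACE_GAMMA
    // Undo the sRGB encoding of the sampled maps, and convert the material
    // colors, which Unity sends in gamma space, to linear as well.
    // The normal map is left untouched: it was never gamma-encoded.
    albedoAlpha.rgb = pow(albedoAlpha.rgb, 2.2);
    specGloss.rgb   = pow(specGloss.rgb, 2.2);
    occlusion       = pow(occlusion, 2.2);
    baseColor       = pow(baseColor, 2.2);
    emission        = pow(emission, 2.2);
#endif

    outSurfaceData.alpha      = Alpha(albedoAlpha.a, _BaseColor, _Cutoff);
    outSurfaceData.albedo     = albedoAlpha.rgb * baseColor;
    outSurfaceData.metallic   = specGloss.r;
    outSurfaceData.smoothness = specGloss.a;
    outSurfaceData.occlusion  = occlusion;
    outSurfaceData.emission   = emission;
    outSurfaceData.normalTS   = SampleNormal(uv, TEXTURE2D_ARGS(_BumpMap, sampler_BumpMap), _BumpScale);
}
```

Note that only the .rgb channels of the metallic/spec map are decoded; smoothness lives in the alpha channel and stays as-is.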
Now that we’ve removed the gamma correction from imported textures and converted colors to linear space, we need to apply gamma correction to the final color before writing it to the color buffer, so it displays correctly on screen. For that we need to change the LitPassFragment function, which is the fragment shader function in LitForwardPass.hlsl.
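A sketch of the change at the end of LitPassFragment; the lighting call and its arguments vary between URP versions, so only the tail of the function is shown here.

```hlsl
half4 LitPassFragment(Varyings input) : SV_TARGET
{
    // ... surface setup and lighting calculations unchanged ...
    half4 color = UniversalFragmentPBR(inputData, surfaceData);
    color.rgb = MixFog(color.rgb, inputData.fogCoord);

#ifdef UNITY_COLORSPACE_GAMMA
    // Re-encode the final linear color (1/2.2 ~= 0.45) so the display's
    // gamma of 2.2 cancels it out and the intended colors reach the eye.
    color.rgb = pow(color.rgb, 1.0 / 2.2);
#endif

    return color;
}
```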
Now the result looks much better than plain gamma space, although it is still far from the linear-space look:
Fixing GI
Unity calculates baked lighting in linear space but saves all the lighting data in gamma-space textures, so when this data is sampled by shaders we need to remove the gamma correction from it as well. We need to edit two functions in the Lighting.hlsl file, SampleLightmap and GlossyEnvironmentReflection:
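A sketch of the two edits in Lighting.hlsl; the sampling and HDR-decoding code that precedes each fix is elided here, since it differs between URP versions — only the gamma-space branch is the new part.

```hlsl
half3 SampleLightmap(float2 lightmapUV, half3 normalWS)
{
    // ... original lightmap sampling and decoding ...
    half3 bakedColor = /* decoded lightmap sample */;
#ifdef UNITY_COLORSPACE_GAMMA
    bakedColor = pow(bakedColor, 2.2);  // lightmap texture is gamma-encoded
#endif
    return bakedColor;
}

half3 GlossyEnvironmentReflection(half3 reflectVector, half perceptualRoughness, half occlusion)
{
    // ... original reflection-probe sampling and HDR decoding ...
    half3 irradiance = /* decoded reflection sample */;
#ifdef UNITY_COLORSPACE_GAMMA
    irradiance = pow(irradiance, 2.2);  // reflection data is gamma-encoded too
#endif
    return irradiance * occlusion;
}
```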
After fixing GI the result should look more realistic and closer to linear rendering:
Fixing Light Colors
As you may have noticed in the previous comparison, the light’s color looks more pronounced in the linear rendering, while it’s washed out in the gamma-corrected one. That’s basically because Unity sends the light color to the shader based on the current rendering color space. So the last piece of the puzzle is to make sure the light colors are in linear space. Theoretically, we could convert the light color to linear space in the shader, but in practice this is not possible, because Unity sends the light color already multiplied by the intensity, and we can’t separate the two in the shader. The best solution is to do this fix on the C# side; luckily, we have access to the URP source code, and this is actually also a better solution in terms of performance.
We need to add the following code after line 124 in ForwardLights.cs file:
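A hedged C# sketch of the idea (the exact insertion point and variable names depend on the URP version; the version assumed here is one where ForwardLights.InitializeLightConstants reads VisibleLight.finalColor into an out Vector4 named lightColor):

```csharp
// Inside ForwardLights.InitializeLightConstants. finalColor is already
// color * intensity multiplied together, so in gamma space we rebuild it
// from the Light component and linearize only the color part.
if (QualitySettings.activeColorSpace == ColorSpace.Gamma && lightData.light != null)
    lightColor = lightData.light.color.linear * lightData.light.intensity;
else
    lightColor = lightData.finalColor;
```

Color.linear performs the sRGB-to-linear conversion, which is exactly the correction the shader could not do once the intensity had been multiplied in.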
With this fix applied, we should get a result very similar to linear rendering.
- A screen’s gamma typically ranges from 2 to 2.4; it is not always 2.2.
- As a performance optimization, we could use a gamma value of 2 instead of 2.2; in that case, we can replace pow(value, gamma) with value * value, and pow(value, gammaInv) with sqrt(value).
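A quick Python check of how much accuracy this shortcut gives up (a sketch, sampling the [0, 1] range):

```python
import math

def max_error(steps=256):
    """Largest gap between the exact 2.2 curves and the cheap gamma-2
    approximations (x*x for decoding, sqrt(x) for encoding) over [0, 1]."""
    worst = 0.0
    for i in range(steps + 1):
        x = i / steps
        worst = max(worst,
                    abs(x ** 2.2 - x * x),                 # decode approximation
                    abs(x ** (1 / 2.2) - math.sqrt(x)))    # encode approximation
    return worst

print(f"max absolute error over [0, 1]: {max_error():.3f}")
```

The error stays small (a few percent at worst), which is often an acceptable trade for dropping the pow calls on mobile GPUs.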
- The skybox is still rendered in gamma space, as it doesn’t use the URP Lit shader. You can download the skybox shader from the Unity download archive and apply gamma correction to it as well if you want a fully consistent rendering result.