Programmer’s Guide to Gamma Correction
If you’re rendering graphics with lighting calculations, you should understand gamma correction. This guide covers what you need to know to get correct lighting when working with sRGB images.
A monitor produces light, and our eyes see that light. We call this physical light intensity “linear-RGB.” In 8-bit graphics, 0 is the darkest shade of grey a particular monitor can produce, and 255 is the brightest. You would expect 127 to be a mid-grey shade right in between those two, but this is not the case.
The reason is that our eyes do not perceive light linearly; we perceive it on a curve. For survival, it’s much more useful to see well at night than during the day, so we distinguish many more shades of grey near the dark end of the range than near the bright end.
We have two problems with linear light when working with computer graphics:
- Bits are wasted representing many lighter shades that look very similar
- A shade halfway between two other shades does not look halfway to our eyes
The sRGB color space addresses these problems. sRGB follows a curve that is closer to how our eyes perceive light: 127 in sRGB is what we perceive as mid-grey, and the encoding gives more bits to darker shades and fewer to brighter ones. Nearly every image you have ever seen on a computer is sRGB. This conversion is even done by your camera when you take a picture.
To convert from sRGB to linear-RGB, a gamma function is applied to each color component x of each pixel (assuming the values are normalized to [0,1] rather than [0,255]): f(x) = x^2.2. (This is a common approximation; the exact sRGB transfer function is piecewise, with a small linear segment near zero.)
This function is applied by your monitor with what is called “display gamma.” Let’s say you send pixels with brightness of 127 to your monitor: the display gamma is applied and the light emitted is equal to 55 in linear-RGB, which you perceive as mid-grey. Without display gamma, sRGB images would look washed out and too bright.
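As a small sketch (Python here purely for illustration), the gamma function above reproduces the 127 → 55 example:

```python
def srgb_to_linear(x: float) -> float:
    """Convert a normalized sRGB component in [0, 1] to linear-RGB
    using the gamma-2.2 approximation from the text."""
    return x ** 2.2

# 127 in sRGB is perceptual mid-grey; after display gamma is applied,
# the emitted light corresponds to roughly 55 in linear-RGB.
linear = srgb_to_linear(127 / 255)
print(round(linear * 255))  # → 55
```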
If these conversions are already done for us, why do we need to worry about gamma correction? If you’re just sending images to the graphics card, you don’t; but if you’re doing lighting calculations, you do. Lighting math expects colors to be linear. If you sample an sRGB diffuse texture and multiply it by a lighting value, the results will be exaggerated and unrealistic. To get correct lighting, first convert the sRGB color to linear-RGB when sampling a texture, then do the calculations, and finally convert the result back to sRGB before writing to the framebuffer. Converting from linear-RGB to sRGB is simply the inverse: f(x) = x^(1/2.2)
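That sample → linearize → light → re-encode pipeline can be sketched as follows (a Python illustration; the function names are mine, not from any API):

```python
def srgb_to_linear(x: float) -> float:
    return x ** 2.2

def linear_to_srgb(x: float) -> float:
    # inverse of the gamma function: x^(1/2.2)
    return x ** (1 / 2.2)

def shade(texel_srgb: float, light: float) -> float:
    """Sample -> convert to linear -> apply lighting -> convert back."""
    linear = srgb_to_linear(texel_srgb)  # decode the texture sample
    lit = min(linear * light, 1.0)       # lighting math in linear space
    return linear_to_srgb(lit)           # encode before the framebuffer

# With full light, a texel round-trips unchanged; dimming darkens it.
assert abs(shade(0.5, 1.0) - 0.5) < 1e-9
assert shade(0.5, 0.25) < 0.5
```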
There are two points in OpenGL that can be set up to do gamma conversions for you: textures and framebuffers. Sampling an sRGB texture returns a color shifted down to linear-RGB; writing to an sRGB framebuffer does the inverse. The following table shows the results for all four combinations of linear-RGB and sRGB textures and framebuffers.

| # | Texture | Framebuffer | Result |
|---|---------|-------------|--------|
| 1 | linear-RGB | linear-RGB | No conversion; shader math runs on sRGB values |
| 2 | linear-RGB | sRGB | Converted on write only; washed out, too bright |
| 3 | sRGB | linear-RGB | Converted on sample only; too dark |
| 4 | sRGB | sRGB | Linear in the shader, re-encoded on write |
Situations 1 and 4 will produce the same result if there’s no lighting, but with lighting, 4 will look much better since the color gets mapped to linear-RGB in the fragment shader. Mismatching them, as in situations 2 and 3, will result in an image that’s washed out or too dark. Remember that display gamma is applied by the monitor at the end of each of these so that 127 appears mid-grey to your eyes.
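The four situations can be simulated numerically. In this Python sketch (not real GL calls), sampling an sRGB texture applies x^2.2 and writing to an sRGB framebuffer applies x^(1/2.2), with no lighting in between:

```python
def pipeline(value: float, srgb_texture: bool, srgb_framebuffer: bool) -> float:
    """Trace a stored texel value from sample to write, no lighting."""
    if srgb_texture:
        value = value ** 2.2        # GPU decodes the sRGB texel to linear
    if srgb_framebuffer:
        value = value ** (1 / 2.2)  # GPU re-encodes on framebuffer write
    return value

v = 127 / 255
print(pipeline(v, False, False))  # 1: unchanged, but never linearized
print(pipeline(v, False, True))   # 2: brighter -- washed out
print(pipeline(v, True, False))   # 3: darker -- too dark
print(pipeline(v, True, True))    # 4: unchanged, and the shader saw linear
```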
Textures used as data, such as normal maps, should not be gamma corrected when sampling: create these as linear-RGB textures, while color textures the user will see should be sRGB.
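To see why this matters, consider a flat tangent-space normal (0, 0, 1), conventionally stored as the texel (0.5, 0.5, 1.0). If the GPU mistakenly decodes that texel as sRGB, the x and y components come back skewed (a Python sketch of the per-component decode):

```python
def decode_normal(texel: float, srgb: bool) -> float:
    """Map a stored [0, 1] component back to a [-1, 1] normal component."""
    if srgb:
        texel = texel ** 2.2  # unwanted sRGB decode applied to a data texture
    return 2.0 * texel - 1.0

# A stored 0.5 should decode to 0.0 (a normal pointing straight out).
print(decode_normal(0.5, srgb=False))  # → 0.0
print(decode_normal(0.5, srgb=True))   # ≈ -0.56: the normal is badly skewed
```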