Dealing with Premultiplied alpha on iOS
In computer graphics, there are two different ways to represent the opacity of a color value: straight alpha and premultiplied alpha.
When using straight, also known as linear alpha:
- RGB values specify the color of the thing being drawn.
- The alpha value specifies how solid it is.
In this world, RGB and alpha are independent. We can change one without affecting the other. To make an object fade out, we would gradually reduce its alpha value while leaving RGB unchanged.
When using premultiplied:
- RGB specifies how much color the thing being drawn contributes to the output.
- The alpha value specifies how much it obscures whatever is behind it.
In this world, RGB and alpha are linked. To make an object transparent we must reduce both its RGB (to contribute less color) and also its alpha (to obscure less of whatever is behind it). Fully transparent objects no longer have any color at all, so there is only one value that represents 100% transparency: RGB and alpha all zero.
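As a small sketch (my own illustration, not from the original article), premultiplying a straight color just scales every color channel by alpha:

```swift
// Illustrative only: 8-bit RGBA components in 0...255.
struct RGBA { var r, g, b, a: Int }

// Premultiply: scale each color channel by alpha / 255.
func premultiply(_ c: RGBA) -> RGBA {
    RGBA(r: c.r * c.a / 255,
         g: c.g * c.a / 255,
         b: c.b * c.a / 255,
         a: c.a)
}

// A half-transparent white, straight (255, 255, 255, 127),
// becomes premultiplied (127, 127, 127, 127).
```

Note how a fully transparent pixel (alpha 0) always premultiplies to (0, 0, 0, 0), whatever its straight color was, which is exactly the "only one value represents 100% transparency" property described above.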
In digital imaging, a pixel is the smallest addressable element of a raster image or display device: the smallest controllable unit of a picture represented on the screen.
The RGB color model is an additive color model in which red, green, and blue light are added together in various ways to reproduce a broad array of colors. The name of the model comes from the initials of the three additive primary colors, red, green, and blue.
A few days ago, I hit a problem while doing some image processing in one of our apps. It was just simple bit manipulation inside an image: running the app on the iOS simulator everything was fine, but on a real device I got different values.
The real device (an iPhone XR, in this case) renders the image using premultiplied alpha, while the simulator uses straight alpha. For example, for the same pixel I got this:
On Simulator: R = 254, G = 254, B = 254, A = 254
On Real Device: R = 253, G = 253, B = 253, A = 254
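A quick sanity check (my own arithmetic, not from the article) reproduces the device value by premultiplying the simulator value:

```swift
// Premultiplying the straight value 254 by alpha 254/255
// lands exactly on the value the device reported.
let straightValue = 254.0
let alpha = 254.0 / 255.0
let premultiplied = (straightValue * alpha).rounded()
print(premultiplied)  // 253.0
```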
How to solve it? It’s not hard, but it can be confusing the first time you deal with it. My main problem was getting the correct color of a pixel, so I created this extension in Swift to help me:
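The article’s original snippet is not reproduced here; the following is a minimal sketch of such an extension, assuming an 8-bit-per-channel RGBA bitmap (the method name `pixelColor(at:)` is mine):

```swift
import UIKit

extension UIImage {
    /// Reads the raw bytes of the backing CGImage at the given pixel
    /// coordinate. Assumes an 8-bit-per-channel RGBA (alpha-last) layout.
    func pixelColor(at point: CGPoint) -> UIColor? {
        guard let cgImage = cgImage,
              let data = cgImage.dataProvider?.data,
              let bytes = CFDataGetBytePtr(data) else { return nil }

        let bytesPerPixel = cgImage.bitsPerPixel / 8
        let offset = Int(point.y) * cgImage.bytesPerRow
                   + Int(point.x) * bytesPerPixel

        return UIColor(red:   CGFloat(bytes[offset])     / 255.0,
                       green: CGFloat(bytes[offset + 1]) / 255.0,
                       blue:  CGFloat(bytes[offset + 2]) / 255.0,
                       alpha: CGFloat(bytes[offset + 3]) / 255.0)
    }
}
```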
Now we have the UIColor for one exact point, but it doesn’t take premultiplied alpha into account. Apple provides a property on CGImage, alphaInfo (of type CGImageAlphaInfo), that tells you which kind of alpha the image has.
And for premultiplied we have two cases:
kCGImageAlphaPremultipliedFirst
The alpha component is stored in the most significant bits of each pixel and the color components have already been multiplied by this alpha value. For example, premultiplied ARGB.
kCGImageAlphaPremultipliedLast
The alpha component is stored in the least significant bits of each pixel and the color components have already been multiplied by this alpha value. For example, premultiplied RGBA.
In my case I had kCGImageAlphaPremultipliedLast.
Since it’s just a multiplication, it’s surprisingly easy to fix: the RGB values have simply been multiplied by the alpha value.
So, we need to invert this, dividing by alpha, to get the straight color back.
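In 8-bit integer form that division looks like this (a sketch of the math, with rounded division to undo the rounding that happened during premultiplication; the helper name is mine):

```swift
/// Un-premultiplies one 8-bit color channel: channel * 255 / alpha,
/// using rounded integer division. Fully transparent pixels carry no
/// color information, so they map back to 0.
func unpremultiply(_ channel: UInt8, alpha: UInt8) -> UInt8 {
    guard alpha != 0 else { return 0 }
    let value = (Int(channel) * 255 + Int(alpha) / 2) / Int(alpha)
    return UInt8(min(255, value))
}

print(unpremultiply(253, alpha: 254))  // 254, the straight value again
```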
After that, I created this extension to get the color of a pixel, passing the CGImageAlphaInfo as a parameter.
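Again, the original extension is not shown here; below is a hedged sketch of the idea, handling only the kCGImageAlphaPremultipliedLast layout described above (the method name `straightPixelColor(at:)` is mine, and instead of passing the alpha info as a parameter this version reads it straight off the CGImage, where it is always available):

```swift
import UIKit

extension UIImage {
    /// Reads a pixel and, when the image's alphaInfo says the channels
    /// were premultiplied, divides them by alpha to recover the straight
    /// color. Assumes an 8-bit-per-channel RGBA (alpha-last) layout.
    func straightPixelColor(at point: CGPoint) -> UIColor? {
        guard let cgImage = cgImage,
              let data = cgImage.dataProvider?.data,
              let bytes = CFDataGetBytePtr(data) else { return nil }

        let bytesPerPixel = cgImage.bitsPerPixel / 8
        let offset = Int(point.y) * cgImage.bytesPerRow
                   + Int(point.x) * bytesPerPixel

        var r = CGFloat(bytes[offset])     / 255.0
        var g = CGFloat(bytes[offset + 1]) / 255.0
        var b = CGFloat(bytes[offset + 2]) / 255.0
        let a = CGFloat(bytes[offset + 3]) / 255.0

        if cgImage.alphaInfo == .premultipliedLast, a > 0 {
            r /= a
            g /= a
            b /= a
        }
        return UIColor(red: r, green: g, blue: b, alpha: a)
    }
}
```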
Fixed. Now, with the straight values, I could do my image processing.
This is a very simple case, but if you are interested in image processing on iOS, I recommend starting with the Core Image documentation linked below.
Thanks for reading.
References:
[1] https://developer.apple.com/documentation/coreimage
[2] https://microsoft.github.io/Win2D/html/PremultipliedAlpha.htm