
Dealing with Premultiplied alpha on iOS

Feb 20, 2020

In computer graphics, there are two different ways to represent the opacity of a color value: Straight alpha and Premultiplied alpha.

When using straight alpha, also known as linear alpha:

- RGB values specify the color of the thing being drawn.
- The alpha value specifies how solid it is.

In this world, RGB and alpha are independent. We can change one without affecting the other. To make an object fade out, we would gradually reduce its alpha value while leaving RGB unchanged.

When using premultiplied alpha:
- RGB specifies how much color the thing being drawn contributes to the output.
- The alpha value specifies how much it obscures whatever is behind it.

In this world, RGB and alpha are linked. To make an object transparent we must reduce both its RGB (to contribute less color) and also its alpha (to obscure less of whatever is behind it). Fully transparent objects no longer have any color at all, so there is only one value that represents 100% transparency: RGB and alpha all zero.
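For example (assuming 8-bit channels), a pure red at 50% opacity is (R=255, G=0, B=0, A=128) in straight alpha, but roughly (R=128, G=0, B=0, A=128) once premultiplied; at alpha 0 every channel collapses to (0, 0, 0, 0).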

In digital imaging, a pixel is a physical point in a raster image or the smallest addressable element in an all points addressable display device; so it is the smallest controllable element of a picture represented on the screen.

The RGB color model is an additive color model in which red, green, and blue light are added together in various ways to reproduce a broad array of colors. The name of the model comes from the initials of the three additive primary colors, red, green, and blue.

Figure 1: Representation of an image with 3x3 pixels, with the 4 channels: Red (R), Green (G), Blue (B) and Alpha (represented in purple)

A few days ago, I ran into a problem while doing some image processing in one of our apps. It was just a simple bit manipulation inside an image: running the app on the iOS Simulator everything was fine, but on a real device I got different values.

The real device (an iPhone XR in this case) renders the image using premultiplied alpha, while the simulator uses straight alpha. For example, for the same pixel I got:

On Simulator: R = 254, G = 254, B = 254, A = 254
On Real Device: R = 253, G = 253, B = 253, A = 254
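Assuming 8-bit channels, that one-unit difference is exactly what premultiplication predicts: 254 × (254 / 255) ≈ 253.0, which is stored as 253.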

How do we solve it? It’s not hard, but it can feel that way the first time you deal with it. First, my main problem was getting the correct color of a pixel, so I created this extension in Swift to help me:

Code 1: Swift extension to get a pixel’s color from a UIImage
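The embedded gist isn’t reproduced in this text; a minimal sketch of such an extension, assuming an 8-bit-per-channel RGBA bitmap backing the image, could look like this:

```swift
import UIKit

extension UIImage {
    /// Returns the color of the pixel at `point`, or nil if the point is
    /// outside the image or the underlying bitmap can't be read.
    /// Sketch only: assumes an 8-bit-per-channel CGImage in RGBA byte order.
    func pixelColor(at point: CGPoint) -> UIColor? {
        guard let cgImage = self.cgImage,
              let data = cgImage.dataProvider?.data,
              let bytes = CFDataGetBytePtr(data) else { return nil }

        let x = Int(point.x), y = Int(point.y)
        guard x >= 0, y >= 0, x < cgImage.width, y < cgImage.height else { return nil }

        let bytesPerPixel = cgImage.bitsPerPixel / 8
        let offset = y * cgImage.bytesPerRow + x * bytesPerPixel

        // Assumes RGBA layout (e.g. kCGImageAlphaPremultipliedLast).
        let r = CGFloat(bytes[offset])     / 255.0
        let g = CGFloat(bytes[offset + 1]) / 255.0
        let b = CGFloat(bytes[offset + 2]) / 255.0
        let a = CGFloat(bytes[offset + 3]) / 255.0
        return UIColor(red: r, green: g, blue: b, alpha: a)
    }
}
```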

Now we have the UIColor for an exact point, but it doesn’t account for premultiplied alpha. Fortunately, CGImage provides an alphaInfo property of type CGImageAlphaInfo that tells you which kind of alpha the image uses.

Figure 2: CGImageAlphaInfo documentation https://developer.apple.com/documentation/coregraphics/cgimagealphainfo?language=swift

And for premultiplied alpha we have two cases:

kCGImageAlphaPremultipliedFirst

The alpha component is stored in the most significant bits of each pixel and the color components have already been multiplied by this alpha value. For example, premultiplied ARGB.

kCGImageAlphaPremultipliedLast

The alpha component is stored in the least significant bits of each pixel and the color components have already been multiplied by this alpha value.

Figure 3: List of all CGImageAlphaInfo options: https://developer.apple.com/documentation/coregraphics/cgimagealphainfo/kcgimagealphapremultipliedfirst?language=swift

In my case I had kCGImageAlphaPremultipliedLast.
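A quick way to check this at runtime is to read the alphaInfo property of the backing CGImage; here is a small hypothetical helper (the function name is mine, not from the article):

```swift
import UIKit

// Hypothetical helper: returns true when the image's backing CGImage
// stores premultiplied alpha.
func isPremultiplied(_ image: UIImage) -> Bool {
    guard let alphaInfo = image.cgImage?.alphaInfo else { return false }
    switch alphaInfo {
    case .premultipliedFirst, .premultipliedLast:
        return true
    default:
        return false
    }
}
```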

Since it’s just a multiplication, it’s surprisingly easy to fix: the RGB values have simply been multiplied by the (normalized) alpha value.

Figure 4: Pseudo code to calculate the premultiplied value, where premultiplied and straight represent color objects with RGBA properties
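The pseudo code image isn’t reproduced in the text; a sketch of the same idea in Swift, using a hypothetical RGBA value type with 0–255 integer channels, would be:

```swift
// Hypothetical RGBA value type with 0–255 integer channels.
struct RGBA { var r, g, b, a: Int }

// Premultiply: scale every color channel by the normalized alpha (a / 255).
func premultiply(_ straight: RGBA) -> RGBA {
    RGBA(r: straight.r * straight.a / 255,
         g: straight.g * straight.a / 255,
         b: straight.b * straight.a / 255,
         a: straight.a)
}
```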

So, to get the straight color back, we need to revert this by doing the opposite: dividing by the alpha.

Figure 5: Pseudo code to calculate the straight value
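And the inverse operation, again as a sketch with the same hypothetical RGBA type:

```swift
// Same hypothetical RGBA value type as in the previous sketch.
struct RGBA { var r, g, b, a: Int }

// Un-premultiply: divide every color channel by the normalized alpha.
// An alpha of 0 means the original color is unrecoverable, so return as-is.
func unpremultiply(_ premultiplied: RGBA) -> RGBA {
    guard premultiplied.a > 0 else { return premultiplied }
    return RGBA(r: min(255, premultiplied.r * 255 / premultiplied.a),
                g: min(255, premultiplied.g * 255 / premultiplied.a),
                b: min(255, premultiplied.b * 255 / premultiplied.a),
                a: premultiplied.a)
}
```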

After that, I created this extension to get the color of a pixel, passing the CGImageAlphaInfo as a parameter.

Code 2: Extension to get the true RGB by removing the premultiplied alpha value
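That embedded extension isn’t reproduced here either; a possible reconstruction (a sketch, not necessarily the article’s exact code) could look like this:

```swift
import UIKit

extension UIColor {
    /// Returns the straight-alpha version of this color, undoing
    /// premultiplication when `alphaInfo` indicates it was applied.
    /// Sketch only: assumes the color is in an RGB-compatible color space.
    func straightAlphaColor(for alphaInfo: CGImageAlphaInfo) -> UIColor {
        var r: CGFloat = 0, g: CGFloat = 0, b: CGFloat = 0, a: CGFloat = 0
        guard getRed(&r, green: &g, blue: &b, alpha: &a) else { return self }

        switch alphaInfo {
        case .premultipliedFirst, .premultipliedLast:
            // Fully transparent pixels carry no color information to recover.
            guard a > 0 else { return self }
            return UIColor(red: min(1, r / a),
                           green: min(1, g / a),
                           blue: min(1, b / a),
                           alpha: a)
        default:
            return self
        }
    }
}
```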

Fixed. Now with the straight value I could do my image processing.

This is a very simple case, but if you are more interested in image processing on iOS, I recommend starting with the Core Image documentation at:

https://developer.apple.com/documentation/coreimage

Thanks for reading.

References:

[1] https://developer.apple.com/documentation/coregraphics/cgimagealphainfo/kcgimagealphapremultipliedfirst?language=swift

[2] https://developer.apple.com/documentation/coreimage

[3] https://microsoft.github.io/Win2D/html/PremultipliedAlpha.htm

[4] https://sisu.ut.ee/imageprocessing/book/1
