Adventures in Wide Color: An iOS Exploration

P3’s color gamut is about 25% larger than sRGB’s. To see that extra color in action, my plan was to:

  • Set up an AVCaptureSession that streams pixel buffers from the camera in the P3 color space, if the device supports it (a quick support check is sketched just after this list).
  • Create a CIContext whose workingColorSpace is Apple’s extended sRGB color space. Using the extended sRGB format is crucial because “wide” color information will be both preserved and easily identifiable after converting from P3. Unlike sRGB, which clamps values to the range 0.0 to 1.0 and thus discards any wide-color information, extended sRGB allows values outside of that range, which leaves open the possibility that wide-color-aware displays can use them.
  • Write a Metal fragment shader that lets wide colors pass through unchanged but converts “narrow” colors to a shade of gray.
  • Use the CIContext and a custom CIFilter, built with the Metal shader, to take each pixel buffer in the stream, filter it, and render it to the screen.
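
Before diving in, it’s worth confirming that the device’s screen can actually display P3 content. UIKit exposes this through the trait collection’s displayGamut property; a minimal check (my own sketch, not code from the project) looks like this:

import UIKit

// Returns true when the main screen reports a Display P3 gamut.
func screenSupportsWideColor() -> Bool {
    return UIScreen.main.traitCollection.displayGamut == .P3
}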

Step 1: Creating the AVCaptureSession

Apple’s AVCam sample project is an excellent template for capturing images from the camera, and I was able to adapt it for my project with few changes. The one setting worth calling out is the session preset:

session.sessionPreset = .photo
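
For context, the adapted capture setup looks roughly like the sketch below. It follows the AVCam pattern rather than reproducing the project’s exact code, and names like videoDataOutput are my own. Because the session’s automaticallyConfiguresCaptureDeviceForWideColorContent property defaults to true, using the .photo preset lets it switch the camera to P3 capture on supported hardware.

import AVFoundation

// Sketch of the capture pipeline (adapted from the AVCam pattern, not verbatim).
let session = AVCaptureSession()
let videoDataOutput = AVCaptureVideoDataOutput()

session.beginConfiguration()
session.sessionPreset = .photo // the .photo preset allows wide-gamut (P3) capture where supported

if let camera = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back),
   let input = try? AVCaptureDeviceInput(device: camera),
   session.canAddInput(input) {
    session.addInput(input)
}

// Deliver BGRA pixel buffers; each buffer arrives tagged with the capture color space.
videoDataOutput.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
if session.canAddOutput(videoDataOutput) {
    session.addOutput(videoDataOutput)
}
session.commitConfiguration()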

Step 2: Creating the CIContext

It’s easy to lose wide-color information when rendering an image. As Mike Krieger of Instagram points out in this great blog post, iOS 10 introduced a piece of wide-color-aware API called UIGraphicsImageRenderer to help with rendering wide-color images in Core Graphics. For a Core Image pipeline, the equivalent step is configuring the CIContext’s working color space and pixel format:

private lazy var ciContext: CIContext = {
    let space = CGColorSpace(name: CGColorSpace.extendedSRGB)!
    let format = NSNumber(value: kCIFormatRGBAh) // full-float pixels
    var options = [String: Any]()
    options[kCIContextWorkingColorSpace] = space
    options[kCIContextWorkingFormat] = format
    return CIContext(options: options)
}()
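
To make “values outside of that range” concrete, here’s a quick illustration of my own (not from the project): converting a fully saturated Display P3 red into extended sRGB produces components outside 0.0 to 1.0.

import CoreGraphics

// Pure P3 red expressed in extended sRGB lands above 1.0 in red
// and below 0.0 in green and blue (roughly [1.09, -0.23, -0.15, 1.0]).
let p3 = CGColorSpace(name: CGColorSpace.displayP3)!
let extended = CGColorSpace(name: CGColorSpace.extendedSRGB)!
let components: [CGFloat] = [1, 0, 0, 1]
let p3Red = CGColor(colorSpace: p3, components: components)!
if let converted = p3Red.converted(to: extended, intent: .defaultIntent, options: nil) {
    print(converted.components ?? [])
}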

Step 3: Creating the CIFilter

The next step was building a filter to convert “non-wide” pixels to shades of gray. I decided an interesting way to do this would be to create a custom CIFilter that was backed by a Metal shader. The basic steps were:

  1. Write the Metal shader
  2. Create a CIKernel from the shader
  3. Create a CIFilter subclass to apply the CIKernel

#include <metal_stdlib>
#include <CoreImage/CoreImage.h>
using namespace metal;

static bool isWideGamut(float value) {
    return value > 1.0 || value < 0.0;
}

extern "C" { namespace coreimage {

    float4 wide_color_kernel(sampler src) {
        float4 color = src.sample(src.coord());
        if (isWideGamut(color[0])
            || isWideGamut(color[1])
            || isWideGamut(color[2])) {
            // Any channel outside 0.0–1.0 in extended sRGB means the pixel is wide color: keep it.
            return color;
        } else {
            // Otherwise collapse to grayscale using standard luminance weights.
            float3 weights = float3(0.3, 0.59, 0.11);
            float luminance = dot(weights, color.rgb);
            return float4(float3(luminance), 1.0);
        }
    }

}}
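
Steps 2 and 3 (creating the CIKernel and the CIFilter subclass) look roughly like the sketch below. The class name, the “default.metallib” resource name, and the force-unwraps are my own simplifications, so the project’s actual code may differ. Note that Metal-backed CIKernels also need the -fcikernel Metal compiler flag and the -cikernel linker flag in the build settings.

import CoreImage
import Foundation

// A sketch of the custom filter; the class name and metallib name are assumptions.
final class WideColorFilter: CIFilter {
    @objc dynamic var inputImage: CIImage?

    // Load the compiled Metal library and look up the kernel by function name.
    private static let kernel: CIKernel = {
        let url = Bundle.main.url(forResource: "default", withExtension: "metallib")!
        let data = try! Data(contentsOf: url)
        return try! CIKernel(functionName: "wide_color_kernel", fromMetalLibraryData: data)
    }()

    override var outputImage: CIImage? {
        guard let input = inputImage else { return nil }
        return WideColorFilter.kernel.apply(
            extent: input.extent,
            roiCallback: { _, rect in rect }, // 1:1 sampling: each output pixel reads the matching input pixel
            arguments: [input])
    }
}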

Step 4: Putting It Together

With that working, the last step was to grab each pixel buffer as it arrives, apply the filter, and then display it on the screen. This involved implementing an AVCaptureVideoDataOutputSampleBufferDelegate callback method, which I set up to be called on a dedicated serial background queue.
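
Here’s a sketch of that callback. The class and property names are mine, WideColorFilter is the filter sketched above, and the original project may well render through Metal rather than a CGImage; treat this as one simple way to wire it up.

import AVFoundation
import CoreImage
import UIKit

// Hypothetical renderer that owns the wide-color CIContext and the custom filter.
final class FrameRenderer: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let ciContext: CIContext
    let filter = WideColorFilter()
    let targetLayer: CALayer

    init(ciContext: CIContext, targetLayer: CALayer) {
        self.ciContext = ciContext
        self.targetLayer = targetLayer
        super.init()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

        // Core Image reads the buffer's color space tag and converts the frame
        // into the extended sRGB working space configured in Step 2.
        filter.inputImage = CIImage(cvPixelBuffer: pixelBuffer)
        guard let filtered = filter.outputImage else { return }

        // Render into Display P3 so wide colors survive on the way to the screen.
        let p3 = CGColorSpace(name: CGColorSpace.displayP3)!
        guard let cgImage = ciContext.createCGImage(filtered, from: filtered.extent,
                                                    format: kCIFormatRGBA8,
                                                    colorSpace: p3) else { return }
        DispatchQueue.main.async {
            self.targetLayer.contents = cgImage
        }
    }
}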

Up and Running

In any event, the experiment app ran very smoothly on my iPhone X: Core Image seemed more than capable of handling the 30 camera frames per second it was being asked to render. Meanwhile, I was surprised how much wide color I found in the world — even on a gray day in downtown Manhattan. Here are a few screenshots.

A green skirt, an orange bike and a yellow cab were all outside of standard RGB.
