[iOS] Make Video with Alpha Channel

f.yuki
2 min read · Jul 24, 2021

Create a video with an alpha channel in iOS

You can create videos with transparency information in iOS by using AVAssetWriter!

The Photos app in iOS can play videos with transparency.
In such a video, you can see the content behind the video because the video has an alpha channel.

Example App

Repository: https://github.com/fuziki/VideoCreator

Open Examples/NativeExamples/NativeExamples.xcodeproj

Points for saving videos with alpha channel

Specify hevcWithAlpha when instantiating AVAssetWriterInput.

let videoConfigs: [String: Any] = [
    AVVideoCodecKey: AVVideoCodecType.hevcWithAlpha,
    AVVideoWidthKey: 1920,
    AVVideoHeightKey: 1080
]
let videoAssetWriterInput = AVAssetWriterInput(mediaType: .video, outputSettings: videoConfigs)

The CVPixelBuffer backing each CMSampleBuffer is expected to use the kCVPixelFormatType_32BGRA pixel format.

let options = [kCVPixelBufferIOSurfacePropertiesKey: [:]] as [String: Any]
let status = CVPixelBufferCreate(nil, width, height,
                                 kCVPixelFormatType_32BGRA,
                                 options as CFDictionary,
                                 &pixelBuffer)

You can get a video with an alpha channel by generating image data with transparency information and writing it to AVAssetWriterInput.
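To later wrap that pixel buffer in a CMSampleBuffer, a matching CMVideoFormatDescription is also needed. A minimal sketch, assuming pixelBuffer is the buffer created above:

```swift
// Hedged sketch: derive a format description from the pixel buffer,
// so it can be paired with the buffer when building a CMSampleBuffer.
// `pixelBuffer` is assumed to be the CVPixelBuffer created above.
var formatDescription: CMFormatDescription?
let status = CMVideoFormatDescriptionCreateForImageBuffer(
    allocator: kCFAllocatorDefault,
    imageBuffer: pixelBuffer,
    formatDescriptionOut: &formatDescription)
```

If status is noErr, formatDescription describes the 32BGRA buffer and can be reused for every frame, since the buffer's dimensions and format do not change.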

Embed it in your project

Start Recording

I specified mov as the file type.
Any container that can store hevcWithAlpha should be fine.

let assetWriter = try! AVAssetWriter(outputURL: url, fileType: AVFileType.mov)
let videoConfigs: [String: Any] = [
    AVVideoCodecKey: AVVideoCodecType.hevcWithAlpha,
    AVVideoWidthKey: 1920,
    AVVideoHeightKey: 1080
]
let videoAssetWriterInput = AVAssetWriterInput(mediaType: .video, outputSettings: videoConfigs)
videoAssetWriterInput.expectsMediaDataInRealTime = true
assetWriter.add(videoAssetWriterInput)
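The writer also has to be started before any frames can be appended. A hedged sketch; the source time of .zero is an assumption, and a real app would use the timestamp of its first frame:

```swift
// Start writing and open the session; samples appended before this
// point would be rejected by AVAssetWriterInput.
assetWriter.startWriting()
assetWriter.startSession(atSourceTime: .zero)
```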

Write CMSampleBuffer

Convert MTLTexture -> CIImage -> CVPixelBuffer -> CMSampleBuffer

// `pixelBuffer`, `context` (a CIContext), and `formatDescription` are
// instance properties prepared in advance.
func make(mtlTexture: MTLTexture, time: CMTime) -> CMSampleBuffer? {
    // Wrap the Metal texture in a CIImage, then render it into the
    // reusable 32BGRA pixel buffer.
    let ci = CIImage(mtlTexture: mtlTexture, options: nil)!
    CVPixelBufferLockBaseAddress(pixelBuffer, CVPixelBufferLockFlags(rawValue: 0))
    context.render(ci, to: pixelBuffer)
    var res: CMSampleBuffer? = nil
    var sampleTiming = CMSampleTimingInfo()
    sampleTiming.presentationTimeStamp = time
    let _ = CMSampleBufferCreateForImageBuffer(
        allocator: kCFAllocatorDefault,
        imageBuffer: pixelBuffer,
        dataReady: true,
        makeDataReadyCallback: nil,
        refcon: nil,
        formatDescription: formatDescription,
        sampleTiming: &sampleTiming,
        sampleBufferOut: &res)
    CVPixelBufferUnlockBaseAddress(pixelBuffer, CVPixelBufferLockFlags(rawValue: 0))
    return res
}

Write the created CMSampleBuffer to videoAssetWriterInput.

videoAssetWriterInput.append(sample)
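In practice it is safer to check the input's readiness before appending. A hedged sketch; here a not-ready frame is simply dropped, though a real app might queue it instead:

```swift
// Only append when the input can accept more data.
if videoAssetWriterInput.isReadyForMoreMediaData {
    videoAssetWriterInput.append(sample)
}
```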

Finish Recording

Invoke finishWriting and a mov file will be created.

assetWriter.finishWriting(completionHandler: { })
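A common pattern (a sketch, not taken from the repository) is to mark the input as finished first and check the writer's status in the completion handler:

```swift
// Signal that no more samples are coming, then finalize the file.
videoAssetWriterInput.markAsFinished()
assetWriter.finishWriting {
    // status is .completed on success; on .failed, assetWriter.error
    // describes what went wrong.
    print("finished with status: \(assetWriter.status.rawValue)")
}
```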

Save Video

The video is written to the specified url, so save it from there to the album.
ALAssetsLibrary is deprecated, so it would be better to use the Photos framework.

ALAssetsLibrary()
    .writeVideoAtPath(toSavedPhotosAlbum: url) { (url, error) in
        print("url: \(url), error: \(error)")
    }
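A hedged equivalent using the Photos framework might look like this; it assumes the app has photo-library add permission (NSPhotoLibraryAddUsageDescription in Info.plist):

```swift
import Photos

// Register the finished mov file with the photo library.
PHPhotoLibrary.shared().performChanges({
    PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: url)
}) { success, error in
    print("saved: \(success), error: \(String(describing: error))")
}
```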

Conclusion

The ability to save videos with alpha channels will be very useful and versatile!
