Using Sky Segmentation to create stunning background animations in iOS

Eric Hsiao
4 min read · Jun 24, 2019


Image segmentation is a computer vision task that labels each pixel in an image with the type of object it belongs to. In short, it helps us better understand images and video so that we can build engaging experiences that users love.

In a previous post, we explored how developers can use People Segmentation in order to create a portrait mode effect by separating the background from people detected in an image and then blurring it. Here, we’ll work with another flavor of on-device image segmentation: Sky Segmentation.

Sky Segmentation works the same way as People Segmentation, using an on-device machine learning model to allow developers to associate pixels of an image or video frame with the sky. For this tutorial, we’ll work through the steps to create a sky animation using Image Segmentation by Fritz.

Starting out

Download and open the repository for this tutorial.

git clone git@github.com:fritzlabs/fritz-examples.git

It’s easiest to set up Fritz using CocoaPods. In the iOS/FritzSkyReplacementDemo starter project folder, run:

pod repo update
pod install

Open the FritzSkyReplacementDemo.xcworkspace in Xcode.

Overview

We’ll be using the Fritz iOS Image Segmentation feature to generate masks for the sky in photos. The Fritz SDK comes with a variety of pre-built features that run directly on your phone.

All Fritz Vision APIs use a few constructs:

  • FritzVisionImage: The image that the model runs on. It wraps the pixel buffer or image you provide.
  • Options: Configuration options that are passed to the model letting you tweak how the model runs.
  • Model: The actual model that runs predictions on the input images.
  • Results: The output of the model. Each predictor has a different result type; in this tutorial, the results are FritzVisionSegmentationResult objects.
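
To make the first of these concrete, here's a rough sketch of how a FritzVisionImage is created. The UIImage-based initializer matches what we use later in this tutorial; the buffer-based variant is left commented out because its exact argument label may differ from what's shown here.

import Fritz
import UIKit

// Wrap a plain UIImage so a Fritz Vision model can consume it.
let fritzImage = FritzVisionImage(image: UIImage(named: "mountains.jpg")!)

// For live video you would wrap a camera frame instead, e.g.:
// let frame = FritzVisionImage(buffer: sampleBuffer)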

Setting up a Fritz Account

Setting up a Fritz Account is easy. Follow the Getting Started directions to set up your free account and connect the demo to your account. Here are the steps you’ll run through:

  1. Create a free developer account.
  2. Create an iOS App. Make sure that the Bundle ID you register matches the Bundle ID of your Xcode project.
  3. Drag the Fritz-Info.plist file to your project.

After you run through the initialization steps, build and run your app. When your app has successfully checked in with Fritz, you’ll see a success message in the webapp.
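
The starter project already includes this initialization for you; for reference, it's a one-line call in the AppDelegate, roughly like the snippet below (FritzCore.configure() per the Getting Started guide):

import UIKit
import Fritz

@UIApplicationMain
class AppDelegate: UIResponder, UIApplicationDelegate {

    var window: UIWindow?

    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        // Reads Fritz-Info.plist and checks the app in with your Fritz account.
        FritzCore.configure()
        return true
    }
}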

Using Sky Segmentation

Let’s use sky segmentation on an image and then replace the sky pixels with a new, sliding background.

Left: original image | Top middle: sky photo to replace | Bottom middle: original image without the sky | Right: background sky combined with the masked image.

In our app, first let’s load 2 images:

  • A foreground image that we’ll run sky segmentation on.
  • A background image that we’ll use to animate across the sky.

Left: foreground image (mountains.jpg) | Right: background image (clouds.png)

let foreground = UIImage(named: "mountains.jpg")
let background = UIImage(named: "clouds.png")

Now we’re ready to pass the image into the model. Let’s walk through this step-by-step.

1. We start by setting some parameters that will be used to adjust the model output. In step 3, these parameters are passed to the buildSingleClassMask method to control how the mask is built.
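
Something along these lines works; the values are illustrative, the variable names are my own, and they mirror the parameters we pass to buildSingleClassMask later (check the demo project for the exact numbers it uses):

// Thresholds used in step 3 when building the alpha mask.
// Confidence scores above this value are clipped to fully opaque.
let clippingScoresAbove: Double = 0.7

// Confidence scores below this value are zeroed out (fully transparent).
let zeroingScoresBelow: Double = 0.3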

2. We then create a FritzVisionSkySegmentationModel() that will be used to cut out the sky from the image.

private lazy var visionModel = FritzVisionSkySegmentationModel()

3. The foreground image is fed into the segmentation model, which returns a FritzVisionSegmentationResult object. For more details on the different access methods, take a look at the official documentation.

The segmentation result contains information on the likelihood that each pixel belongs to a particular class (e.g. background or sky). We use the buildSingleClassMask method to create an alpha mask from just the pixels that the model does not identify as the sky.
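
Sketched out, this step looks roughly like the following. The predict call and the buildSingleClassMask parameters are written the way I'd expect them from the Fritz Vision API, and nonSkyClass is a placeholder for whichever class constant the sky model exposes; check the demo project and the documentation for the exact names.

// Wrap the foreground photo and run the sky segmentation model on it.
let fritzImage = FritzVisionImage(image: foreground!)

visionModel.predict(fritzImage) { result, error in
    guard let result = result, error == nil else { return }

    // Build an alpha mask from the pixels the model does NOT classify as
    // sky. nonSkyClass is a placeholder for the class constant the sky
    // model exposes; the thresholds come from step 1.
    let mask = result.buildSingleClassMask(
        forClass: nonSkyClass,
        clippingScoresAbove: clippingScoresAbove,
        zeroingScoresBelow: zeroingScoresBelow)

    // The mask is combined with the original photo in step 4.
}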

4. That mask is combined with the original image to produce an image with a transparent sky.
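
The demo project wires this step up for you (and may use a helper from the Fritz SDK); a generic Core Graphics version of the same idea looks roughly like this:

import UIKit

// Keep only the pixels covered by the alpha mask, leaving the sky transparent.
func applyMask(_ mask: UIImage, to image: UIImage) -> UIImage {
    let renderer = UIGraphicsImageRenderer(size: image.size)
    return renderer.image { _ in
        let rect = CGRect(origin: .zero, size: image.size)
        // Draw the mask first, then draw the photo on top with .sourceIn so
        // the photo only shows where the mask is opaque (i.e. not sky).
        mask.draw(in: rect)
        image.draw(in: rect, blendMode: .sourceIn, alpha: 1.0)
    }
}

// Usage: let skylessImage = applyMask(mask, to: foreground!)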

5. Finally, add the background UIImage we loaded earlier. In this case, we'll use 2 background views and animate them with a sliding motion that repeats indefinitely, as sketched below.
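
Here's a sketch of that sliding animation as a view controller method, assuming two image views (skyView1 and skyView2, names of my own choosing) that both display clouds.png and sit side by side behind the masked photo:

// skyView2 starts exactly one view-width to the right of skyView1, so the
// loop is seamless each time the animation repeats.
func startSkyAnimation() {
    let width = skyView1.bounds.width
    UIView.animate(withDuration: 30.0,
                   delay: 0,
                   options: [.repeat, .curveLinear],
                   animations: {
        // Slide both views one full width to the left, then repeat.
        self.skyView1.transform = CGAffineTransform(translationX: -width, y: 0)
        self.skyView2.transform = CGAffineTransform(translationX: -width, y: 0)
    })
}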

Here’s the final result:

Check out our GitHub repo for the finished code.

By being able to easily identify the sky, we open the door for developers to create novel features:

  • Animate the background sky — add animated stars to the sky to give it some pop; create a moving background that brings the photo to life.
  • Sky replacement — Compose and edit the background of videos and photos. Want to create a movie poster for Godzilla? Find a photo of a city skyline and swap out the background.
  • Photo effects targeting the sky — Add filters specifically for the sky without the need to apply it to the entire background. Add a reddish glow to the sky as the sun comes up in the morning.

When working with video and photo apps, you can add intelligent, computer vision experiences — all without a PhD in machine learning. I hope that this tutorial showed how easy it is for any developer to use Image Segmentation by Fritz. Leave a comment and let us know what you build!
