Replicating tvOS Parallax Focus Effect on Custom UIViews

Nobody likes blockers; you’re coding away, and suddenly :boom: you hit a wall because a key piece of information is missing. Many times, going into your mind palace or Googling your questions doesn’t yield any results — but have no fear. Inside the Odd Networks Developer blog, you can find helpful tips, hacks, and creative solutions to problems that might find their way into your projects. We want to make sure that you can focus your attention entirely on building new, state-of-the-art, cross-platform media applications.

Patrick, our iOS and tvOS developer, has shared information he collected while working on replicating the tvOS parallax focus effect on custom UIViews. Without further ado, let’s move on to these generous and helpful instructions (accompanied by code examples).

There is a full sample project here if you want to follow along while reading.

iOS 7 introduced the concept of parallax views. When the user tilts their device, the icons on the home screen appear to shift position, simulating a 3D effect.

With the Focus Engine, tvOS takes this to the next level where individual views respond to a user’s touch on the Siri Remote. This is a new interface paradigm and it takes some getting used to for users as well as developers.

By now you have seen examples of Apple TV applications and how the interface elements tilt, zoom and pan when you highlight them. Apple created a whole new way to interact with the user interface and they built the Focus Engine to do so.

But with any new technology there are growing pains. You will quickly discover, if you drop some UIButtons or UIViews onto your storyboard, that many times they don’t behave like the shiny new focusable elements you want.

For a recent customer’s Apple TV app I was faced with a simple button with an icon and a text label. When the button was selected, or in our brave new world ‘focused’, it did nothing. This is not my first Apple TV app, and I already understand things like responding to focus and coordinating the changes in focus with the animation coordinator. If you’re not familiar with these things, a good place to start is Controlling the User Interface with the Apple TV Remote.

What I wanted to get to the bottom of was how to mimic the effect Apple uses when you focus on an image or other view.

I did what any self-respecting developer would do: I googled it. I found some solutions based on older implementations from when iOS 7 first introduced the parallax home screen, but nothing that showed me how to mimic Apple’s effect.


Implementing the Parallax Effect

With iOS 7 Apple introduced UIMotionEffect, an abstract superclass that helps developers create new effects on views when the device is tilted. Working directly with UIMotionEffect subclasses, while possible, is tricky: you must conform to a few protocols and implement some methods to handle the interpolation between the device’s orientation and the changes to your view. Thankfully, Apple also provided UIInterpolatingMotionEffect to make the whole process much simpler.

UIInterpolatingMotionEffect basically lets you decide which device orientation change you want to track and map it to changes on your view. The Focus Engine then handles figuring out how much of the effect should be applied to your view based on the ‘strength’ of the device orientation change.

Right now UIInterpolatingMotionEffect only implements two device orientation parameters to track: horizontal and vertical tilt. On tvOS these two are tied to the user’s finger location on the Siri Remote.

So if we were to create a new motion effect and track the horizontal tilt, our effect would be more pronounced the further the user’s finger is from where they started their swipe gesture on the remote. While UIMotionEffect also works on iOS devices, we will limit the discussion here to the Apple TV.

The other piece of the puzzle: what do we change about the view when these touches occur?

Any property on a view that can be animated can be added to a motion effect, so properties like the view’s scale or center position can be connected to the user’s touch.

An example of a UIInterpolatingMotionEffect would look something like this:

let xPan = UIInterpolatingMotionEffect(keyPath: "center.x", type: .TiltAlongHorizontalAxis)
xPan.minimumRelativeValue = -10
xPan.maximumRelativeValue = 10

What this says is: when the user swipes horizontally on the remote, we want to offset our view’s center along the X axis by up to 10 points.

The two required values, minimumRelativeValue and maximumRelativeValue, are the limits of this offset. So when the user swipes to the left as far as the remote will allow, the view will shift 10 points, and the same for the other direction. The xPan motion effect handles deciding how much of that 10-point amount to offset the view depending on where the user’s finger is in the swipe gesture.
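On its own, creating the effect does nothing until it is attached to a view. A minimal sketch putting the pieces together (myView here stands in for whatever focusable view you are configuring):

```swift
// Create the effect: track horizontal swipes on the Siri Remote
let xPan = UIInterpolatingMotionEffect(keyPath: "center.x", type: .TiltAlongHorizontalAxis)
xPan.minimumRelativeValue = -10  // a full swipe left shifts the view up to 10 points left
xPan.maximumRelativeValue = 10   // a full swipe right shifts it up to 10 points right

// Attach it; the Focus Engine interpolates the offset as the user swipes
myView.addMotionEffect(xPan)
```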


Actually Mimicking Apple’s Effect

I wanted to try to match Apple’s effect as best as possible and this is what I came up with:

  • The view should rotate about its center in relation to the swipe gesture
  • The view should move slightly along the axis related to the swipe gesture
  • The view should get a bit bigger when focused
  • The view should show a shadow below itself while focused

Not all of this can go into a motion effect; some of these changes will be handled elsewhere in your code.

func addParallaxMotionEffects(tiltValue: CGFloat = 0.25, panValue: CGFloat = 5) {
    let xTilt = UIInterpolatingMotionEffect(keyPath: "layer.transform.rotation.y", type: .TiltAlongHorizontalAxis)
    xTilt.minimumRelativeValue = -tiltValue
    xTilt.maximumRelativeValue = tiltValue

    let yTilt = UIInterpolatingMotionEffect(keyPath: "layer.transform.rotation.x", type: .TiltAlongVerticalAxis)
    yTilt.minimumRelativeValue = -tiltValue
    yTilt.maximumRelativeValue = tiltValue

    let xPan = UIInterpolatingMotionEffect(keyPath: "center.x", type: .TiltAlongHorizontalAxis)
    xPan.minimumRelativeValue = -panValue
    xPan.maximumRelativeValue = panValue

    let yPan = UIInterpolatingMotionEffect(keyPath: "center.y", type: .TiltAlongVerticalAxis)
    yPan.minimumRelativeValue = -panValue
    yPan.maximumRelativeValue = panValue

    let motionGroup = UIMotionEffectGroup()
    motionGroup.motionEffects = [xTilt, yTilt, xPan, yPan]
    self.addMotionEffect(motionGroup)
}

The method above is part of an extension on UIView that I created to match Apple’s parallax effect as best as possible.

The method allows setting the values for both the tilt and panning of the view. It provides defaults so these parameters are optional.

func addParallaxMotionEffects(tiltValue: CGFloat = 0.25, panValue: CGFloat = 5) {

In order to tilt the view, we set the keypath string to the view’s layer.transform.rotation for each axis. This causes the view to appear to rotate about its center on either the X or Y axis as appropriate. Note that rotation.y means rotate about the Y axis, not rotate vertically.

xTilt = UIInterpolatingMotionEffect(keyPath: "layer.transform.rotation.y", type: .TiltAlongHorizontalAxis)

For the panning effect we use the same concept but apply our effect to the view’s center property.

xPan = UIInterpolatingMotionEffect(keyPath: "center.x", type: .TiltAlongHorizontalAxis)

If this method is added to your project as an extension on UIView you can get this effect by allowing the view to accept focus:

override func canBecomeFocused() -> Bool {
    return true
}

And then calling the method on your view:

self.addParallaxMotionEffects()

This will get your view tilting and panning while you swipe around on the Siri Remote but what about the zooming and shadows?

Since the zoom is not related to the user’s touch on the remote, it doesn’t make sense to add it to the motion effect. Shadow is not a property that can be animated, so it too cannot be added to the motion effect.

The other thing we have neglected to discuss to this point is that we only want this effect applied to our view when it has focus. If you just add the motion effect when the view is loaded, it will always track the user’s movements on the remote, which is definitely not how Apple intended you to use the Focus Engine.


DidUpdateFocus… to the rescue

Apple has provided a method in the UIFocusEnvironment protocol that you can implement to be notified when your view becomes focused. This sounds like a good place to add our shiny new motion effects and other goodies.

If you subclass UIView or one of its subclasses you can add a method something like this:

override func didUpdateFocusInContext(context: UIFocusUpdateContext, withAnimationCoordinator coordinator: UIFocusAnimationCoordinator) {
    super.didUpdateFocusInContext(context, withAnimationCoordinator: coordinator)

    guard let nextFocusedView = context.nextFocusedView else { return }

    if nextFocusedView == self {
        self.becomeFocusedUsingAnimationCoordinator(coordinator)
        self.addParallaxMotionEffects()
    } else {
        self.resignFocusUsingAnimationCoordinator(coordinator)
        self.motionEffects = []
    }
}

I won’t go into too much detail about how this works but you can refer to Controlling the User Interface with the Apple TV Remote for more detailed information.

This method will be called when your view gains or loses focus. To be sure we are a good subclass, we call the super’s implementation of the method first:

super.didUpdateFocusInContext(context, withAnimationCoordinator: coordinator)

The update context carries two views: the view about to be focused, nextFocusedView, and the view that had focus previously, previouslyFocusedView. These two goodies are hiding in the UIFocusUpdateContext, so we get at the one we need with this line:

guard let nextFocusedView = context.nextFocusedView else { return }

This makes sure the nextFocusedView is not nil and skips all our machinations if it is.

Then we get down to actually adding and removing effects depending on whether we are about to gain focus or about to lose it:

if nextFocusedView == self {
    self.becomeFocusedUsingAnimationCoordinator(coordinator)
    self.addParallaxMotionEffects()
} else {
    self.resignFocusUsingAnimationCoordinator(coordinator)
    self.motionEffects = []
}

Here we check if we are gaining or losing focus and call the appropriate helper methods, as well as our nice new extension method addParallaxMotionEffects.

So what about those helpers?

func becomeFocusedUsingAnimationCoordinator(coordinator: UIFocusAnimationCoordinator) {
    coordinator.addCoordinatedAnimations({ () -> Void in
        self.transform = CGAffineTransformMakeScale(1.1, 1.1)
        self.layer.shadowColor = UIColor.blackColor().CGColor
        self.layer.shadowOffset = CGSizeMake(10, 10)
        self.layer.shadowOpacity = 0.2
        self.layer.shadowRadius = 5
    }, completion: nil)
}

In order for your animations to look correct, Apple gives you an animation coordinator. It’s not clear exactly what this does, but one can imagine it makes sure the unfocusing changes happen in time with the focusing changes. For now, just wrap your changes in the addCoordinatedAnimations closure.

Our changes on receiving focus are:

Zoom the view to 110%:

self.transform = CGAffineTransformMakeScale(1.1, 1.1)

Configure the view’s shadow effect:

self.layer.shadowColor = UIColor.blackColor().CGColor
self.layer.shadowOffset = CGSizeMake(10, 10)
self.layer.shadowOpacity = 0.2
self.layer.shadowRadius = 5

There is a corresponding helper method in the sample project linked at the bottom of this post that reverses these steps to remove the focus effect.
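The exact implementation lives in the sample project, but a minimal sketch of that reversing helper might look like the following (the method name matches the one called in didUpdateFocusInContext above; the body is an assumption that simply restores whatever was set on focus):

```swift
func resignFocusUsingAnimationCoordinator(coordinator: UIFocusAnimationCoordinator) {
    coordinator.addCoordinatedAnimations({ () -> Void in
        // Undo the zoom applied in becomeFocusedUsingAnimationCoordinator
        self.transform = CGAffineTransformIdentity
        // Hide the shadow again
        self.layer.shadowOpacity = 0
    }, completion: nil)
}
```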

Once all this is hooked up (and it’s not all that much), you should have an effect approximating Apple’s parallax focus effect. It’s not a perfect match; we don’t have the highlighting effect Apple uses, for example.

Most of what I covered here is applicable to iOS with the exception of the focus updates. The motion effect code should work in an iOS project but you will need to trigger the effects in a different manner. If you make use of this with an iOS project I would love to hear about it.

I hope others find this helpful. I had a hard time finding all this information in one place, so here is what I have learned thus far. I welcome suggestions, corrections, or any other input.

p.


And there you have it. We hope that Patrick’s findings were helpful in overcoming your blockers. Bookmark this page for future use, and follow us here or on Twitter for more updates about all things OTT.

You can also find us on AngelList.
