Let’s Make a Promo

A look behind the scenes at making videos for apps.

Dave Fumberger
9 min read · Sep 1, 2014

In today’s App Store environment, it’s more important than ever to market and demonstrate your apps in the best way possible. A video is essential for informing potential customers what your app is all about. With Apple soon giving developers the option to feature videos on iTunes, a good promo is something every app should have.

Since the relaunch of our app Beatwave, we’ve produced three videos that have been essential in marketing the app. Arguably, the most important part of any video is where the app itself is shown, so making sure it looks great is the top priority.

In this post I’ll be detailing the methods behind each of our videos, focusing in particular on how we got great shots of the device running the app.

Video 1 — Beatwave Launch

Shooting direct

The first video for Beatwave was the launch video:

In this video, each shot of Beatwave running on the device was filmed directly, with all footage coming straight out of the camera and no composite work.

The production was filmed using a Blackmagic Cinema Camera with just natural lighting from the windows of the room.

The main issue shooting direct in this case was getting a crisp shot of the screen. Because the device was quite close to the camera there was a degree of moiré pattern introduced into the footage.

Example of moiré pattern

A moiré pattern is an unwanted visual effect caused by interactions between the pixel grids of the camera’s sensor and the device’s screen.

The effect was initially reduced by very slightly unfocusing the camera, but as the actor’s hand subtly moved, the device would drift in and out of focus and the pattern would tend to reappear. The remaining moiré needed to be removed in post with some tricky After Effects work.

Some further post-production was required to fix colour and contrast issues, but overall the camera did a pretty amazing job of capturing Beatwave running on the device directly.

Getting the shots

The biggest constraint on the production was time, so we needed to get the shots as quickly as possible. This meant being able to give clear and concise directions to the actor on the interactions required.

To help with this, I created a simple video player app that would play back a piece of recorded Beatwave footage. The idea was for the actor to simply mime the interactions instead of having to do them live.

This video player app was duplicated for each shot of him using Beatwave:

The process for filming was:

  • The actor would load the video player app for the interaction sequence that was being shot;
  • The video player app would play a pre-recorded video showing the interaction sequence (The video was created using the screen capture method detailed in the next section of this post.);
  • The actor would hear three tones, with the third tone signifying when they needed to perform the first tap / move; and
  • If the actor missed a cue or tap, the video could be restarted simply by reopening the app.
http://youtu.be/BeDPdq_HBZQ

This clip shows the original footage from the final video, and then a demo of the video player app in its raw form.
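The player app itself doesn’t need to be anything fancy. Below is a minimal sketch of the idea in Objective-C, not the exact app we built; the clip name capture.mp4, the count-in tone tone.caf and the class name are all placeholders:

// MimePlayerViewController.m: a rough sketch of a player the actor can mime along to.
// Assumes a bundled clip named capture.mp4 and a short count-in tone named tone.caf.
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
#import <AudioToolbox/AudioToolbox.h>

@interface MimePlayerViewController : UIViewController
@property (nonatomic, strong) AVPlayer *player;
@property (nonatomic, assign) SystemSoundID tone;
@end

@implementation MimePlayerViewController

- (void)viewDidLoad {
    [super viewDidLoad];

    // Load the pre-recorded Beatwave capture and fill the screen with it.
    NSURL *clipURL = [[NSBundle mainBundle] URLForResource:@"capture" withExtension:@"mp4"];
    self.player = [AVPlayer playerWithURL:clipURL];
    AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
    playerLayer.frame = self.view.bounds;
    [self.view.layer addSublayer:playerLayer];

    // Register the count-in tone with the system sound services.
    NSURL *toneURL = [[NSBundle mainBundle] URLForResource:@"tone" withExtension:@"caf"];
    AudioServicesCreateSystemSoundID((__bridge CFURLRef)toneURL, &_tone);

    [self startCountIn];
}

// Play three tones a second apart; the clip starts on the third tone so the
// actor knows exactly when to mime the first tap.
- (void)startCountIn {
    for (NSUInteger i = 0; i < 3; i++) {
        dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(i * NSEC_PER_SEC)),
                       dispatch_get_main_queue(), ^{
            AudioServicesPlaySystemSound(self.tone);
            if (i == 2) {
                [self.player seekToTime:kCMTimeZero];
                [self.player play];
            }
        });
    }
}

@end

Its only job is to count the actor in and play the clip; restarting a take is just a matter of killing and reopening the app.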

The actor still had to learn what ‘moves’ were to be done after the tone counted him in, but he was able to do this quite quickly, which resulted in a much faster shoot than if he were using Beatwave live.

If you’re producing a video that requires interactions that are more than a simple tap or swipe, having a way for the actor to mime the interactions is highly recommended.

Video 2 — Beatwave 2.1

Screen Capture

The second Beatwave video produced highlighted the new features of version 2.1:

The approach for this video was the complete opposite of how the launch video was produced: no cameras were involved.

All footage was recorded using screen capture, and then composited back into a virtual scene with the iPhone.

Options for screen capturing a direct feed of an app include:

  • Recording from the iOS Simulator using a screen-casting app such as ScreenFlow;
  • Over AirPlay Mirroring using an app such as Reflector; or
  • Using the new screen recording feature in iOS 8.

In this video, we used AirPlay Mirroring with Reflector.

Device and animation

The iPhone in this video was generated in After Effects using the plugin Element 3D. Element 3D is a great plugin that lets you animate 3D content inside After Effects.

The 3D model used was from this pack http://videohive.net/item/professional-3d-device-pack-for-element-3d/7139714, which provides a number of high quality Apple models at an incredibly cheap price.

Having a 3D model lets you position and animate the device however you like, opening up the possibility for all sorts of animations.

Interactions

Combining a screen recording into a 3D scene gives an incredibly clean result, but the downside is you don’t automatically get a way of showing the interactions.

In this video the interactions were quite simple, so we could manually composite a hand into the scene to show the taps.

I used a chroma-keyed hand that I found on YouTube rather than filming my own.

In instances where we needed a more complex interaction than a tap, we could simply freeze the video and move the hand using keyframes.

It’s of course always going to be more natural to film the actual interaction rather than animating it, but considering the limited time and resources this effect worked quite well, and isn’t generally noticeable to anyone who isn’t looking for it.

Compositing

Once I had the hand, the recorded screen footage, and the 3D iPhone, it was then just a matter of layering these in After Effects to create the final composition.

Video 3 — Beatwave Demo

Combining live and screen capture

The most recent video created for Beatwave was a tutorial. This video demonstrated building up a tune from start to finish in 3 minutes:

The initial approach for the demo was to simply record the device directly using a Canon 6D. But the test shots revealed exposure problems that resulted in the quality being less than ideal.

Overexposed screen
Underexposed background

Self-illuminating devices like the iPhone and iPad can make life tough for the camera and photographer. In the examples above, you can see the struggle to balance the exposure between the environment and the device.

As this is a tutorial video, it’s essential that the interface of the app is clear to the viewer. This required the screen to be shown at the best possible quality, and since filming it directly couldn’t deliver the desired results, the screen needed to be recorded using screen capture.

This then raised the question of how to show the app being used. Unlike the second video, the interactions in this video were quite complex, and would have been too time-consuming to reproduce by animating a pre-recorded hand.

This meant the video had to combine live footage of the hands with a screen recording of Beatwave’s interface.

Recording the hands

The entire scene was recorded on my desk using a Canon 6D and a standing lamp for some extra lighting. Part of the goal here was to create a setup that didn’t require too much space or equipment so I could easily record future demo videos without having to move too far from my desk.

In addition to the scene being green, the iPad also had a green overlay on the screen:

The green overlay was added in code to the app itself. It was simply a view with a background color of

[UIColor colorWithRed:0.0 green:1.0 blue:0.0 alpha:0.75];

overlaid on the entire UIWindow.
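Adding that overlay only takes a few lines. As a rough sketch (not Beatwave’s exact code), it can be attached from the app delegate once the window exists:

// Overlay the whole app with a semi-transparent green view, run from the app
// delegate after the window is set up. Illustrative, not Beatwave's exact code.
UIView *chromaOverlay = [[UIView alloc] initWithFrame:self.window.bounds];
chromaOverlay.backgroundColor = [UIColor colorWithRed:0.0 green:1.0 blue:0.0 alpha:0.75];
chromaOverlay.userInteractionEnabled = NO; // let touches pass through to Beatwave underneath
chromaOverlay.autoresizingMask = UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight;
[self.window addSubview:chromaOverlay];

Disabling user interaction on the overlay is the important part, so the app underneath still receives every touch.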

Recording the screen output

It was important to be able to easily chroma key the iPad screen out, so a green overlay on the iPad screen was a must. That seemed to leave an obvious way to record the screen output.

Wrong.

My first attempt was to simply add the green UIView overlay to the app, then use AirPlay Mirroring to record the screen. Of course, as I quickly discovered, the obvious problem was the recording also turned out green. Not good.

The solution I settled on was to ‘mirror’ touches from Beatwave running on the iPad to a separate instance of Beatwave running in the iOS Simulator.

Every touch I made on Beatwave running on the iPad would be intercepted and then sent over the network to the iOS Simulator. When the iOS Simulator received the touch, it would replicate it on the version of Beatwave it was running.

The code that performed the touch mirroring was a library I purpose-built for the project called ‘CTMirror’. If you’re interested in trying a similar approach it can be found here: https://github.com/Collect3/CTMirror.
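The core of the sending side is simply intercepting every touch before the app handles it, which a UIWindow subclass can do by overriding sendEvent:. The sketch below is illustrative only; the class name, wire format and NSOutputStream transport are assumptions rather than CTMirror’s actual API:

// Illustrative only: a UIWindow subclass that intercepts every touch and forwards
// its location to a listener on the desktop, where it can be replayed in the Simulator.
#import <UIKit/UIKit.h>

@interface MirroringWindow : UIWindow
@property (nonatomic, strong) NSOutputStream *stream; // connected to the machine running the Simulator
@end

@implementation MirroringWindow

- (void)sendEvent:(UIEvent *)event {
    [super sendEvent:event]; // let Beatwave handle the touch as normal

    for (UITouch *touch in [event allTouches]) {
        CGPoint point = [touch locationInView:self];
        // Send the touch phase plus normalised coordinates so the receiver can
        // replay the touch on its own copy of Beatwave.
        NSString *message = [NSString stringWithFormat:@"%ld,%f,%f\n",
                             (long)touch.phase,
                             point.x / CGRectGetWidth(self.bounds),
                             point.y / CGRectGetHeight(self.bounds)];
        NSData *data = [message dataUsingEncoding:NSUTF8StringEncoding];
        [self.stream write:(const uint8_t *)data.bytes maxLength:data.length];
    }
}

@end

The receiving half is the trickier part, since the Simulator side has to synthesise matching touch events for its copy of the app; that’s essentially what the library handles.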

An alternative to using CTMirror could simply have been some capacitive green tape / cellophane and AirPlay Mirroring. However, some advantages of the touch mirroring method in this instance were:

  • No audio lag, unlike when recording over AirPlay Mirroring;
  • Fine control over the tint colour and brightness, including the ability to change the colour / contrast of controls in the app if needed; and
  • More reliable recording of the screen output than with AirPlay Mirroring.

Compositing

I was then able to combine both the video of the simulator and the video of the hands in After Effects to create one final video composition.

The device shown above was again created using Element 3D. The blue background that represents the ‘table’ was a render produced in Cinema 4D. The 3D models were free and paid assets from turbosquid.com.

Tools Used

Summary of the tools used in the above methods:

  • Blackmagic Cinema Camera and Canon 6D for filming;
  • Reflector for recording the screen over AirPlay Mirroring;
  • The iOS Simulator with CTMirror for mirroring touches to a second instance of Beatwave;
  • After Effects with the Element 3D plugin for compositing and animating the 3D devices; and
  • Cinema 4D for rendering background elements.

Wrapping up

I hope this has provided some insight into the methods we’ve found useful for producing iOS app videos.

The three Beatwave videos have taught us there’s generally no one-size-fits-all solution, with each project ultimately requiring a different combination of techniques depending on the content and message of the video.

Would love to hear feedback on other filming methods or suggestions to improve on the ones I’ve outlined, so hit me up at @djfumberger or dave@collect3.com.au
