Dissecting Google’s Mobile Animation Engine — Part 1

Recently you may have noticed that Google has been adding animated intro screens to its apps. These are more than just animations; they’re almost like little cartoons. I first noticed them in Google Photos, but they’re showing up in other places now. Here is an example from the Google Photos app.

At first glance you might think this is an animated GIF or some kind of embedded movie, but given that this is Google, I thought it warranted a closer look.

The first thing I did was use the Android Device Monitor tool to examine the UI, and sure enough, these are actual View animations!

These animations are far too complicated to have been written painstakingly in code, and given that the same animations appear on iOS, there must be some Google magic at play here.

I pulled down the Google Photos APK and extracted it. Unfortunately the code was obfuscated, and there was no indication of any libraries being used. I did, however, notice something really interesting in the assets directory: under assets/images were all of the images used in the intro.

And the images were referenced in an asset file, assets/intro.btfy.

These .btfy files seem to be JSON representations of the animation screens, and based on the comment field, they appear to be exported from Adobe After Effects. Is it possible to construct the entire animation at runtime from these .btfy files?

{
  "version": 1,
  "comment": "'20150313T13:08:29-OOB attract loop V2' exported by AE Butterfly Format Exporter v0.37",
  "type": "stage",
  "aspectRatio": 0.5625,
  "size": {
    "width": 720,
    "height": 1280
  },
  "animations": [
    {
      "type": "animationGroup",
      "id": "BG-gooBlue-3",
      "duration": 0,
      "initialValues": {
        "size": {
          "width": 2,
          "height": 1
        },
        "anchorPoint": {
          "x": 0.5,
          "y": 0.5
        },
        "scale": {
          "sx": 1,
          "sy": 1
        },
        "rotation": 0,
        "opacity": 1,
        "position": {
          "x": 0.5,
          "y": 0.5
        },
        "backgroundColor": {
          "r": 0.258823543787,
          "g": 0.52156865596771,
          "b": 0.95686274766922,
          "a": 1
        }
      },
      "shape": {
        "name": "rectangle"
      },
      "animations": []
    },
    ...
  ]
}

The first step in rendering these animations is reading the JSON data into memory. I started by creating a new Android project and copying the intro.btfy file and the images directory from the Google Photos APK into the new project’s assets folder. Once they were in place, it was time to start deserializing the JSON into Java objects.
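To sanity-check the setup, here is roughly what loading the raw file looks like. BtfyAssets and readBtfyAsset are names I made up for illustration; the only API involved is the standard AssetManager.

import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import android.content.Context;

// Illustrative helper (my naming, not from the decompiled app).
public final class BtfyAssets {

    // Reads assets/intro.btfy into a String so it can be handed to
    // whichever JSON parser we settle on.
    public static String readBtfyAsset(Context context) throws IOException {
        try (InputStream in = context.getAssets().open("intro.btfy");
             ByteArrayOutputStream out = new ByteArrayOutputStream()) {
            byte[] buffer = new byte[8192];
            int read;
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read);
            }
            return out.toString("UTF-8");
        }
    }

    private BtfyAssets() {}
}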

I initially thought of using Gson to parse the data, but I ran into issues with the keyframes property. keyframes is a list of objects, but the shape of those objects varies depending on the value of the sibling property field. For instance, if property is position, the keyframes would look like this.

[
  0,
  {
    "x": 1.1875,
    "y": 0.22421875
  },
  {
    "name": "cubic-bezier",
    "x1": 0.0001,
    "y1": 0,
    "x2": 0,
    "y2": 1
  }
]

But if property is opacity, the keyframe data would look like this.

[
  0,
  0,
  {
    "name": "cubic-bezier",
    "x1": 0.33333333,
    "y1": 0,
    "x2": 0.66666667,
    "y2": 1
  }
]

Using Gson I would have had to model keyframes as a list of arbitrary Objects, but I wanted a little more type safety, so I implemented the JSON parsing with the Jackson 2 parser instead. Jackson 2 lets me examine the JSON and construct the data as I parse it.
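To make that concrete, here is a sketch of how one keyframe entry can be handled with Jackson’s tree model. BtfyKeyframe and parseTimingFunction are placeholder names of my own; the real model classes are part of the full parser.

// Sketch only: BtfyKeyframe and parseTimingFunction are placeholder names.
private BtfyKeyframe parseKeyframe(JsonNode keyframe) throws BtfyParseException {
    // Entry 0 is always the time offset.
    float time = (float) keyframe.get(0).asDouble();

    // Entry 1 is an object for position-like properties but a bare number
    // for scalar properties such as opacity, so we inspect the node type
    // before reading it.
    JsonNode value = keyframe.get(1);
    Object parsedValue;
    if (value.isObject()) {
        parsedValue = new BtfyPoint(
                (float) value.path("x").asDouble(),
                (float) value.path("y").asDouble());
    } else if (value.isNumber()) {
        parsedValue = (float) value.asDouble();
    } else {
        throw new BtfyParseException("Unexpected keyframe value type.");
    }

    // Entry 2 describes the easing curve, e.g. {"name": "cubic-bezier", ...}.
    return new BtfyKeyframe(time, parsedValue, parseTimingFunction(keyframe.get(2)));
}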

Now that the approach was settled, writing the code was tedious but straightforward.

Here is a small snippet.

public BtfyAnimationGroup parseAnimationGroup(JsonNode jSONObject) throws BtfyParseException {
    String typeStr = jSONObject.path("type").asText();
    if (typeStr.equals("animationGroup")) {
        String jsonId = jSONObject.path("id").asText();
        if (TextUtils.isEmpty(jsonId)) {
            throw new BtfyParseException("Animation missing id.");
        }

        if (!jSONObject.has("initialValues")) {
            throw new BtfyParseException("Animation group missing initial values.");
        }

        String parentId = jSONObject.path("parentId").asText();

        BtfyShape shape = parseShape(jSONObject);

        long duration = Math.round(jSONObject.path("duration").asDouble() * 1000.0d);

        String textStr = null;
        if (jSONObject.has("text") && !TextUtils.isEmpty(jSONObject.path("text").asText())) {
            try {
                textStr = this.stringMap.getString(jsonId, jSONObject.path("text").asText());
            } catch (RuntimeException re) {
                textStr = jSONObject.path("text").asText();
            }
        }

        String backgroundImageStr = jSONObject.path("backgroundImage").asText(null);

        ArrayList<BtfyAnimationElement> animationElements = new ArrayList<>();
        parseAnimations(jSONObject, animationElements);

        JsonNode jsonInitialValues = jSONObject.get("initialValues");
        BtfySize initialValuesSize = parseSize(jsonInitialValues.path("size"),
                "Missing size in initial values.");

        BtfyPoint anchorPoint = jsonInitialValues.get("anchorPoint") != null
                ? parsePoint(jsonInitialValues.get("anchorPoint"))
                : new BtfyPoint(0.0f, 0.0f);
        BtfyColor backgroundColor = jsonInitialValues.get("backgroundColor") != null
                ? BtfyColor.parseRGBA(jsonInitialValues.get("backgroundColor"))
                : BtfyColor.transparent();
        float opacity = (float) jsonInitialValues.path("opacity").asDouble(1.0d);
        BtfyPoint position = jsonInitialValues.get("position") != null
                ? parsePoint(jsonInitialValues.get("position"))
                : new BtfyPoint(0.0f, 0.0f);
        BtfyPoint scale = jsonInitialValues.get("scale") != null
                ? parseScale(jsonInitialValues.get("scale"))
                : new BtfyPoint(1.0f, 1.0f);
        int rotation = jsonInitialValues.path("rotation").asInt(0);

        float textSize = (float) jsonInitialValues.path("fontSize").asDouble();
        BtfyColor textColor = jsonInitialValues.get("textColor") != null
                ? BtfyColor.parseRGBA(jsonInitialValues.get("textColor"))
                : BtfyColor.white();
        Typeface typeface = parseTypeface(jsonInitialValues.path("textStyle").asText());
        int textAlign = parseTextAlign(jsonInitialValues);

        return new BtfyAnimationGroup(jsonId,
                TextUtils.isEmpty(parentId) ? null : parentId,
                shape,
                duration,
                textStr,
                backgroundImageStr,
                new BtfyInitialValues(
                        initialValuesSize,
                        anchorPoint,
                        backgroundColor,
                        opacity,
                        position,
                        scale,
                        rotation,
                        textSize,
                        textColor,
                        typeface,
                        textAlign),
                animationElements);
    }

    throw new BtfyParseException("Unexpected animation group type", String.valueOf(typeStr));
}
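The snippet leans on a few small helpers (parsePoint, parseScale, parseSize) that I’ve omitted above. My versions look roughly like this; treat them as sketches of the idea rather than the exact code.

// My reconstructions of the simpler helpers the snippet calls.
private BtfyPoint parsePoint(JsonNode node) {
    return new BtfyPoint(
            (float) node.path("x").asDouble(),
            (float) node.path("y").asDouble());
}

private BtfyPoint parseScale(JsonNode node) {
    // Scale uses "sx"/"sy" keys rather than "x"/"y" (see the initialValues JSON).
    return new BtfyPoint(
            (float) node.path("sx").asDouble(1.0d),
            (float) node.path("sy").asDouble(1.0d));
}

private BtfySize parseSize(JsonNode node, String errorMessage) throws BtfyParseException {
    if (node.isMissingNode()) {
        throw new BtfyParseException(errorMessage);
    }
    return new BtfySize(
            (float) node.path("width").asDouble(),
            (float) node.path("height").asDouble());
}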

Now let’s examine the file format. The root object is of type “stage” and contains some metadata: the stage size (width and height) and the aspect ratio. Given that there is only one .btfy file, it is fair to assume we will be doing our own scaling rather than relying on Android’s resource resolution framework.

{
  "version": 1,
  "comment": "'20150313T13:08:29-OOB attract loop V2' exported by AE Butterfly Format Exporter v0.37",
  "type": "stage",
  "aspectRatio": 0.5625,
  "size": {
    "width": 720,
    "height": 1280
  },
  ...
}
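On the Java side I model the stage with a small value class. This is just a sketch with my own naming; the only field the measuring code below depends on is sizeInfo.

import java.util.List;

// Minimal model of the stage root; field names mirror the JSON.
public class BtfyStage {
    public final float aspectRatio;
    public final BtfySize sizeInfo;                     // 720 x 1280 here
    public final List<BtfyAnimationElement> animations; // the animation groups

    public BtfyStage(float aspectRatio, BtfySize sizeInfo,
                     List<BtfyAnimationElement> animations) {
        this.aspectRatio = aspectRatio;
        this.sizeInfo = sizeInfo;
        this.animations = animations;
    }
}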

Let’s start by creating a custom view that automatically resizes itself to always maintain the aspect ratio defined by the stage.

public class BtfyAnimationView extends ViewGroup {

    // ... load .btfy into a BtfyStage object at property this.btfyStage

    @Override
    protected void onMeasure(int widthMeasureSpec, int heightMeasureSpec) {
        if (this.btfyStage == null) {
            // No stage loaded yet; just fill the space we were offered.
            setMeasuredDimension(MeasureSpec.getSize(widthMeasureSpec),
                    MeasureSpec.getSize(heightMeasureSpec));
            return;
        }

        BtfySize size = this.btfyStage.sizeInfo;
        int resolvedWidth = resolveSize(size.width.intValue(), widthMeasureSpec);
        int resolvedHeight = resolveSize(size.height.intValue(), heightMeasureSpec);

        // Trim whichever edge overshoots the stage's aspect ratio.
        if (!(resolvedWidth == size.width.intValue() && resolvedHeight == size.height.intValue())) {
            float stageRatio = size.width.floatValue() / size.height.floatValue();
            float resolvedRatio = ((float) resolvedWidth) / ((float) resolvedHeight);
            if (resolvedRatio > stageRatio) {
                resolvedWidth = Math.round(((float) resolvedHeight) * stageRatio);
            } else if (resolvedRatio < stageRatio) {
                resolvedHeight = Math.round(((float) resolvedWidth) / stageRatio);
            }
        }

        // If there is still room to grow, scale both edges up uniformly.
        // Note the float division: integer division would truncate the factor.
        int measuredWidth;
        int measuredHeight;
        if (resolvedWidth < MeasureSpec.getSize(widthMeasureSpec)
                || resolvedHeight < MeasureSpec.getSize(heightMeasureSpec)) {
            float scale = Math.min(
                    ((float) MeasureSpec.getSize(widthMeasureSpec)) / ((float) resolvedWidth),
                    ((float) MeasureSpec.getSize(heightMeasureSpec)) / ((float) resolvedHeight));
            measuredWidth = (int) (((float) resolvedWidth) * scale);
            measuredHeight = (int) (((float) resolvedHeight) * scale);
        } else {
            measuredWidth = resolvedWidth;
            measuredHeight = resolvedHeight;
        }
        setMeasuredDimension(measuredWidth, measuredHeight);
    }

    @Override
    protected void onLayout(boolean changed, int l, int t, int r, int b) {
        // Child layout comes later in this series; for now we only measure.
    }
}

We’re using a ViewGroup as the base class since we’ll be doing our own layout of subviews. We then override onMeasure to read the stage’s dimensions and map them into the space available to this view: the resolveSize calls clamp the stage size to the measure spec, the first branch trims whichever edge overshoots the stage’s aspect ratio, and the final branch scales both edges up uniformly to fill the remaining space, so that the orientation and aspect ratio are maintained. If we set the background to green and run it, we should see output similar to this.

And if you rotate the device the aspect ratio should be preserved.
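If you want to reproduce this, here is the throwaway harness I use to exercise the view, assuming the view exposes a plain (Context) constructor. BtfyParser.parseStage and setStage are hypothetical names: an entry point wrapping the parsing code above, and a setter that stores the stage and calls requestLayout().

import android.app.Activity;
import android.graphics.Color;
import android.os.Bundle;
import android.util.Log;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class BtfyDemoActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        BtfyAnimationView view = new BtfyAnimationView(this);
        view.setBackgroundColor(Color.GREEN); // makes the measured bounds visible
        try {
            // readBtfyAsset is the helper from earlier in this post.
            JsonNode root = new ObjectMapper().readTree(BtfyAssets.readBtfyAsset(this));
            view.setStage(new BtfyParser().parseStage(root));
        } catch (Exception e) {
            Log.e("Btfy", "Failed to load intro.btfy", e);
        }
        setContentView(view);
    }
}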
