Through the Dark: A creative, technical and emotional journey


Fall asleep
And lay with me
We’ll search the night for shooting stars
If they can’t take
Away your pain
I’ll take your hand, walk through the dark

In late 2015, the team at Google Play were looking to support Australian music by creating collaborative projects with some of the country’s top artists, and invited the R/GA Sydney team to work with them.

Exactly what form the projects would take, none of us really knew. All we did know was that we wanted them to be ambitious, interactive and authentic to the artist’s vision.

The Aussie hip-hop group Hilltop Hoods were an obvious choice. They’re as successful as any artist of recent times, with four number one albums, countless Facebook followers, and multiple ARIA Awards (the local music industry’s top prize).

The project they brought to Google was the track ‘Through The Dark’, an extremely personal song from one of the band members, Daniel Smith (MC Pressure). It’s based on the true story of his then eight-year-old son Liam’s battle with leukaemia and the six months they spent in hospital as he underwent treatment. Daniel originally wrote the song just for his son, with no plans to release it, until one of the other band members, Matt Lambert (Suffa), persuaded him to.

The song struck such a chord that fans from across the country contacted the group to tell their own stories of cancer and its impact on their lives. This inspired the band to create a new initiative, Side of Stage, that would give young people affected by cancer the chance to go backstage and meet their favourite artists.

So how could we tell the story of ‘Through the Dark’, and help launch Side of Stage? Teams from R/GA Sydney, Google Play Music and Hilltop Hoods got together to talk about what could be. The outcome was a vision to convey this intensely personal journey through our most personal device: the mobile. How could we convey emotion and empathy in such a way that the technology enhanced and enriched the experience?

The initial concept

With that as our north star, the team developed numerous concepts, finally settling on the theme of light and dark. In a literal sense, the experience of caring for someone with cancer is filled with emotional highs and lows, good news and bad. But light and dark are also universal symbols of good vs evil, hope vs fear, and life vs death.

A story told in two worlds, dark and light.

From this came the idea of two worlds running simultaneously, allowing the user to control the journey for themselves. Using 3D cameras mapped to the phone’s accelerometer, the device would be tilted and rotated to reveal a father and son’s journey through two animated worlds — the dark and the light. We’d use mobile technology to capture the sense of a world turned upside down.

The team developed a prototype to demonstrate how it could work, and with that we were away.

The creative approach

With the concept confirmed, the team began developing the exact execution. Our goal was to give an emotional sense of the story without creating a literal narrative. Working closely with director Mike Daly, the team ultimately decided to combine symbolism with realistic elements of Daniel and Liam’s experience.

We also settled on animation, and a low poly style: partly because it enabled us to reduce file size and improve performance, but also because we loved the evocative work of Eran Hilleli, out of XYZ Studios in Melbourne.

Eran Hilleli’s haunting character design was perfect for the project.

Similar to crafting a 3D film, designing the experience started with concepts for the character and environment before undergoing pre-vis approvals and finally putting the elements into production. The team worked closely with the band on the character design, settling on representing the father and son as two mythical characters.

Father and son.

The first challenge: from desktop to browser

Next, we had the challenging task of running this entire 3D animated film in the browser. Before we began, we agreed that for our own sanity, and to manage download time, the experience had to be under 150MB. We knew it would be large, given that it was 4.5 minutes of animated footage with 13 different scenes.

To build the experience, we used several of the latest web standards, including WebGL, to render the 3D environment; Web Audio API, to play the audio across Desktop, Mobile and Tablet; Full Screen API to make the experience fullscreen; Screen Orientation API to lock the screen orientation; and DeviceOrientation Events to detect the orientation of the device.
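
For those curious, here is a rough sketch of how those APIs can fit together on entry to an experience like this; the element IDs and start-button flow are illustrative, not our production code.

```javascript
// A minimal sketch of wiring the browser APIs together on entry.
const container = document.getElementById('experience'); // illustrative ID

async function enterExperience() {
  // Full Screen API: take the experience fullscreen where supported.
  if (container.requestFullscreen) {
    await container.requestFullscreen();
  }

  // Screen Orientation API: lock the orientation on devices that allow it.
  if (screen.orientation && screen.orientation.lock) {
    try {
      await screen.orientation.lock('portrait');
    } catch (e) {
      // Locking isn't available everywhere (e.g. desktop browsers).
    }
  }

  // Web Audio API: create the context inside a user gesture so mobile
  // browsers will allow playback.
  const audioContext = new (window.AudioContext || window.webkitAudioContext)();

  // DeviceOrientation Events: read the tilt that drives the camera.
  window.addEventListener('deviceorientation', (event) => {
    const tilt = event.gamma; // left/right tilt in degrees, roughly -90 to 90
    // ...feed tilt into the camera controller...
  });

  return audioContext;
}

// Most of these APIs must be triggered from a user gesture.
document.getElementById('start').addEventListener('click', enterExperience);
```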

Designed in Maya, each scene consisted of the geometry of the environment, shaders, lighting, cameras, camera paths and characters. One of the most challenging aspects of the project was developing a pipeline to translate all of this from Maya into the browser — not to mention, from a powerful desktop environment to a mobile device. Any characters with skeletal systems that had been animated had to be exported individually as Three.js JSON Objects, while the rest was exported as a COLLADA scene.

The dev team subsequently developed a configuration format that let us specify a time interval, a COLLADA scene file, and the characters belonging to that scene. Each character was exported at the origin (0,0,0), with its real offset stored separately, so that everything could be placed back into the scene together and animated and played just as it would have been in Maya. Or, at least, that was the intention.
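
To give a flavour of the idea, here is roughly what such a per-scene configuration and loader could look like; the field names, file paths and the loadScene helper are hypothetical rather than our actual format, though the Three.js loaders shown are the standard ones for these file types.

```javascript
// Hypothetical per-scene configuration: a time window in the track, one
// COLLADA file for the environment, and characters exported at the origin
// with their real offsets stored alongside.
const sceneConfig = {
  startTime: 92.0, // seconds into the song (illustrative)
  endTime: 118.5,
  collada: 'scenes/forest.dae',
  characters: [
    { file: 'characters/father.json', offset: [4.2, 0, -1.5] },
    { file: 'characters/son.json', offset: [3.1, 0, -1.2] }
  ]
};

// Minimal loader sketch using standard Three.js loaders.
function loadScene(config, onLoaded) {
  const scene = new THREE.Scene();

  new THREE.ColladaLoader().load(config.collada, (collada) => {
    scene.add(collada.scene);

    // Each animated character, exported separately as a Three.js JSON
    // object, is loaded and placed back at its stored offset.
    // (In practice you would wait for every load before signalling ready.)
    const objectLoader = new THREE.ObjectLoader();
    config.characters.forEach((character) => {
      objectLoader.load(character.file, (characterObject) => {
        characterObject.position.fromArray(character.offset);
        scene.add(characterObject);
      });
    });

    onLoaded(scene);
  });
}
```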

The first attempt at exporting a character. Back to the drawing board.

After much R&D, and much back and forth between Maya and WebGL, we eventually ironed out the kinks and perfected a workflow for successfully moving everything between the two environments.

Ongoing challenges

The exporting issue was just one of the challenges we faced during the development process. Here are just a few others:

Instancing

In one of the scenes, the user flies through a forest. When we tried it initially, we exported each of the tree models individually, which made the scene 250MB. (Remember, our goal for the entire experience was 150MB).

Our solution was to replace the trees with simple cubes, each with its own position, rotation and scale. We could then export the three distinct tree models separately and, at run time, replace each cube with a tree model, copying across the cube’s position, rotation and scale. This reduced the scene from 250MB to just 3.8MB. Progress…

Instancing allows us to render multiple copies of the same mesh in a scene at once.
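
As a rough sketch of that swap (the placeholder naming convention and function name here are hypothetical):

```javascript
// Replace lightweight placeholder cubes with real tree meshes at run time,
// copying each cube's position, rotation and scale.
function replacePlaceholders(scene, treeModels) {
  const placeholders = [];

  scene.traverse((object) => {
    if (object.name.startsWith('tree_placeholder_')) {
      placeholders.push(object);
    }
  });

  placeholders.forEach((cube, i) => {
    // Cycle through the three exported tree models.
    const tree = treeModels[i % treeModels.length].clone();

    tree.position.copy(cube.position);
    tree.rotation.copy(cube.rotation);
    tree.scale.copy(cube.scale);

    scene.add(tree);
    scene.remove(cube); // the cube was only ever a stand-in
  });
}
```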

Animating opacity, shader colours and light intensity

We also faced problems exporting animations from Maya. Some scenes included lights flickering, objects fading in or out, or colours shifting from one shade to another. None of this detail could be exported in the COLLADA format.

Our solution was to develop an attribute and tagging system: a reference cube could be made in Maya, with a Y position that could be animated from 0–1. The cube was then given a target object that it could apply effects to. So, if the target object was a light, for example, we could animate the light’s intensity to make it fade in, flicker or fade out. If the target object was a shader, we could use the cube to animate its colour from white to black. Or, if the target was an object and had the opacity attribute, we could use the cube to make the object fade in or out. This gave the animators full control over the scene in Maya, meaning no code had to be written for bespoke effects applied to individual objects.

The movement of the cube affects the opacity of the object.
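
The runtime side of that tagging system might look something like the sketch below; the tag-to-target wiring is simplified and the property checks are illustrative.

```javascript
// Sketch: each frame, a tagged reference cube's animated Y value (0 to 1)
// drives its target object.
function applyReferenceCube(cube, target) {
  const value = Math.min(Math.max(cube.position.y, 0), 1);

  if (target.isLight) {
    // Fade, flicker or brighten the light.
    target.intensity = value;
  } else if (target.isMaterial) {
    // Blend the material colour between black (0) and white (1).
    target.color.setScalar(value);
    // Or, for transparent materials, drive the fade in/out.
    if (target.transparent) {
      target.opacity = value;
    }
  }
}
```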

Light rays

Our original plan had been to use volumetric lighting, but we quickly learned that this wasn’t going to be possible on mobile. Instead, pieces of geometry had to be made for the light rays and transparent textures applied to them to give the illusion of a beam of light. (Think 1998 game design.)
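
A minimal sketch of that trick, assuming a soft gradient texture that fades to transparent, might look like this:

```javascript
// A "light ray" faked with flat geometry and a transparent, additively
// blended texture, in the spirit of late-90s game tricks.
function createLightRay(texture) {
  const geometry = new THREE.PlaneGeometry(2, 12); // long, thin beam
  const material = new THREE.MeshBasicMaterial({
    map: texture,               // soft gradient fading to transparent
    transparent: true,
    blending: THREE.AdditiveBlending,
    depthWrite: false,          // avoid sorting artefacts against the scene
    side: THREE.DoubleSide
  });
  return new THREE.Mesh(geometry, material);
}
```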

Shadows

About a quarter of the way through the project, when scenes started getting more complex, we discovered that mobiles didn’t have the power to run real-time shadows. So instead, to keep mobiles running at optimal performance, we baked shadows into each scene using geometry.

Cameras

The cameras were a critical aspect of the project. Movement between the two “worlds” isn’t always straight up and down: sometimes we wanted to move around something in a scene, and to do that the animators needed fine-grained control over how the camera travelled between the worlds. Our solution was to introduce four camera paths: one at the top, one on the right (middle), one at the bottom and one on the left (middle). The camera (the screen of the user’s device) could then be moved between these paths based on the rotation of the mobile, or the scroll position on desktop.

An additional challenge was keeping the scene upright while the mobile screen was being rotated by the user. To do so, the camera also had to be rotated based on the angle of the phone’s rotation.
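
Roughly, the per-frame camera update could look like the sketch below; the path API, angle handling and focal point are simplified and illustrative.

```javascript
// Four camera paths (top, right, bottom, left) each give a point for the
// current moment t (0 to 1) in the scene; the device's rotation picks which
// two paths to blend between, and the camera counter-rolls to stay upright.
function updateCamera(camera, paths, t, deviceAngleDeg) {
  const angle = ((deviceAngleDeg % 360) + 360) % 360;
  const order = [paths.top, paths.right, paths.bottom, paths.left];

  const segment = angle / 90;                 // which quarter-turn we're in
  const from = order[Math.floor(segment) % 4];
  const to = order[Math.ceil(segment) % 4];
  const blend = segment - Math.floor(segment);

  // Each path is assumed to be a THREE.CatmullRomCurve3 (or similar).
  const position = from.getPoint(t).lerp(to.getPoint(t), blend);

  camera.position.copy(position);
  camera.lookAt(new THREE.Vector3(0, 0, 0));  // illustrative focal point

  // Counter-roll the camera so the scene stays upright while the phone
  // itself is being rotated in the user's hands.
  camera.rotation.z = -angle * Math.PI / 180;
}
```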

Pre-loading/rendering

With 13 different scenes, it was impossible to load the entire film into memory at the same time. When the project first started, individual scenes were therefore created on the fly, behind a “curtain of black”, enabling those scenes to fade back in once loaded. As soon as one scene was finished, it would be cleaned up, unloaded and destroyed before the next would load and the process would repeat.

Depending on the power of the device, it could take anywhere between 0.5–2 seconds for this to happen, which none of us were happy with. To solve the issue, we implemented a feature that enabled us to specify the scenes that we wished to “preload” in memory depending on the power of the device.

With this new feature, several things happen when a scene is pre-loaded: a separate instance of WebGL is created for it, the COLLADA and character files are loaded in and added to the scene, and the first frame is rendered — all during the first loading screen of the experience. When it’s time for a scene to show, the black curtain fades in and the previous scene’s WebGL canvas is hidden while the WebGL canvas belonging to the next scene is shown. The black curtain then fades out, revealing the new scene. This feature cut the time between scenes down to fractions of a second.
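
A stripped-down sketch of that approach (one renderer per preloaded scene, swapped behind a CSS curtain; all names are hypothetical):

```javascript
// Preload: give the scene its own renderer/canvas and render its first
// frame up front, during the initial loading screen.
function preloadScene(scene, camera, width, height) {
  const renderer = new THREE.WebGLRenderer({ antialias: false });
  renderer.setSize(width, height);
  renderer.domElement.style.display = 'none';
  document.body.appendChild(renderer.domElement);

  renderer.render(scene, camera);
  return renderer;
}

// Switch: fade the black curtain in, swap which canvas is visible, then
// fade the curtain out to reveal the new scene.
function switchScene(currentRenderer, nextRenderer, curtain) {
  curtain.classList.add('visible'); // curtain opacity animated via CSS
  curtain.addEventListener('transitionend', () => {
    currentRenderer.domElement.style.display = 'none';
    nextRenderer.domElement.style.display = 'block';
    curtain.classList.remove('visible');
  }, { once: true });
}
```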

Dev tools

We also developed bespoke dev tools for this project to allow us to fly around a scene in WebGL. It meant we could view the cameras to see what they were doing, interact with the lights, delete or move geometry, and deep dive into an individual object to play with its settings. This helped hugely when going back and forth with the animators to work out whether something was not exporting properly from Maya, or needed to be handled in a different way. These dev tools have been left in the experience as an “Easter Egg” — you can turn them on at any time in any scene simply by pressing the tilde key (~).

We developed bespoke dev tools to allow us to fly around the scene.
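
The toggle itself can be as simple as the sketch below; the debugMode flag and what it switches on are illustrative.

```javascript
// Pressing tilde (~) flips the scene into a free-fly debug mode.
let debugMode = false;

window.addEventListener('keydown', (event) => {
  if (event.key === '~' || event.key === '`') {
    debugMode = !debugMode;
    // When enabled, hand camera control to fly-around controls and expose
    // the lights and geometry for inspection.
  }
});
```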

The completed project

The live project.

Developing this experience has been an incredible journey in itself, forcing us to create a number of new development tools and approaches that we hope will be used to create more immersive experiences going forward. Rapid prototyping, R&D, failing fast and iterating have all been key to the success of this project.

Launching Side of Stage

Of course, Through the Dark isn’t just a technical exercise. We were always conscious that we were telling a real, human story about a father and son going through a profound life experience. The project was launched on 11th November 2016 at CanTeen in Sydney, an incredible charity that supports young people aged 12–25 when “cancer has turned your world upside down”. At the event, Hilltop Hoods and CanTeen launched Side of Stage, giving young people affected by cancer special access to live music shows across Australia.

For every download of the track, Google Play Music are donating $1 to CanTeen to help fund the initiative. For the R/GA Sydney team, it’s been awesome to be part of such an amazing project and to help further such a worthwhile cause.

We should also mention that Liam is now living cancer-free.

Where to try ‘Through The Dark’

You can experience ‘Through the Dark’ on high-end Android devices and on desktop by visiting throughthedark.withgoogle.com

And here’s a preview:

Acknowledgements

The project was an intense, collaborative effort over many, many months. Huge shout outs to Hilltop Hoods and the teams at Google, Universal Music, XYZ Studios, Exit Films, CanTeen, Poem and, of course, R/GA.

  • Written by Blake Kus, Associate Technology Director; Michael Armstrong, Creative Director; and Hamish Stewart, Executive Creative Director, R/GA Sydney.
