Pablo the Flamingo

Case study: The birth of a party animal

Pablo is a flamingo that loves life — as long as he’s dancing, that is. If you haven’t yet, you can catch him at home at pablotheflamingo.com.

Pablo was a side-project brought to life through a collaboration, art directed by Pascal Van der Haar and beautifully illustrated by Jono Yuen.

Several months ago, my mate Pascal told me about an idea he had to make a ‘one-page site of a flamingo head-banging to Eve’s Let Me Blow Ya Mind’ — the kind of thing that’s hard to refuse.

Fast-forward to today, and Pablo is alive and happily dancing until the end of his days. We’ve been absolutely blown away by the response: in his first two weeks, Pablo was seen by over 60,000 people worldwide and awarded Site of the Day at both the FWA and Awwwards. By far the most rewarding thing, however, has been reading about the individual experiences of how Pablo has brought a brief moment of joy to so many people’s lives.

Here are a couple of tweets among many that brought a smile to our faces.

https://twitter.com/demoderapaz/status/520846578744328192
https://twitter.com/bedlogic/status/519573162669789186

I’m going to delve into a few of the technical challenges, and share in detail how each was overcome. These include:

  1. Making Pablo dance
  2. Applying movement to visuals
  3. Animating Pablo’s facial expressions
  4. Interacting with Pablo

So without further ado, let’s get stuck in!

Challenge 1: Making Pablo dance

I wanted to find a way of animating Pablo using physics, allowing the user to interact with the animation and create their own unique experience.

An old experiment I made at ultranoir came to mind, in which I created a basic rig to control the deformation of an SVG flour sack. In this experiment the points are moved manually by an animator, but the concept is there.

I’d been waiting for an opportunity to apply this technique to something other than SVG, and thanks to the rigging support inside of Threejs, this could be done very simply in 3d. The idea would be to import a very simple, flat, rigged object of Pablo, and to animate the position of several joints/bones which then in turn control the vertices. In order to move these joints, I first needed to find a way to harness the physics-side of things.

I didn’t really want to import Box2d — a very reliable, but heavy 2d physics engine — for such a simple project. I instead looked at P2.js, which is a good 2d physics engine written in javascript. The creator had made this example of a lock constraint which looked like it could give me the kind of ‘bounce’ movement I was after. But after struggling for a while, I let it go, and actually didn’t touch the project for a few months.

It wasn’t until I came across Matterjs that I used Pablo as an excuse to familiarise myself with a new library, and pick up from where I’d left off.

I really loved Matterjs’ cleanliness of code and simplicity. It even comes with a pixijs renderer to let you get straight into trying things out.

For Pablo, I only needed very simple functionality, and after seeing the example on soft-bodies, I was inspired to get cracking on the project. The soft bodies bounced in a way that I found would work perfectly for Pablo.

I animated the soft bodies by changing the global gravity over time, in both the x and y axes. Making the gravity change from negative to positive made the bodies bounce up and down, back and forth. The effect wasn’t exactly what I had envisioned, but it was still making me giggle. Quite a lot. I tweaked these values and the shape of the soft-bodies until I was happy.

Below is a CodePen showing exactly the code that went into achieving this effect.

The most important part to note is the one that affects the gravity. It can be summed up in the following line, found within the tick function, which is called every frame.

gravity.y = yScale * Math.sin(20 * timestamp / (Math.PI * timing));

It uses a sine curve to animate the value between +yScale and -yScale over a period defined by the timing variable. I used this variable to sync the movement with the beat of the song; for this track, the value was 675 milliseconds. timestamp is a counter holding the number of milliseconds since the page initialised.
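As an aside on where a number like 675 comes from: it’s simply the length of one beat in milliseconds, so if you know a track’s tempo you can derive it. This little helper is mine, for illustration only:

```javascript
// Milliseconds per beat for a given tempo in BPM.
function beatMs(bpm) {
  return 60000 / bpm;
}
```

For example, beatMs(120) gives 500 ms per beat.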

To add a little more variety, I also animated the scale variables (like yScale), making his dance change over time.
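To give an idea of how this all fits together, here’s a minimal sketch of that tick logic. The engine variable and the slowly drifting xScale/yScale values are my assumptions for illustration, not the exact production code:

```javascript
// Gravity for a given frame: a sine wave oscillating between +scale
// and -scale, completing its cycle in time with the song's beat.
function gravityValue(timestamp, scale, timing) {
  return scale * Math.sin(20 * timestamp / (Math.PI * timing));
}

// Called every frame. `engine` would be the Matter.js engine instance;
// the drifting xScale/yScale values are hypothetical, chosen here just
// to show how the dance can change over time.
function tick(timestamp) {
  var timing = 675; // one beat of the song, in milliseconds
  var yScale = 1 + 0.5 * Math.sin(timestamp / 5000);
  var xScale = 0.3 * Math.sin(timestamp / 7000);
  engine.world.gravity.x = gravityValue(timestamp, xScale, timing);
  engine.world.gravity.y = gravityValue(timestamp, yScale, timing);
}
```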

If you add ‘?dev’ to the URL of the website, you can see exactly how this is implemented. Take note of the changing x and y values in the controls. These are the x and y scale values mentioned above.

Challenge 2: Applying movement to visuals

For the creation of Pablo himself, I received some gorgeous illustrations from Jono. I then started translating them to 3d using Autodesk Maya.

Here you can see the basic topology of Pablo’s final model, with the illustration applied as a texture.

It was important to keep the poly count as low as possible as I wanted the project to load quickly and animate at 60fps, even on devices.

Next I created the rig/skeleton by first placing the joints, then selecting both the joints and the mesh and applying the automatic smooth-bind algorithm.

This attaches the joints to the vertices, allowing me to move only the joints in order to animate the mesh.

I then came upon a tricky problem — how to translate the positions of the soft-body particles correctly to the joints. Looking at the first two images below, you can see that the shapes don’t really match up. I tried to apply approximate offsets, but it was still morphing Pablo’s body out of shape.

I solved this by editing the starting positions of the joints, as is shown in the right image. Because I had applied the skin algorithm while the joints were in their correct positions (as shown left), after moving their initial locations, they were still controlling the same vertices. I was then able to create a precise form that could be directly mapped to the physics base. This gave me the results that I was after, and was the largest ‘aha!’ moment of the whole project — finally seeing Pablo dancing instead of some particles was rewarding.

Another difficulty with the particles was transferring their rotation values to the joints, as the particles themselves never actually rotate. This was solved by looking at two particles at a time and taking the angle between them.
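The getAngle helper used for this isn’t shown here, but a sketch of it (my reconstruction, assuming Matter.js bodies exposing a position property) just wraps Math.atan2 on the vector between the two particles:

```javascript
// Angle (in radians) of the vector pointing from particle a to particle b.
// Each particle is a physics body exposing { position: { x, y } }.
function getAngle(a, b) {
  return Math.atan2(b.position.y - a.position.y, b.position.x - a.position.x);
}
```

Depending on the coordinate conventions, the sign may need flipping, just as the y position is negated when it’s copied across to the bones.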

The code responsible for this starts by defining which particles are to be referenced for their position and rotation.

var points = [
  neck.bodies[9],
  neck.bodies[5],
  neck.bodies[1],
  head.bodies[4],
];
var rotPoints = [
  [neck.bodies[8], neck.bodies[9]],
  [neck.bodies[4], neck.bodies[5]],
  [neck.bodies[0], neck.bodies[1]],
  [head.bodies[0], head.bodies[4]],
];

Then, on every frame, these positions are transferred to each of the bones.

for (var i = 0; i < bones.length - 1; i++) {
  bones[i].position.x = points[i].position.x;
  bones[i].position.y = -points[i].position.y;
  bones[i].rotation.z = getAngle(rotPoints[i][0], rotPoints[i][1]);
}

Challenge 3: Animating Pablo’s facial expressions

To give Pablo the power to express himself, we needed to find a way to animate his facial expressions. Jono had illustrated a range of emotions, so I needed to find a way to interchange between them.

I used a canvas texture, allowing me to redraw the eyes and mouth separately as necessary, without having to load the entire image for each combination.

First, we create our canvas in javascript.

var canvasMat = document.createElement('canvas');
var ctxMat = canvasMat.getContext('2d');
canvasMat.width = 1024;
canvasMat.height = 1024;

Then convert this canvas to a Threejs Texture.

var canvasTexture = new THREE.Texture(canvasMat);

Then create a Threejs Material, which we can apply directly to our mesh. Note that skinning must be enabled to allow the bone displacement to be visible.

var canvasMaterial = new THREE.MeshBasicMaterial({
  map: canvasTexture,
  skinning: true,
  transparent: true,
});

Here is the list of images that were used on Pablo.

var textureImages = [
  'pablo.png',             // 0
  'pablo-eye_angry.png',   // 1
  'pablo-eye_open.png',    // 2
  'pablo-eye_cute.png',    // 3
  'pablo-eye_dazed.png',   // 4
  'pablo-eye_look.png',    // 5
  'pablo-eye_rock.png',    // 6
  'pablo-eye_shock.png',   // 7
  'pablo-mouth_frown.png', // 8
  'pablo-mouth_smile.png', // 9
];

Below is the function that takes care of updating the canvas. It takes two arguments: the indices of the eye and mouth images, referring to the array order above.

var drawCanvas = function(eye, mouth) {
  ctxMat.clearRect(0, 0, 1024, 1024);
  ctxMat.drawImage(loadedImgs[0], 0, 0, 1024, 1024);
  ctxMat.drawImage(loadedImgs[eye], 637, 100, 100, 137);
  ctxMat.drawImage(loadedImgs[mouth], 310, 100, 313, 313);
  canvasTexture.needsUpdate = true;
};

I first clear the canvas, draw the base image, then draw the chosen eye and mouth in the specific regions. Finally I flag that the Threejs texture has updated.

So for example, if I wanted to show Pablo with an angry eye and frown, it would be as simple as calling the following.

drawCanvas(1, 8);
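Those magic numbers are easy to mix up, so a tiny lookup helper makes the call self-documenting. This helper is hypothetical, not from the production code:

```javascript
var textureImages = [
  'pablo.png',             // 0
  'pablo-eye_angry.png',   // 1
  'pablo-eye_open.png',    // 2
  'pablo-eye_cute.png',    // 3
  'pablo-eye_dazed.png',   // 4
  'pablo-eye_look.png',    // 5
  'pablo-eye_rock.png',    // 6
  'pablo-eye_shock.png',   // 7
  'pablo-mouth_frown.png', // 8
  'pablo-mouth_smile.png', // 9
];

// Look up an image's index by filename, so expressions can be chosen by name.
function textureIndex(name) {
  return textureImages.indexOf(name);
}
```

With this, drawCanvas(textureIndex('pablo-eye_angry.png'), textureIndex('pablo-mouth_frown.png')) would be equivalent to drawCanvas(1, 8).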

Challenge 4: Interacting with Pablo

I really wanted the user to be able to interact with Pablo directly, playing with the physics and moving him around. I also thought it would be hilarious that, upon the user pausing the music, Pablo would become distraught, restart it himself, and then continue to dance. In a way this was giving some life to the character, who clearly doesn’t enjoy you breaking his groove.

The first step is to be able to know when the user clicks on Pablo. Inside Threejs there is a Projector object that allows you to project a ray into the scene upon mouse click and tell you if and where it intersected with any objects. Unfortunately, if you use a SkinnedMesh — which was the case with Pablo — the rig displacement is applied in the shader, so the intersection doesn’t take into account the final placement of Pablo on the screen. Poop.

Many thanks to @nicoptere, who suggested this solution: render the current frame to a texture (like taking a snapshot) and then check the pixel data against the mouse coordinates to determine whether Pablo was clicked.

Below you can find a GitHub Gist that I made for this exact technique. It’s the very same code that I used for Pablo.

https://gist.github.com/gordonnl/5bf38741a35e6ecce332

On line 31, the query returns true or false depending on whether the click was on Pablo. While this is great, it doesn’t tell us where on the object the click landed, which is the downfall of this technique. To overcome this, I simply checked whether the click was made in the top or bottom half of the screen, and then applied a constraint to either the head or the neck. This wouldn’t work in all cases, but did just fine for Pablo.
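Both halves of that logic are simple to express. The names below are mine, and the details are a sketch assuming the snapshot’s RGBA pixel data has been read back into a flat array:

```javascript
// True if the snapshot pixel under the mouse is non-transparent,
// i.e. the click landed on Pablo rather than the background.
function isPabloPixel(pixels, x, y, width) {
  var alpha = pixels[(y * width + x) * 4 + 3];
  return alpha > 0;
}

// Decide which body part the click should grab, based only on
// whether it landed in the top or bottom half of the screen.
function partForClick(clickY, screenHeight) {
  return clickY < screenHeight / 2 ? 'head' : 'neck';
}
```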

I’m very aware that Pablo is breakable upon some outrageous interactions — and although this is undesirable, it’s just another limit of the technique I put in place — plus it’s kind of funny anyway.

Fin

So there you have it! A lot of insight into one of the most fun and rewarding projects that I’ve been a part of. Thanks again to my collaborators, Pascal Van der Haar and Jono Yuen.

As a final note, we wanted to let you know that this project was not at all initiated by the World Wide Fund for Nature. Pascal has been a big fan for a long time and thought that adding a link for adopting a flamingo would give the project more meaning and be a fitting touch. I’m still unsure if they’re even aware of the project, but I like to imagine that they’re sitting around scratching their heads as to why 50,000 flamingos have just been adopted.
