Ocean School — Technical Case Study

AKFN Studio
Nov 29, 2018 · 9 min read

Introduction

In early 2018, we partnered with the National Film Board of Canada and Dalhousie University to build an educational platform for students across Canada, to help them better understand the ocean and our relationship with it. To do so, we faced multiple challenges:

  • Raise students’ awareness through interactive content like 360° videos and games.
  • Provide a platform that could be experienced both at school and at home.
  • Create an experience that could work on a variety of devices available in schools (mainly tablets such as iPads or Chromebooks).

1. Building the platform

The architecture of Ocean School is broken down into units, each identified with a specific location on the globe. Each unit contains different modules, each with a focus question for students to concentrate on. Finally, these modules are populated with several hotpoints listing multiple activities such as 360° videos, interactive games and newsreels; these are referred to as learning objects.

Although we’ve built a custom JavaScript toolset that answers the needs of the variety of projects we work on, this specific project required a different approach. We needed a stack that could scale as the project grows in the future (more units, more modules). We were also looking for something easily reusable, given the similar behaviours and UI elements (buttons, overlays, infoboxes…) across the platform. A component-based framework seemed to be the solution. We finally chose React because of its quick learning curve when already working with ES6 classes, its performance-focused roadmap, and its worldwide community.

React comes with a handy local state management system, but we needed something more global, particularly for the complex authentication flow of the app. Focusing on schools, Ocean School introduces a system of crews: a crew is a group of students with their own NFB accounts linked under one name and one flag (more on that later). The experience is also available with a single NFB account. We used Redux to handle the complex data flows across the app.
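
As a rough illustration of that global state, the crew could live in its own Redux slice; the shape and action names below are hypothetical, not the actual store of the app.

import { createStore } from 'redux';

// hypothetical crew slice: one name, one flag, several linked NFB accounts
const initialState = { name: null, flag: null, members: [] };

function crewReducer(state = initialState, action) {
  switch (action.type) {
    case 'CREW_CREATED':
      return { ...state, name: action.name, flag: action.flag };
    case 'MEMBER_JOINED':
      return { ...state, members: [...state.members, action.account] };
    default:
      return state;
  }
}

const store = createStore(crewReducer);
store.dispatch({ type: 'CREW_CREATED', name: 'Neptune', flag: { shape: 'triangle' } });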

For styling, we used a mix of styled-components and Sass, based on a benchmark we ran on the different players in the CSS-in-JS world.

With NFB’s team of editors regularly shooting and writing new content, we needed an easy-to-use CMS from which React could fetch the data. Having used and enjoyed Craft CMS on a few previous projects, we paired it with the Element API extension for a true headless CMS that could provide different endpoints for data consumption. To speed up the iterative frontend process, we made it as generic as possible so that new fields in the CMS would automatically be available in the endpoints.
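
On the React side, consuming one of these endpoints is a plain fetch; the route below is made up for illustration, as the real ones depend on the Element API configuration.

// hypothetical endpoint exposed by the Element API extension
fetch('/api/units/atlantic.json')
  .then((response) => response.json())
  .then((unit) => store.dispatch({ type: 'UNIT_LOADED', unit }));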

At the very beginning of the experience, users are invited to create their crew and custom flag, based on different properties: shape, patterns, icon and colours.

The flag creator is built upon SVGs first drawn into an offscreen 2D canvas. The canvas is then used as a texture applied to a cloth simulation built in WebGL with the THREE.js library. This technique allows us to render detailed illustrations on high pixel-ratio devices. It also made for an easy fallback for people without WebGL, as we only had to render the source 2D canvas.
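
A minimal sketch of the canvas-to-texture step, assuming the flag has already been rasterized into flagCanvas; the cloth simulation itself is left out.

// use the offscreen 2D canvas as a texture on a subdivided plane
const flagTexture = new THREE.CanvasTexture(flagCanvas);
const flagMesh = new THREE.Mesh(
  new THREE.PlaneGeometry(2, 1.2, 30, 20),
  new THREE.MeshBasicMaterial({ map: flagTexture, side: THREE.DoubleSide })
);
scene.add(flagMesh);

// whenever the flag is redrawn in 2D, mark the texture for re-upload to the GPU
flagTexture.needsUpdate = true;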

The background of the whole app is built with THREE.js and a particle system, which falls back nicely to DOM elements and JavaScript animations when WebGL is not available or disabled.

Background with WebGL enabled
Background with WebGL disabled

2. Crafting the worlds of Ocean School

One of the bigger challenges of the project was developing the crossroads of a unit: small worlds where students can choose a particular subject matter. We needed the worlds to be visually attractive and also easy to adapt, knowing we’d have many modules to deal with later.

The worlds needed to run smoothly on low-end devices, as schools are not always equipped with the latest tablets.
The first thing to keep in mind was keeping the total triangle count as low as possible, avoiding the multiplication of textures across the scene, and reducing the number of draw calls.

Once the base of the world was done, we created tools for our design team so they could quickly iterate on the rendering of the terrain by testing different textures and height maps directly in the browser.

Heightmap and diffuse textures used to generate terrains

Once the textures were designed, the production team was able to download a generated geometry of the terrain, to avoid having to read a height map at runtime in the vertex shader.
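
A minimal sketch of that baking step, assuming the height map has been drawn into a 2D canvas; sizes and variable names are illustrative.

// displace a plane with the height map, then export the result so nothing is computed at runtime
const size = 256;
const pixels = heightmapContext.getImageData(0, 0, size, size).data;
const geometry = new THREE.PlaneBufferGeometry(100, 100, size - 1, size - 1);
const positions = geometry.attributes.position.array;

for (let i = 0; i < size * size; i++) {
  const height = pixels[i * 4] / 255; // red channel, 0..1
  positions[i * 3 + 2] = height * 10; // displace along the plane normal
}

// the baked geometry can then be serialized and downloaded as a file
const baked = JSON.stringify(geometry.toJSON());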

For vegetation, we provided another tool allowing the design team to handle positions, densities, and two types of trees. We encoded the areas where trees could grow in the red channel, the density in the green channel (the greener, the denser the area), and the probability of the tree type in the blue channel (255 on the blue channel means a probability of 1 for the second type of tree, 128 a probability of 0.5).

Trees map used to generate vegetation

Once again, to avoid scripting at runtime, we downloaded a generated JSON listing all the positions of the trees, which would later be loaded into a THREE.Geometry and rendered as THREE.Points.
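
A rough sketch of how such a map could be decoded into that JSON; the channel thresholds and names below are assumptions, not the exact rules of the tool.

// read the RGB trees map and turn it into a list of tree positions
const data = treeMapContext.getImageData(0, 0, mapSize, mapSize).data;
const trees = [];

for (let i = 0; i < mapSize * mapSize; i++) {
  const [r, g, b] = [data[i * 4], data[i * 4 + 1], data[i * 4 + 2]];
  if (r === 0) continue;                 // red: no tree can grow here
  if (Math.random() > g / 255) continue; // green: density, the greener the denser
  trees.push({
    x: (i % mapSize) / mapSize,
    y: Math.floor(i / mapSize) / mapSize,
    type: Math.random() < b / 255 ? 1 : 0 // blue: probability of the second tree type
  });
}

// `trees` is exported as JSON and loaded as-is at runtime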

For hotpoints, we have two representations, a flag or a buoy, depending on whether the hotpoint is on land or not. Flags are procedurally generated from an association of THREE.js primitives like planes and cylinders, while the buoy geometry was modeled in Cinema 4D and then tinted directly in the fragment shader. Both types of hotpoints are then animated in the vertex shader to create the sensation of floating in the wind or on water.

To improve performance, we use instanced geometries to render multiple objects in a single draw call.

Flag vertex shader

// offset attribute for each instance
attribute vec3 aOffset;

Buoy vertex shader

// offset attribute for each instance
attribute vec3 aOffset;
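
As a sketch, here is how the aOffset attribute above could be fed from JavaScript; names like flagShaderMaterial are illustrative, not the actual production code.

// one flag geometry, instanced once per hotpoint
const baseGeometry = new THREE.PlaneBufferGeometry(1, 0.6, 8, 4);
const instancedGeometry = new THREE.InstancedBufferGeometry().copy(baseGeometry);

const offsets = new Float32Array(hotpoints.length * 3);
hotpoints.forEach((hotpoint, i) => offsets.set([hotpoint.x, hotpoint.y, hotpoint.z], i * 3));
// addAttribute was later renamed setAttribute in THREE.js
instancedGeometry.addAttribute('aOffset', new THREE.InstancedBufferAttribute(offsets, 3));

// a single draw call renders every flag; the vertex shader adds aOffset to each vertex
scene.add(new THREE.Mesh(instancedGeometry, flagShaderMaterial));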

Similar techniques were used to render the birds, which are simply two triangles animated in the vertex shader and colourized in the fragment shader.

After applying all of these techniques, we noticed we still had not reached a perfect and stable framerate. After much research, textures turned out to be the issue. We first tried to reduce their number by merging textures together when we could and recomputing UVs directly in the vertex shader.

Four cloud textures merged into a single one

attribute float aTextureIndex; // either 0, 1, 2 or 3
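
In the vertex shader, that index shifts the UVs into the matching region of the atlas. The snippet below assumes a 2×2 layout, which may differ from the actual asset; the built-in uv, position and matrices are injected by THREE.ShaderMaterial.

// vertex shader (as a JS template string) remapping UVs into one quadrant of the atlas
const cloudVertexShader = `
  attribute float aTextureIndex; // either 0, 1, 2 or 3
  varying vec2 vUv;
  void main() {
    vec2 atlasOffset = vec2(mod(aTextureIndex, 2.0), floor(aTextureIndex / 2.0));
    vUv = (uv + atlasOffset) * 0.5; // remap the 0..1 uv into one quarter of the texture
    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
  }
`;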

Finally, we reached our target FPS by using a compressed texture format (PVRTC) on iOS devices, created with the open-source texture-compressor made by Tim Van Scherpenzeel.
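
At runtime, such textures could be consumed with the PVRLoader that ships with the THREE.js examples, falling back to regular PNGs elsewhere; paths and material names below are illustrative.

// use PVRTC where the GPU supports it (iOS), otherwise load the uncompressed fallback
const supportsPVRTC = !!renderer.extensions.get('WEBGL_compressed_texture_pvrtc');
const loader = supportsPVRTC ? new THREE.PVRLoader() : new THREE.TextureLoader();

loader.load(supportsPVRTC ? 'textures/clouds.pvr' : 'textures/clouds.png', (texture) => {
  cloudMaterial.uniforms.map.value = texture;
});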

As a final touch, we added custom objects into each world like a boat, a whale or a lighthouse. In the end, the worlds look like this:

Final worlds

3. Going deeper underwater

One of the learning objects is called Bay Watch. The goal of this game is to show the impact of agriculture and oyster aquaculture on the ocean by modifying the levels of both and simultaneously seeing the effects by diving underwater.

The above-ground part reused the techniques applied to the worlds, with controls that made the fields grow or changed the number of oyster platforms in the water.

The field was made with a particle system and a texture representing the different stages of growth, randomly computed on control changes.

Different stages of field growth in a single texture

Oyster platforms are a combination of THREE.js geometry primitives merged and exported as a single geometry file to avoid scripting on load, while the cows are low-poly objects. Both are then instanced to decrease the total number of draw calls.

The underwater world was the most challenging part of the game. We had to render algae like eelgrass and ulva, fish, rocks, and light rays, all of which would evolve as the user changed the controls.

Eelgrass, given the quantity needed, was the most demanding part. We used a simple triangle as the base geometry and instanced it. However, instancing breaks the built-in frustum culling THREE.js normally relies on to improve performance. We finally implemented a custom one, combined with a pool of instances, to display as much eelgrass as possible in the viewport while removing the blades behind the user. You can read more about the whole process in the Twitter thread we posted about it.
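
A simplified sketch of that per-instance culling, assuming each blade has a known world position; the instance pooling bookkeeping is left out.

// rebuild the camera frustum each frame and keep only the visible blades
const frustum = new THREE.Frustum();
const projScreenMatrix = new THREE.Matrix4();

function cullEelgrass(camera, bladePositions, offsetAttribute) {
  projScreenMatrix.multiplyMatrices(camera.projectionMatrix, camera.matrixWorldInverse);
  frustum.setFromMatrix(projScreenMatrix); // setFromProjectionMatrix in newer THREE.js

  let visibleCount = 0;
  for (let i = 0; i < bladePositions.length; i++) {
    if (!frustum.containsPoint(bladePositions[i])) continue;
    const p = bladePositions[i];
    offsetAttribute.setXYZ(visibleCount++, p.x, p.y, p.z);
  }
  offsetAttribute.needsUpdate = true;
  // the instanced geometry then only draws `visibleCount` instances
  return visibleCount;
}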

For the fish, we were able to display two different species, mummichogs and sticklebacks, using the same base geometry, instanced with two different textures and custom random ranges for scale and speed attributes.
Swim animations are computed directly in the vertex shader, based on the techniques explained by Matt Nava in his talk about the game ABZÛ.

Fish vertex shader

uniform float maxZ; // maximum z position on the model
uniform float minZ; // minimum z position on the model
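
Roughly, the idea is to bend the model along its length with a sine wave whose amplitude grows toward the tail. The sketch below is a reading of that technique in a THREE.ShaderMaterial, not the actual production shader.

const fishMaterial = new THREE.ShaderMaterial({
  uniforms: {
    time: { value: 0 },
    minZ: { value: -1 }, // minimum z position on the model
    maxZ: { value: 1 }   // maximum z position on the model
  },
  vertexShader: `
    uniform float time;
    uniform float minZ;
    uniform float maxZ;
    void main() {
      float tail = (position.z - minZ) / (maxZ - minZ); // 0 at minZ, 1 at maxZ
      vec3 transformed = position;
      // side-to-side wave, stronger toward the tail
      transformed.x += sin(position.z * 4.0 + time * 6.0) * 0.15 * tail;
      gl_Position = projectionMatrix * modelViewMatrix * vec4(transformed, 1.0);
    }
  `,
  fragmentShader: 'void main() { gl_FragColor = vec4(1.0); }'
});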

Movements of each instance in the water are a basic physics simulation (acceleration, velocity, position) applied every X milliseconds to the fish position.

Pseudo code of the movement of a fish instance

let rotation = new THREE.Matrix4();
let orientation = new THREE.Quaternion();
let velocity = new THREE.Vector3();
let timeElapsed = 0;
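
Fleshed out a little, continuing from the declarations above; the steering interval and magnitudes are arbitrary, and fish stands for whatever per-instance object (an Object3D in this sketch) holds position and orientation.

const acceleration = new THREE.Vector3();

function updateFish(fish, delta) {
  timeElapsed += delta;
  if (timeElapsed > 500) {
    // every X milliseconds, nudge the fish toward a new random direction
    acceleration.set(Math.random() - 0.5, Math.random() - 0.5, Math.random() - 0.5).multiplyScalar(0.02);
    timeElapsed = 0;
  }

  velocity.add(acceleration).clampLength(0, 0.1);
  fish.position.add(velocity);

  // orient the fish along its velocity
  rotation.lookAt(new THREE.Vector3(), velocity, fish.up);
  orientation.setFromRotationMatrix(rotation);
  fish.quaternion.slerp(orientation, 0.1);
}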

The game can also be enjoyed in VR and on mobile, with a main player controlling the ground data and other players connecting through socket.io to dive underwater on a Google Cardboard.
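
A minimal sketch of that synchronisation with socket.io; the namespace, event name and handlers are made up for illustration.

const socket = io('/baywatch');

// main player: broadcast the agriculture and aquaculture levels on every change
onControlsChange((levels) => socket.emit('controls', levels));

// Cardboard players: apply the received levels to the underwater scene
socket.on('controls', (levels) => underwaterScene.setLevels(levels));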

Underwater world

Conclusion

Ocean School is without a doubt the biggest project ever completed by our studio, and we are extremely proud of the result. There is no better feeling than watching kids in class having fun 😀. You can discover the trailer and give the experience a try here:

https://oceanschool.nfb.ca/

We would love to hear your feedback!

Written by AKFN Studio, a design and interactive experience studio.
