
How We Built a Playful WebGL Experience for 100 FWA Wins


By Amélie Rosser, Senior Creative Developer, Jam3

To commemorate 100 FWA wins, we created a WebGL experience to celebrate this milestone with artistic expression and fun.

We asked ourselves: how can we tell Jam3's story through meaningful and playful interactions?

To answer that, we spent a week brainstorming and concepting as a group to formulate an idea that we would then pitch to our Creative Directors.

During our concepting phase we came up with three pillars that would define the tone of this experience:

  • Playfulness
  • Curiosity
  • Gratitude

We divided the experience into three sections: Past, Present and Future.

Past

The Past section creatively explored Jam3's beginnings.

A mood board for Past

Present

The Present section would show how we’ve evolved as a company.

A mood board for Present

Future

Our future is unknown and unwritten, but we definitely see it as something bright, artistic and beautiful.

A mood board for Future

Early Prototypes

Prototypes are a key part of the discovery phase; they help show us what works and what doesn’t. With playfulness as our main driver for the interactions, our aim was to create something fun and unexpected for the user.


Framework

We used our internal React and WebGL framework as the foundation for the website. This allows our frontend and creative developers to work together using a unified codebase.

Technical challenges

Each interaction brought its own set of unique challenges.

For the Past interaction we wanted to have a rock that would shatter into thousands of crystals.

Rather than using standard geometry shapes, we opted to create our own. We used a generative PyMEL script in Maya to create high- and low-poly variations of the rock. The low-poly version (1,920 triangles) would be used in WebGL, and the high-poly version (1,966,080 triangles) for generating normal and ambient occlusion maps in Mudbox.


With the model made and the textures generated, the next step was to shatter the rock in real time. To achieve this we created another mesh composed of instanced tetrahedra, each aligned with a face of the rock geometry.
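Aligning an instance with each face boils down to computing a position and orientation per triangle. A minimal sketch of that per-face maths (not Jam3's production code, which lives inside three.js instanced attributes):

```javascript
// Sketch: derive one instance transform per triangular face of the rock.
// Each tetrahedron sits at the face centroid, oriented along the face normal.
function faceInstance(a, b, c) {
  const centroid = [
    (a[0] + b[0] + c[0]) / 3,
    (a[1] + b[1] + c[1]) / 3,
    (a[2] + b[2] + c[2]) / 3,
  ];
  // Face normal = normalized cross product of two edge vectors
  const e1 = [b[0] - a[0], b[1] - a[1], b[2] - a[2]];
  const e2 = [c[0] - a[0], c[1] - a[1], c[2] - a[2]];
  const n = [
    e1[1] * e2[2] - e1[2] * e2[1],
    e1[2] * e2[0] - e1[0] * e2[2],
    e1[0] * e2[1] - e1[1] * e2[0],
  ];
  const len = Math.hypot(n[0], n[1], n[2]) || 1;
  return { centroid, normal: n.map((v) => v / len) };
}

// Example: a face lying flat in the XZ plane faces up (+Y)
const inst = faceInstance([0, 0, 0], [0, 0, 1], [1, 0, 0]);
```

The centroid and normal would then be written into per-instance buffer attributes so every tetrahedron can be placed in the vertex shader.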


When the user draws on the surface of the rock, we deform the mesh in real time. An offscreen canvas paints brush strokes wherever the user touches the surface of the mesh, and the rock’s vertex shader offsets each vertex along its normal to extrude the rock outwards.

To create a more jagged surface we sample the ambient occlusion texture, using the brightness to mask out areas where the crystals will and won’t extrude.
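The combination of brush value and AO brightness reduces to a simple product in the shader. A sketch of that displacement logic, written in JS for clarity (the real work happens in GLSL; the 0..1 ranges and the direction of the AO mask are assumptions):

```javascript
// `brush` is the value painted into the offscreen canvas at this vertex's
// UV, `ao` is the sampled ambient occlusion brightness; both assumed 0..1.
function extrudeAmount(brush, ao, maxExtrude) {
  // AO brightness masks the brush: dark crevices extrude less, bright
  // areas extrude fully, producing a jagged crystal surface.
  return brush * ao * maxExtrude;
}

// In the vertex shader this becomes:
//   newPosition = position + normal * extrudeAmount(brush, ao, maxExtrude)
const amount = extrudeAmount(1.0, 0.5, 2.0);
```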


The last step was to shatter the rock. Since the rock’s vertices are changed dynamically in its vertex shader, we needed to re-compute the positions of the instanced tetrahedra.

To get those positions we rendered the rock with a height shader into a 128x128 render target, packing the absolute world positions into RGB values in the 0–1 range.


To extract the positions from the render target, we looked up each texel by UV coordinate and then remapped the range back into world coordinates.
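The pack/unpack pair is a linear remap between the rock's world-space bounds and the 0–1 texture range. A minimal sketch, assuming an axis-aligned bounding box (the bounds values here are made up for illustration):

```javascript
// Assumed world-space bounds of the rock
const bounds = { min: [-2, -2, -2], max: [2, 2, 2] };

// Pack: world position -> 0..1 rgb, as written by the height shader
function pack(p) {
  return p.map((v, i) =>
    (v - bounds.min[i]) / (bounds.max[i] - bounds.min[i]));
}

// Unpack: rgb read back from the render target -> world position
function unpack(rgb) {
  return rgb.map((v, i) =>
    v * (bounds.max[i] - bounds.min[i]) + bounds.min[i]);
}

const packed = pack([1, 0, -2]);
const world = unpack(packed); // round-trips back to [1, 0, -2]
```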


We really wanted something playful and tangible for the Present interaction. Our early experiments used 2D physics engines such as Planck.js, but these tests proved too limiting and our search continued.


We came across Oimo.js, a lightweight 3D physics engine. It didn’t take long until we discovered distance joints, which connect two bodies at a target distance. After we applied some spring to the joint, the spheres would collide naturally, creating a tangible motion.
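To illustrate why a springy distance joint settles into that bouncy, natural motion, here is a minimal 1D damped-spring integration sketch in plain JS. This is not Oimo.js code; the constants and the semi-implicit Euler integrator are assumptions for illustration:

```javascript
// Two unit-mass bodies connected by a damped spring "distance joint".
function step(bodies, restLength, stiffness, damping, dt) {
  const [a, b] = bodies;
  const delta = b.x - a.x;
  const dist = Math.abs(delta);
  const dir = Math.sign(delta) || 1;
  // Hooke's law plus damping of the relative velocity along the joint
  const force =
    stiffness * (dist - restLength) - damping * (a.v - b.v) * dir;
  a.v += force * dir * dt;
  b.v -= force * dir * dt;
  a.x += a.v * dt; // semi-implicit Euler: positions use updated velocities
  b.x += b.v * dt;
}

const bodies = [{ x: 0, v: 0 }, { x: 3, v: 0 }];
for (let i = 0; i < 2000; i++) step(bodies, 1, 10, 2, 1 / 60);
// After settling, the bodies sit roughly restLength (1) apart.
```

With many spheres joined this way, collisions push them apart while the joints pull them back, which is what gives the cluster its tangible wobble.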


To make this interaction as performant as possible we used instanced sphere geometry. Each sphere had unique attributes for its size and material type.

With this approach we created a single material that featured four material types, shadow maps and realtime reflections.


To add a little realism we enable realtime reflections on higher-end devices.

Since it would be too costly to render reflections from each sphere’s position, we opted to render the scene from the center into a cubemap.

The downside is that every sphere shares the same reflection.


To work around this we used a Lambert-style falloff to fade reflections on surfaces facing away from the scene center. The white areas on the spheres reflect the cubemap.
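The falloff is essentially a clamped dot product between the surface normal and the direction to the scene center. A sketch of the idea in JS (in production this would live in the fragment shader):

```javascript
function dot3(a, b) { return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]; }
function normalize(v) {
  const l = Math.hypot(v[0], v[1], v[2]) || 1;
  return v.map((x) => x / l);
}

// 1 when the surface faces the scene center, 0 when it faces away.
// Multiplying the cubemap sample by this mask hides the fact that
// every sphere shares one centered cubemap.
function reflectionMask(surfacePos, surfaceNormal, sceneCenter) {
  const toCenter = normalize([
    sceneCenter[0] - surfacePos[0],
    sceneCenter[1] - surfacePos[1],
    sceneCenter[2] - surfacePos[2],
  ]);
  return Math.max(dot3(surfaceNormal, toCenter), 0);
}

const facing = reflectionMask([1, 0, 0], [-1, 0, 0], [0, 0, 0]);
const away = reflectionMask([1, 0, 0], [1, 0, 0], [0, 0, 0]);
```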

The final reflection

Our initial vision for the Future interaction was to connect dots together; each time, a new dimension would appear orthogonal to the user.

This, however, lacked the immersion and depth the previous two interactions offered.


After brainstorming we came up with another idea: what if the user were to fly through a vortex of space into the future? Naturally this worked well, since the FWA 100 logo features an infinity symbol. With that in mind, we needed to create the geometry.

The first step in creating this geometry was generating a normalised point set of the logo. We wrote a JavaScript function that plots a series of 2D points, which could then be used as a custom curve for three.js’ TubeBufferGeometry.
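A minimal sketch of such a point generator, using a lemniscate of Gerono as a stand-in figure eight (the production curve was fit to the FWA 100 logo, not this formula):

```javascript
// Generate `count` normalized 2D points tracing a figure eight.
// x = cos(t), y = sin(2t)/2 keeps both axes within -1..1.
function infinityPoints(count) {
  const points = [];
  for (let i = 0; i < count; i++) {
    const t = (i / count) * Math.PI * 2;
    points.push([Math.cos(t), Math.sin(2 * t) / 2]);
  }
  return points;
}

const pts = infinityPoints(128);
```

In three.js, the same parametric formula can back a `Curve` subclass whose `getPoint(t)` feeds TubeBufferGeometry.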


The infinity geometry was then extruded along this curve.


When working with transparent materials in WebGL, you’re likely to encounter depth issues because of the way transparent objects are sorted and blended; they require a different rendering approach from opaque ones. three.js handles this well in most situations, but cases of self-transparency remain.

When rotating the camera we noticed a visual bug with the overlapping tubes of the infinity logo.


To work around this we created two meshes, one rendered with BackSide and the other with FrontSide, as mentioned in this article.

To add some extra sparkle we created a particle system that flows parallel to the infinity geometry.


The user can sway the particles around when they touch the screen.


Post Processing

Post processing plays an important role in the overall art direction. Every render pass in our post-processing stack uses the big triangle technique, an optimization where a single triangle is used instead of a quad. The performance advantage (outlined in this article) might seem small, but any increase is highly beneficial, especially when rendering at higher resolutions.
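The big triangle trick is just three clip-space vertices chosen so one triangle covers the whole screen, avoiding the diagonal seam of a two-triangle quad. A sketch of the buffers involved (attribute layout is an illustrative assumption):

```javascript
// One oversized clip-space triangle: the visible -1..1 region of the
// screen is fully covered, and the overhang is clipped by the GPU.
const positions = new Float32Array([
  -1, -1, // bottom-left corner of the screen
   3, -1, // far off-screen to the right
  -1,  3, // far off-screen above
]);

// UVs chosen so the visible region maps to 0..1
const uvs = new Float32Array([
  0, 0,
  2, 0,
  0, 2,
]);
```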

During key moments of the storytelling we applied a full-screen blur pass, made possible by Jam3’s fast Gaussian blur shader. This let us prioritise which visual elements the user should focus on.


During interaction moments we applied a depth-of-field blur based on three.js’ Bokeh shader.


To optimise this further, we only applied the depth of field at the edges of the screen using a vignette mask. This way we could bypass, for most of the frame, the 41 texture2D calls needed to blur the texture.
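A sketch of such a vignette mask, shown in JS (the production version runs in the fragment shader; the start/end radii here are assumptions):

```javascript
// Returns 0 at the screen center, rising smoothly toward the edges.
// Where the mask is 0 the expensive bokeh taps can be skipped and
// the sharp frame used directly.
function vignette(uv, start = 0.3, end = 0.9) {
  const dx = uv[0] - 0.5;
  const dy = uv[1] - 0.5;
  const dist = Math.hypot(dx, dy);
  // smoothstep(start, end, dist)
  const t = Math.min(Math.max((dist - start) / (end - start), 0), 1);
  return t * t * (3 - 2 * t);
}

const center = vignette([0.5, 0.5]); // fully sharp
const corner = vignette([0, 0]);     // heavily blurred
```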


In our final post-processing pass we apply fast approximate anti-aliasing (FXAA) and noise, since WebGL1 doesn’t support the browser’s built-in anti-aliasing within render targets.

Our final post processing stack looked like:

  • Transition pass (Only active during scene transitions)
  • Blur (Active for storytelling)
  • Depth of field (Active for interactions)
  • FXAA
  • Noise

Optimization

To deliver an optimal experience for the end user, we used detect-gpu to profile the graphics card. We could then configure graphics settings on a tier-based approach.


Textures and models can also be optimised at runtime for the device.


Screen resolutions vary greatly across devices and laptops. three.js lets developers handle their own logic for resizing and setting the size of the WebGL renderer canvas.

In order to maintain a stable frame rate throughout the experience across a variety of browser resolutions we came up with a technique that caps the maximum renderable resolution of the canvas.

Inspired by how game engines scale their renders, we define a maximum resolution for the canvas. If the browser is larger than the maximum resolution, we render at the cap and scale the canvas up to fit.

Since we specify graphics settings per gpu tier, we can set different maximum render sizes.
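The cap reduces to a uniform scale factor applied to the render size while the CSS size tracks the window. A minimal sketch (function name and return shape are illustrative, not Jam3's API):

```javascript
// Render at most maxW x maxH pixels; CSS stretches the canvas to fill
// the browser window. Each GPU tier can supply a different maximum.
function rendererSize(browserW, browserH, maxW, maxH) {
  const scale = Math.min(maxW / browserW, maxH / browserH, 1);
  return {
    renderWidth: Math.round(browserW * scale),
    renderHeight: Math.round(browserH * scale),
    cssWidth: browserW,   // canvas is stretched up to this size
    cssHeight: browserH,
  };
}

// A 3840x2160 window capped at 1920x1080 renders a quarter of the pixels
const capped = rendererSize(3840, 2160, 1920, 1080);
// A window already under the cap renders at native resolution
const native = rendererSize(1280, 720, 1920, 1080);
```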


Pre-rendering scenes

It was important for us to have a seamless experience from start to finish. We noticed frame-rate drops when meshes appeared for the first time during a transition: three.js only uploads geometry buffers, shader programs and textures when a mesh is first rendered.

We came across an approach outlined in this article that was easy to adapt into our framework.

During the loading phase of the website, we render all of the 3D scenes into a render target once, to ensure the GPU has cached the necessary data.

The transition from past to present became a lot smoother.


Final thoughts

Creating an experimental WebGL experience like FWA100 is always a dream project for us at Jam3. With so much freedom, it’s important to figure out the right approach early. We are our own worst critics at the end of the day, so team feedback and reassurance is highly valuable.

The technical approaches we developed on this project will help shape future WebGL-based experiences.

We hope you enjoy the experience as much as we did making it!

If you have any questions, don’t hesitate to reach out. To check out more of our work, go to Jam3.com.

Written by

Jam3 is a design and experience agency that partners with forward-thinking brands from around the world. To learn more, visit us at www.jam3.com
