Behind the scenes of We Cargo

Luigi De Rosa
Oct 28, 2019 · 5 min read

At EPIC we just released We Cargo, a website dedicated to an upcoming conference on the world of cargo, industry & innovation.

The website contains some nice parts, and I thought it would be interesting to share some tech insights.

DOM to WebGL

I don’t know if I would define it as a trend, but more and more websites are mixing WebGL with classic DOM/CSS to have more control over the graphic pipeline. This allows more freedom and creative opportunities that would otherwise be impossible or result in a less performant experience.

We used threejs for the WebGL part, although we’re mainly using it as an abstraction layer. We also used other utilities like bidello. You can check this GitHub repo for more information.

Example of some hover effects

The approach used is fairly simple:

  • Code the page as you normally would (progressive enhancement FTW)
  • Add a full-screen fixed canvas as background
  • Track the position of the DOM elements you want to port into the WebGL world
  • Init the Meshes/Shaders and, once they’re ready, hide the original DOM elements
  • When scrolling, keep the position of DOM elements and WebGL in sync

Here’s some simplified code snippets:


We then create a kapla component. Kapla is a little library we use internally to bridge the gap between the DOM and JS. In this case we’re mainly using it for its MutationObserver implementation.


We then use a little class to register/unregister the dom->gl bindings. See how our data-type attribute loads the Button class.
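The internal class isn’t reproduced here, but such a registry might look like this (a minimal sketch with hypothetical names): classes register themselves by type, and elements carrying a data-type attribute get bound to an instance of the matching class.

```javascript
// Sketch of a dom->gl registry (hypothetical names, not the actual
// internal code). Classes register by type name; elements with a
// data-type attribute are bound to an instance of the matching class.
class GLRegistry {
  constructor() {
    this.classes = new Map();   // type name -> class
    this.instances = new Map(); // element -> live instance
  }

  registerClass(type, Class) {
    this.classes.set(type, Class);
  }

  // el is expected to expose dataset.type, like a DOM element would
  register(el) {
    const Class = this.classes.get(el.dataset.type);
    if (!Class) return null;
    const instance = new Class(el);
    this.instances.set(el, instance);
    return instance;
  }

  unregister(el) {
    const instance = this.instances.get(el);
    if (instance && typeof instance.destroy === 'function') instance.destroy();
    this.instances.delete(el);
  }
}
```

With a registry like this, a MutationObserver callback only has to call register/unregister for added and removed nodes.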


We also have our Button class that extends dom3D, which in turn extends threejs Object3D 🤯

(Notice how the material and the geometry live outside the instance, so that we can leverage some optimisations and reduce WebGL state changes.)
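The pattern in isolation looks like this (a sketch with stub classes standing in for the three.js ones): geometry and material live at module scope, so every instance reuses the same two objects and the renderer can batch them without switching state.

```javascript
// Sketch of the shared-resources pattern. Geometry and Material are stubs
// standing in for the three.js classes; the point is that they are created
// once at module scope and shared by every instance.
class Geometry {}
class Material {}

const sharedGeometry = new Geometry(); // created once per module
const sharedMaterial = new Material();

class Button {
  constructor(el) {
    this.el = el;
    this.geometry = sharedGeometry; // reused, never re-created per instance
    this.material = sharedMaterial;
  }
}
```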


Finally, there’s the dom3D class, which is the parent class used by all the elements. The component() mixin is from bidello , and it basically enhances the class, automatically calling the methods onResize and onRaf when necessary.


The main magic happens inside the updateSize, updatePosition and onRaf functions. Those methods make sure the WebGL element has exactly the same size and position as the DOM element.

The calculateUnitSize on the PerspectiveCamera calculates the width and height (in world units) that an element at position vec3(0, 0, 0) needs in order to completely fill the camera’s view.
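calculateUnitSize isn’t a stock three.js method, but it presumably boils down to the standard perspective-frustum math: the visible height at a given distance is 2 · tan(fov / 2) · distance, and the width follows from the aspect ratio.

```javascript
// Visible world-unit size of the viewport at a given distance from a
// perspective camera. fov is in degrees, as in three.js PerspectiveCamera.
function calculateUnitSize(fovDegrees, aspect, distance) {
  const fovRadians = (fovDegrees * Math.PI) / 180;
  const height = 2 * Math.tan(fovRadians / 2) * distance;
  return { width: height * aspect, height };
}
```

For example, a camera with a 90° fov looking at a plane 1 unit away sees a 2-unit-tall slice of the world, so a full-viewport-height DOM element maps to a 2-unit-tall mesh.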

We also built some utilities for loading and caching textures, plus things like a background-size: cover implementation in GLSL.
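The actual utility lives in GLSL, but the cover math is the same in any language; here it is in plain JS as a sketch: scale the UVs around the center so the texture fills the plane while keeping its aspect ratio, cropping whichever axis overflows.

```javascript
// background-size: cover as UV math (JS sketch of the GLSL logic).
// planeAspect / textureAspect are width-over-height ratios.
function coverUv(u, v, planeAspect, textureAspect) {
  let scaleX = 1;
  let scaleY = 1;
  if (planeAspect > textureAspect) {
    // plane is wider than the texture: crop top and bottom
    scaleY = textureAspect / planeAspect;
  } else {
    // plane is taller than the texture: crop left and right
    scaleX = planeAspect / textureAspect;
  }
  // scale around the center (0.5, 0.5) so the crop stays symmetric
  return {
    u: (u - 0.5) * scaleX + 0.5,
    v: (v - 0.5) * scaleY + 0.5,
  };
}
```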

The overall technique seems to work quite well. The built-in threejs frustum culling does its job, and performance is OK.

One problem though: if frames drop during scrolling, you’ll notice that the DOM elements keep scrolling smoothly while the WebGL ones don’t. That’s because when frames drop, the browser tends to prioritise its own UI over JS execution. We “solved” this with a “virtual scroll”, so we can make sure the two parts are always in sync.
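A “virtual scroll” usually means intercepting the wheel/touch input yourself and driving both the page transform and the WebGL offset from the same eased value, so the two can never diverge. A minimal sketch (illustrative names, not the production code):

```javascript
// One eased scroll value feeds both the DOM transform and the WebGL
// meshes, so a dropped frame delays both sides equally.
function lerp(a, b, t) {
  return a + (b - a) * t;
}

class VirtualScroll {
  constructor() {
    this.target = 0;  // raw accumulated wheel delta
    this.current = 0; // eased value consumed by DOM and WebGL alike
  }

  onWheel(deltaY) {
    this.target += deltaY;
  }

  // called once per requestAnimationFrame tick
  update(ease = 0.1) {
    this.current = lerp(this.current, this.target, ease);
    return this.current;
  }
}
```

Each frame, the same `current` value is applied as a CSS transform on the page container and as an offset on the WebGL scene.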

Even though this technique has its own limitations and accessibility issues, it opens up a whole universe of creative possibilities. We might even build a more robust and reusable solution in the future.

🎥 Here’s only the WebGL part with a free perspective camera

How cool would it be if browsers exposed more low-level APIs over their internal graphics pipeline?


The whole brand identity of the event presents a lot of “waves/glitches/datamoshing” effects. We wanted to animate those lines as a background instead of just using a static image.


Here’s the solution we’ve ended up using:

In Photoshop, the Filter->Stylize->Wind->Blast filter produces a very similar effect, so, starting from a linear UV sampling, we can apply it and end up with this texture:

Final UV sampler map

Then we use this UV texture to look up a static texture.
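The lookup itself is a one-liner of indirection in GLSL, roughly `gl_FragColor = texture2D(uImage, texture2D(uUvMap, vUv).rg);` (uniform names assumed). The same idea, sketched in JS on 1D arrays standing in for textures:

```javascript
// Sketch of UV indirection: instead of sampling the image directly,
// first sample a "UV map" and use its value as the lookup coordinate.
// Arrays stand in for textures; indices stand in for UVs.
function applyUvMap(image, uvMap) {
  return uvMap.map((index) => image[index]);
}
```

Distorting the UV map (here, repeating or shuffling indices) distorts the image without ever touching the image texture itself — which is exactly why the Wind/Blast texture above can drive the whole effect.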

To animate it, instead of using this static UV texture, we use a framebuffer object. As this interactive background can appear in multiple parts of a page, the FBO (frame buffer object) is shared among them.

You can check the source code for the FBO helper here.

And here’s the fragment shader.

The technique is very similar to GPU pixel sorting.

The final effect + mouse trail and postprocessing


The particle trails on background

We create 256 points with random positions, then we store them in a 256x1 framebuffer, where they are animated. The RGB values represent the 3D position of each particle and are stored in a floatType texture. The y position (the g channel) is incremented, and when it exceeds 5.0 it wraps back to -5.0, and so on.
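Per particle, that update amounts to the following (written in JS for readability; in the real thing it runs in the simulation fragment shader):

```javascript
// Per-particle y update: drift upward and wrap from 5.0 back to -5.0,
// so particles stream endlessly through a 10-unit-tall band.
function updateParticleY(y, speed) {
  let next = y + speed;
  if (next > 5.0) next -= 10.0;
  return next;
}
```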

The FBO looks something like this

Then we draw the actual points with Points (which is basically just a Mesh drawn with gl.POINTS). Instead of drawing directly to the screen, we draw them into a 512x512 FBO.

We have points moving, but now we need the trails. We create another FBO that acts as a buffer: each frame it adds the new frame on top and reduces the opacity of the previous one.
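Per pixel, the feedback buffer is essentially `trail = previousTrail * decay + currentFrame`; a sketch on a single channel (the decay value is an assumption, not the one used in production):

```javascript
// Feedback/trail accumulation on one channel: fade the previous buffer
// by a decay factor, add the new frame, clamp like a texture would.
function accumulate(previous, current, decay = 0.95) {
  return Math.min(previous * decay + current, 1.0);
}
```

Values close to 1.0 give long, smeary trails; lower values make the afterimage die off quickly.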

Finally we can display our FBO. Note how the texture is just 512x512, but since we’re drawing it full screen with LinearFilter, the result still looks good.


Before rendering the WebGL scene on the screen, a postprocessing pass is applied.

Here are the few effects used:

  • Glitch effect (initially based on this codepen)
  • A curve displacement based on scroll speed
  • Chromatic aberration based on mouse trail
  • Film grain

Mouse trail

The mouse trail is used for the chromatic aberration and displacement in postprocessing, but it also enhances the background waves we saw earlier.

The 256x256 FBO mouse trail

Here’s how the code looks like for the mouse trail:

A circle is drawn at the mouse position, with an intensity based on the mouse speed. The trail then fades out over time.
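The brush itself can be sketched as a per-pixel falloff from the mouse position, scaled by the mouse speed (illustrative names; the real version runs in a shader):

```javascript
function clamp01(x) {
  return Math.min(Math.max(x, 0), 1);
}

// distance: pixel distance from the mouse
// radius: brush size in the same units
// speed: mouse speed, pre-mapped into [0, 1]
function brush(distance, radius, speed) {
  const falloff = clamp01(1 - distance / radius); // 1 at center, 0 at edge
  return falloff * clamp01(speed);
}
```

The fade-out over time is then the same feedback trick as the particle trails: each frame the trail buffer is multiplied by a decay factor before the new brush stroke is added.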

Here’s how the FBO is used for the RGB offset:

Page transitions

For the page transitions we used barba.js, with a transition that combines CSS clipping with the WebGL glitch used in postprocessing.

Page transition



Luigi De Rosa

Written by

code at @epicwebagency. Creative development, animations, JavaScript and WebGL.


EPIC is a Belgian digital agency made of passion and sorcery that proudly delivers bewitching projects since 2009.
