Understanding the Three.js Transmission example

Franky Hung · Geek Culture · Nov 18, 2021
screenshot of the cute striped bubble demonstrated in the official example

The official examples hosted on threejs.org are good references for most of the visual effects beginners are looking to achieve. But sometimes you might find them hard to grasp, as there aren’t direct explanations of what is going on in the code. I also needed to do ample research before I could understand most of the code in this official example demonstrating the transmission effect through the physically based material.

In this article, I will try to explain the most important concepts or code used in the striped air bubble example:

  • RGBELoader
  • THREE.EquirectangularReflectionMapping and envMap property
  • THREE.ACESFilmicToneMapping
  • THREE.sRGBEncoding
  • the transparent and striped texture

RGBELoader

According to Wikipedia, RGBE is also known as Radiance HDR. RGBE stores images with more colours and detail than the normal 24-bit-per-pixel format can, which means RGBE images have a higher dynamic range.

Upon further research, though I couldn’t find an explicit long form for RGBE, I believe the ‘E’ stands for exponent. RGBE is basically a 32-bit-per-pixel format: the first 24 bits store the 3 bytes of the 3 primary colours, and the last 8 bits store a shared exponent, which is essentially a common scale factor for the colour channels. This format takes advantage of the fact that the magnitudes of the colour channels are usually highly correlated and close, so a shared exponent can support a high dynamic range in a compact storage size without storing a separate exponent for each colour.

The individual channels are jointly scaled by the shared exponent; for the red channel, the decoded linear value is:

R_linear = (R / 256) × 2^(E − 128)

where E is the shared exponent stored in the last 8 bits of the RGBE format, and the same formula applies to G and B by swapping in their bytes for R.

Reference: https://www.cl.cam.ac.uk/~rkm38/pdfs/artusi2017hdr_column.pdf
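To make the decoding concrete, here is a minimal sketch in plain JavaScript of how one RGBE pixel could be turned back into linear RGB. Note that decodeRGBE is a hypothetical helper for illustration only; the real decoding happens inside RGBELoader.

```javascript
// Decode a single RGBE pixel (four bytes [R, G, B, E]) into linear RGB floats.
function decodeRGBE([r, g, b, e]) {
  if (e === 0) return [0, 0, 0];            // convention: exponent 0 means black
  const scale = Math.pow(2, e - 128) / 256; // shared scale factor 2^(E-128) / 256
  return [r * scale, g * scale, b * scale];
}
```

With E = 128 the scale factor is 1/256, so a byte of 128 decodes to 0.5 — mid-grey in linear space.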

With that background knowledge, it should be clear by now why the example uses the RGBELoader to load the background scene: the loaded image is simply an .hdr file.

THREE.EquirectangularReflectionMapping and envMap

The hdr texture loaded by the RGBELoader in the code is an equirectangular image. This hdrEquirect texture is set as the envMap property of the MeshPhysicalMaterial of the striped bubble, and also as the skybox of the scene via this line:

scene.background = hdrEquirect;

Quoting from the official docs on Texture Constants, an equirectangular map is:

“Also called a lat-long map, an equirectangular texture represents a 360-degree view along the horizontal centerline, and a 180-degree view along the vertical axis, with the top and bottom edges of the image corresponding to the north and south poles of a mapped sphere.”

But to make this skybox and environment-reflection magic work, you have to set the mapping property of the hdrEquirect texture to THREE.EquirectangularReflectionMapping, which is one of the mapping-mode constants defined by Three.js. Without this, the skybox would simply be a static display of the whole equirectangular image, and there would be no environment reflection on the bubble.
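Putting the pieces together, the setup can be sketched like this. This is a configuration sketch only: it assumes an existing scene and material, the .hdr filename is hypothetical, and the addon import path may differ between Three.js versions.

```javascript
import * as THREE from 'three';
import { RGBELoader } from 'three/examples/jsm/loaders/RGBELoader.js';

// Load the equirectangular HDR and wire it up as both the skybox
// and the environment map of the bubble's material.
new RGBELoader().load('environment.hdr', (hdrEquirect) => {
  hdrEquirect.mapping = THREE.EquirectangularReflectionMapping;
  scene.background = hdrEquirect;  // skybox
  material.envMap = hdrEquirect;   // environment reflections on the bubble
});
```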

THREE.ACESFilmicToneMapping

In general, tone mapping is a technique used in computer graphics to map HDR color values into LDR when the display device is not capable of showing HDR colors. In Three.js, we can choose from a few tone mapping settings and ACESFilmicToneMapping is just one of them; see the Tone Mapping section on the Renderer page.

Since an HDR image is used as the scene background and the material’s environment map, tone mapping is needed. To be honest, I still don’t understand everything Three.js does behind the scenes, like exactly how this specific ACESFilmicToneMapping transforms the colors loaded from the HDR file, but I can show you a comparison of the scene with tone mapping on and off.
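Three.js implements this curve inside its shaders, but a widely used single-function approximation of the ACES filmic curve (Krzysztof Narkowicz’s fit — not Three.js’s exact code, just an illustration) shows what tone mapping does: it compresses linear HDR values, which can exceed 1.0, into the displayable [0, 1] range:

```javascript
// Narkowicz's ACES filmic approximation, for illustration only.
// Maps a linear HDR value (possibly > 1.0) into the displayable [0, 1] range.
function acesApprox(x) {
  const v = (x * (2.51 * x + 0.03)) / (x * (2.43 * x + 0.59) + 0.14);
  return Math.min(Math.max(v, 0), 1); // clamp to [0, 1]
}
```

Feeding in a very bright value like 10.0 still lands at 1.0 instead of blowing out, while mid-range values are brightened slightly — that gentle roll-off is the “filmic” look.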

ACESFilmicToneMapping is on for the left image, while off for the right

THREE.sRGBEncoding

If you have worked with Three.js long enough, you should come across this line of setting very often:

renderer.outputEncoding = THREE.sRGBEncoding;

This tells the renderer to transform the final color value of each fragment into the sRGB color space. The reason we want the final output in sRGB space is that sRGB is the de facto standard for modern electronic devices, including monitors, cameras, scanners, etc. To get a better understanding of the topic, check out this in-depth article by John Novak, which also helped me in the first place.

Most of the time when we deal with image textures, we also have to set texture.encoding = THREE.sRGBEncoding, because most images online are sRGB-encoded. Setting this explicitly allows Three.js to correctly decode color values of these textures from sRGB space into linear space before doing any further calculations on them. This matters because lighting equations, post-processing and much other shader code assume color values in linear space. For simple projects you might not see the difference, but for larger and more complex scenes you will find that the colors aren’t right if you ignore this sRGB workflow (check out the answer by Mugen87).
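The decode and encode steps themselves follow the standard piecewise sRGB transfer functions. Here is a plain-JavaScript sketch — these mirror the formulas from the sRGB specification, not Three.js’s internal shader code:

```javascript
// sRGB -> linear ("decoding"), per the sRGB specification.
function srgbToLinear(c) {
  return c <= 0.04045 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}

// linear -> sRGB ("encoding"), the inverse transfer function.
function linearToSrgb(c) {
  return c <= 0.0031308 ? c * 12.92 : 1.055 * Math.pow(c, 1 / 2.4) - 0.055;
}
```

Note that an sRGB mid-grey of 0.5 decodes to roughly 0.21 in linear space — this is exactly why lighting math done on raw sRGB values comes out wrong.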

You might wonder, then, why we don’t need to set hdrEquirect.encoding = THREE.sRGBEncoding in this example. That is because things are handled differently for HDR files. According to the migration guide from 130 to 131, the default type of textures loaded with RGBELoader, HDRCubeTextureLoader and EXRLoader is now THREE.HalfFloatType, and texture.encoding is set internally by the RGBELoader, as you can see in the load function of the RGBELoader’s source code.

The striped-transparent texture

To make the bubble in the example have transparent stripes, a repeating black-and-white striped texture is applied to the alphaMap property of the MeshPhysicalMaterial. The generateTexture function generates a square canvas element where only the lower half is painted white. If you simply want to see what the texture looks like, you can create a new MeshBasicMaterial that loads only that texture, and apply this basic material to the sphere instead:

const material2 = new THREE.MeshBasicMaterial({map: texture})

Thus, white in an alphaMap basically means 100% opaque (the transparency in the example is additionally introduced by the transmission and transparent properties on the PBR material), and black means 100% transparent, or invisible.
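To illustrate what such a texture contains, here is a hypothetical helper that builds the same half-white, half-black pattern as raw pixel data instead of a canvas. Note that generateStripeData is not from the example — the real generateTexture paints on a canvas element — but the resulting pixels are equivalent:

```javascript
// Build a size x size RGBA pixel buffer: one half of the rows white (opaque
// in an alphaMap), the other half black (transparent). Repeated vertically
// over the sphere, this yields the alternating stripes.
function generateStripeData(size = 2) {
  const data = new Uint8Array(size * size * 4);
  for (let y = 0; y < size; y++) {
    const v = y < size / 2 ? 255 : 0; // half the rows white, half black
    for (let x = 0; x < size; x++) {
      const i = (y * size + x) * 4;
      data[i] = data[i + 1] = data[i + 2] = v; // R, G, B
      data[i + 3] = 255;                       // the texture's own alpha byte
    }
  }
  return data; // usable with new THREE.DataTexture(data, size, size)
}
```

Setting texture.repeat along with wrapping then tiles this tiny pattern many times around the sphere.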

Conclusion

I spent almost 2 weeks learning all these new concepts before I could even write this down and publish it. I’m sure what I have written here isn’t authoritative in any sense, so I have pasted links to references as I go. Most of the time, I write because it not only helps me consolidate and strengthen my knowledge, but also motivates me to learn cool new stuff that I can share with my audience.

Hope you’ve learnt something new too!

Further Reading on Gamma Correction

  1. https://www.cambridgeincolour.com/tutorials/gamma-correction.htm
  2. https://www.wildlifeinpixels.net/blog/gamma-encoding/


Founder of Arkon Digital. I’m not a code fanatic, but I’m always amazed by what code can do. The endless possibilities in coding are what fascinate me every day.