Extending three.js materials with GLSL

What are three.js materials?

MeshStandardMaterial extended with specular/gloss + instancing + map transformations. See demo.

What do materials do?

Absolutely everything! Almost all rendering operations are tied to a material in some way. A material holds state that three.js's WebGLRenderer uses to set the appropriate WebGL state (whether blending is turned on, for example, or what kind of depth test should be used, if any). It also holds GLSL code (a shader) that actually computes values and draws to the screen. While Material is the only wrapper around shaders, it's not the only thing that interfaces with them.

What do shaders do?

Computation! Lots of it, and in parallel. Shaders are programs written in a shading language (GLSL in the case of WebGL) that run at various stages of the rendering pipeline and execute in parallel on the GPU. WebGL shaders have two stages: vertex and fragment.
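As a minimal illustration (the shader sources here are my own, written as strings you could feed to a THREE.ShaderMaterial), a WebGL program pairs one vertex shader with one fragment shader:

```javascript
//the vertex stage runs once per vertex and must write gl_Position;
//projectionMatrix, modelViewMatrix and position are injected by three.js
const vertexShader = `
  void main() {
    gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
  }
`

//the fragment stage runs once per covered pixel and outputs a color
const fragmentShader = `
  void main() {
    gl_FragColor = vec4( 1.0, 0.0, 0.0, 1.0 ); //opaque red
  }
`
```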

What are we trying to solve?

Say we have a high-level problem: "show a 3D car in the browser." With three.js this becomes almost trivial. If the car is stored in a format that describes a scene graph (like glTF), an artist can set up lights and cameras. If the format supports a compatible three.js material (or vice versa), an artist can set how shiny the car body is, or how dull the tires are.

new SomeLoader().load( 'someModel.someFormat', model => {
  model.traverse( obj => {
    obj.castShadow = true    //note: three.js property names are singular
    obj.receiveShadow = true //(castShadow / receiveShadow, not *Shadows)
  } )
  scene.add( model )
} )

The research

Let's dig into the code a bit. MeshStandardMaterial, like other materials, has a core GLSL description in the form of a template:

#define PHYSICAL                //GLSL
varying vec3 vViewPosition;     //GLSL
#ifndef FLAT_SHADED             //GLSL
  varying vec3 vNormal;         //GLSL
#endif                          //GLSL
#include <common>               //NOT GLSL
#include <uv_pars_vertex>       //NOT GLSL
#include <uv2_pars_vertex>      //NOT GLSL

Each #include <name> line is not GLSL but a three.js directive, expanded at compile time from the THREE.ShaderChunk dictionary. For example, the alphamap_fragment chunk contains:

diffuseColor.a *= texture2D( alphaMap, vUv ).g;
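A sketch of how this template expansion works (simplified from what three.js does internally when building a program; the chunk dictionary here is a tiny stand-in for THREE.ShaderChunk):

```javascript
//a stand-in for THREE.ShaderChunk: chunk name -> GLSL string
const ShaderChunk = {
  common: '#define PI 3.141592653589793',
  alphamap_fragment: 'diffuseColor.a *= texture2D( alphaMap, vUv ).g;'
}

//recursively expand every `#include <name>` directive into real GLSL
function resolveIncludes( source ) {
  return source.replace( /#include <([\w\d]+)>/g, ( match, name ) => {
    const chunk = ShaderChunk[ name ]
    if ( chunk === undefined ) throw new Error( `unknown chunk: ${name}` )
    return resolveIncludes( chunk ) //chunks may include other chunks
  } )
}

const template = '#include <common>\nvoid main() {\n#include <alphamap_fragment>\n}'
const glsl = resolveIncludes( template ) //valid GLSL, ready to compile
```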

The problem

How do we modify the material? Let’s consider the naive approach first:

The copy paste route

The official way of extending a shader can be found in this example:

var myStandardMaterial = new THREE.MeshStandardMaterial()
myStandardMaterial.roughness = 1 //a regular property

var myExtendedStandardMaterial = new MyExtendedStandardMaterial() //a copy-pasted ShaderMaterial
myExtendedStandardMaterial.uniforms.roughness.value = 1 //interface changed

The monkey patch route

Another valid approach, sometimes suggested, is to monkey-patch the THREE.ShaderChunk dictionary. Since it is global and public, one can replace any chunk in it.

//before loading your app
THREE.ShaderChunk.some_chunk = my_chunk

Branching the replacement chunk on a define keeps unmodified materials on the default path:

#ifdef MY_DEFINE
  myLogic()
#else
  //the chunk's original logic
#endif

var myDefaultMaterial = new THREE.MeshStandardMaterial() //unaffected, takes the #else branch
var myModifiedMaterial = new THREE.MeshStandardMaterial()
myModifiedMaterial.defines.MY_DEFINE = '' //triggers our branch
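This works because the renderer prepends each entry of material.defines to the shader source as a #define line; a sketch of that step (roughly what three.js's WebGLProgram does, assuming a plain object of name → value pairs):

```javascript
//turn a material's `defines` object into a GLSL preamble
function generateDefines( defines ) {
  return Object.entries( defines )
    .map( ( [ name, value ] ) => `#define ${name} ${value}` )
    .join( '\n' )
}

const preamble = generateDefines( { MY_DEFINE: '', USE_UV: 1 } )
//the preamble is prepended to the shader source before compilation
```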

The onBeforeCompile route

What if we could tell three.js that when it picks up the template for processing (in order to generate a valid GLSL program), it doesn’t exclusively sample the chunks from that one THREE.ShaderChunk dictionary?

Say my_alphamap_fragment.glsl contains:

diffuseColor.a *= texture2D( alphaMap, vUv * 2. ).g;

var myMaterial = new THREE.MeshStandardMaterial()
myMaterial.onBeforeCompile = shader => {
  shader.fragmentShader = //this is the fragment program string in the template format
    shader.fragmentShader.replace( //we have to transform the string
      '#include <alphamap_fragment>', //we will swap out this chunk
      require('my_alphamap_fragment.glsl') //with our own
    )
}

We can go further and turn the hardcoded 2. into a uniform input:

diffuseColor.a *= texture2D( alphaMap, vUv * myValue ).g;

var myMaterial = new THREE.MeshStandardMaterial()
myMaterial.userData.myValue = { value: 2 } //this will be our input, the system will just reference it
myMaterial.onBeforeCompile = shader => {
  shader.uniforms.myValue = myMaterial.userData.myValue //pass this input by reference

  //prepend the input declaration to the shader (float, since the value is a scalar)
  shader.fragmentShader = 'uniform float myValue;\n' + shader.fragmentShader
  //the rest is the same
  shader.fragmentShader = shader.fragmentShader.replace(
    '#include <alphamap_fragment>',
    require('my_alphamap_fragment.glsl')
  )
}
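Stripped of WebGL, the pattern above is just object and string manipulation, so it can be exercised in isolation (the `shader` and material objects here are mocks standing in for what three.js would pass):

```javascript
//mock of the object three.js hands to onBeforeCompile
const shader = {
  uniforms: {},
  fragmentShader: 'void main() {\n#include <alphamap_fragment>\n}'
}

const myMaterial = { userData: {} } //stand-in for a MeshStandardMaterial
myMaterial.userData.myValue = { value: 2 }

myMaterial.onBeforeCompile = shader => {
  shader.uniforms.myValue = myMaterial.userData.myValue //shared by reference
  shader.fragmentShader =
    'uniform float myValue;\n' +
    shader.fragmentShader.replace(
      '#include <alphamap_fragment>',
      'diffuseColor.a *= texture2D( alphaMap, vUv * myValue ).g;'
    )
}

myMaterial.onBeforeCompile( shader )
myMaterial.userData.myValue.value = 3 //updates the uniform too, no recompile needed
```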

Some code

You can get pretty creative with onBeforeCompile. For example, you can parse the entire shader yourself and then look for patterns on a more granular level than just swapping out chunks.
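For instance (a sketch of my own, not from the article), instead of targeting a whole chunk you can anchor on any GLSL pattern, such as injecting code right after main() begins:

```javascript
//inject a snippet immediately after the opening of main(),
//regardless of which chunks produced the surrounding code
function injectAfterMainBegin( source, snippet ) {
  return source.replace( /void main\(\)\s*{/, match => `${match}\n${snippet}` )
}

const fragment = 'uniform vec3 color;\nvoid main() {\n  gl_FragColor = vec4( color, 1.0 );\n}'
const patched = injectAfterMainBegin(
  fragment,
  '  if ( gl_FragCoord.x < 100.0 ) discard;' //runs before the rest of main()
)
```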

Some gotchas

If you want your modified material to work with shadows, and you've done some kind of additional transformation of the vertices, you need to use Mesh.customDepthMaterial with the corresponding extension, i.e. whatever vertex patch you apply to the mesh's material, you need to apply to this one as well:

myMesh.customDepthMaterial = new THREE.MeshDepthMaterial( { depthPacking: THREE.RGBADepthPacking } )
myMesh.customDepthMaterial.onBeforeCompile = myVertexPatch //the same vertex patch as on the render material

Some thoughts

If it were possible to pass one's own THREE.ShaderChunk dictionary to any THREE.Material, I believe it would be the most flexible solution for working with the chunk system:

const material = new THREE.MeshBasicMaterial()
material.chunks.begin_normal = myChunk
In the meantime, a generic onBeforeCompile can emulate this, reading per-material chunks and uniforms from userData:

const myGenericOnBeforeCompile = function ( shader ) { //a regular function, so .bind() can set `this`
  const { customUniforms, customChunks } = this.userData
  for ( const uName in customUniforms ) {
    shader.uniforms[ uName ] = customUniforms[ uName ]
  }
  for ( const chunkName in customChunks ) {
    //store `vertex` or `fragment` on each chunk to pick the stage
    let shaderStage = customChunks[ chunkName ].vertexStage
    shaderStage = `${shaderStage}Shader` //'vertexShader' or 'fragmentShader'
    shader[ shaderStage ] = shader[ shaderStage ].replace(
      `#include <${chunkName}>`,
      customChunks[ chunkName ].glsl //assumed chunk shape: { vertexStage, glsl }
    )
  }
}
myMaterial.onBeforeCompile = myGenericOnBeforeCompile.bind( myMaterial )
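A self-contained mock run of this pattern (plain objects in place of three.js materials and shaders, and an assumed chunk shape of { stage, glsl }) shows the plumbing working end to end:

```javascript
//generic patcher: reads { customUniforms, customChunks } off `this.userData`;
//a regular function (not an arrow), so .bind() can set `this` to the material
const genericOnBeforeCompile = function ( shader ) {
  const { customUniforms, customChunks } = this.userData
  for ( const uName in customUniforms ) {
    shader.uniforms[ uName ] = customUniforms[ uName ]
  }
  for ( const chunkName in customChunks ) {
    const stage = customChunks[ chunkName ].stage + 'Shader' //'vertexShader'|'fragmentShader'
    shader[ stage ] = shader[ stage ].replace(
      `#include <${chunkName}>`,
      customChunks[ chunkName ].glsl
    )
  }
}

//stand-ins for a material and the shader object the renderer would pass in
const material = {
  userData: {
    customUniforms: { myValue: { value: 2 } },
    customChunks: {
      alphamap_fragment: { stage: 'fragment', glsl: 'diffuseColor.a *= myValue;' }
    }
  }
}
const shader = {
  uniforms: {},
  vertexShader: 'void main() {}',
  fragmentShader: 'void main() {\n#include <alphamap_fragment>\n}'
}

genericOnBeforeCompile.bind( material )( shader )
```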

