# Shaders: First Steps to GLSL


When I first started working with 3D graphics, I was surprised by shaders. They are a very powerful tool for manipulating 3D objects, but for a long time they looked complicated and hard to get into: I had a lot of questions and no simple answers. After gaining more experience with ThreeJS, I returned to them, and now I have this simple jump-in tutorial.

Let’s take an example and go through the steps to create such a thing. My favourite example is a planet texture for a small game about futuristic space wars. Planets and space bodies of many different types, colours and forms exist in our universe; just imagine the cost of hand-painted textures for such a small game. Of course, our imaginary start-up will look for another way of implementing them, and shaders are the best choice here. You might ask “Why?”, so I cannot hide the answer from you any longer — let’s dive in!

Shaders are written in GLSL, the OpenGL Shading Language, and they are executed on the GPU. There are two types of shaders: vertex and fragment. Vertex shaders work with the vertices of an object and define their position, size and so on, while fragment shaders work with colours, brightness and so on.

I use the ThreeJS library for this example. First, we declare the geometry and material of the sphere.

```javascript
const geometry = new THREE.SphereGeometry(planetSize, 65, 65);
const material = new THREE.ShaderMaterial({
  uniforms,
  vertexShader,
  fragmentShader
});
const sphere = new THREE.Mesh(geometry, material);
scene.add(sphere);
```

First of all, we declared a SphereGeometry. The geometry is created by sweeping and calculating vertices around the Y-axis (horizontal sweep) and the Z-axis (vertical sweep). ShaderMaterial is a material rendered with custom shaders: we supply them through the vertexShader and fragmentShader properties, and WebGLRenderer renders them properly. Finally, we combined the geometry and material into a ThreeJS Mesh and added it to the scene.
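Note that in ThreeJS the shaders themselves are just plain strings handed to ShaderMaterial; WebGL compiles the GLSL at runtime. A minimal sketch of how the `vertexShader` and `fragmentShader` variables from the snippet above could be defined (the variable names match that snippet, the shader bodies are the minimal ones shown later in this article):

```javascript
// Shader sources are plain strings; WebGL compiles the GLSL at runtime.
const vertexShader = `
  void main() {
    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
  }
`;

const fragmentShader = `
  void main() {
    gl_FragColor = vec4(0.9, 0.0, 0.0, 1.0);
  }
`;
```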

The vertex shader works with the positions and displacement of the vertices on the sphere. We also have to handle UV mapping: the 3D-modelling process of projecting a 2D image onto a 3D model’s surface. The simplest vertex shader looks like this:

```glsl
void main() {
  gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}
```

And the simplest fragment shader, which fills every pixel with a flat red:

```glsl
void main() {
  gl_FragColor = vec4(0.9, 0.0, 0.0, 1.0);
}
```

We have a good result: a simple sphere with slow rotation. However, this is too simple. What about UV? How can we see it? Well, let’s update the shaders. In the vertex shader, we declare a varying variable that stores the UV value. The same variable is available in the fragment shader, where simple checks and gl_FragColor let us colour the lines displaying the sphere’s UV grid.

```glsl
varying vec2 vUv;

void main() {
  vUv = uv;
  gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}
```

```glsl
varying vec2 vUv;

void main() {
  if ((fract(vUv.x * 10.0) < 0.02) || (fract(vUv.y * 10.0) < 0.02)) {
    gl_FragColor = vec4(vec3(0.0), 1.0);
  } else {
    gl_FragColor = vec4(1.0);
  }
}
```

As we can see, a connection exists between the shaders: the vUv variable is sent from the vertex shader to the fragment shader. We can also send data from the JS code through the uniforms property of the ShaderMaterial declaration.
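ThreeJS expects each uniform to be wrapped in an object with a `value` field; it uploads the current value to the GPU on every frame, so mutating `value` from the render loop is enough to animate a shader. A small sketch (the uniform names `time` and `radius` are illustrative; `radius` matches the uniform used by the ice-planet fragment shader below):

```javascript
// Each uniform is an object with a `value` field; ThreeJS uploads the
// current value to the GPU on every render.
const uniforms = {
  time:   { value: 0.0 },  // e.g. for animating noise over time
  radius: { value: 20.0 }  // read in GLSL as `uniform float radius;`
};

// Called once per frame from the render loop: advancing `time`
// animates any shader that reads `uniform float time;`.
function tick(deltaSeconds) {
  uniforms.time.value += deltaSeconds;
}

tick(0.016); // one frame at ~60 fps
```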

As the last point of this article, I want to add the code snippets behind the first image, the ice planet. Here you will find the same logic as before, with some additions.

```glsl
varying vec3 vUv;
varying vec4 color;

// Value noise: hash the corners of the lattice cell containing x,
// then interpolate between them along z, y and x.
float noise(vec3 x) {
  vec3 p = floor(x);
  vec3 f = fract(x);
  float n = p.x + p.y*157. + 113.*p.z;
  f = f*f*(3.-2.*f);
  vec4 v1 = fract(753.*sin(n + vec4(0., 1., 157., 158.)));
  vec4 v2 = fract(753.*sin(n + vec4(113., 114., 270., 271.)));
  vec4 v3 = mix(v1, v2, f.z);
  vec2 v4 = mix(v3.xy, v3.zw, f.y);
  return mix(v4.x, v4.y, f.x);
}

void main() {
  color = vec4(49., 49., 49., 1.0) / 255.;
  float b = 5.0 * noise(0.3 * position);
  float displacement = -10. * 0.2 + b;
  vec3 newPosition = position + normal * displacement;
  gl_Position = projectionMatrix * modelViewMatrix * vec4(newPosition, 1.0);
  vUv = (vec4(newPosition, 1.0) * modelViewMatrix).xyz;
}
```

The fragment shader takes the output of the vertex shader and computes the colour and other attributes of each “fragment”: a unit of rendering work affecting at most a single output pixel, with its associated colours, depth value and so on. After these operations, the fragment is sent to the framebuffer for display on the screen.

```glsl
varying vec3 vUv;
varying vec4 color;
uniform float radius;

// Rotation/scale matrix used to decorrelate the noise octaves.
const mat3 m = mat3(0.00, 0.80, 0.60, -0.80, 0.36, -0.48, -0.60, -0.48, 0.64);

// Same value noise as in the vertex shader.
float noise(vec3 x) {
  vec3 p = floor(x);
  vec3 f = fract(x);
  float n = p.x + p.y*157. + 113.*p.z;
  f = f*f*(3.-2.*f);
  vec4 v1 = fract(753.*sin(n + vec4(0., 1., 157., 158.)));
  vec4 v2 = fract(753.*sin(n + vec4(113., 114., 270., 271.)));
  vec4 v3 = mix(v1, v2, f.z);
  vec2 v4 = mix(v3.xy, v3.zw, f.y);
  return mix(v4.x, v4.y, f.x);
}

void main() {
  vec3 src = vUv / radius;
  vec3 colorGradient = mix(vec3(26., 38., 83.)/255.,
                           vec3(111., 163., 181.)/255.,
                           vec3(196., 255., 148.)/255.);

  // Fractal sum: several octaves of noise, rotating and scaling
  // the domain between octaves with the matrix m.
  float f = 0.0;
  vec3 q = 8.0 * src;
  f  = 0.4 + 0.5000*noise(q); q = m*q*2.01;
  f += 0.2500*noise(q); q = m*q*2.02;
  f += 0.1250*noise(q); q = m*q*2.03;
  f += 0.0625*noise(q);

  gl_FragColor = vec4(colorGradient + f, 1.0);
}
```
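To build some intuition for what the noise function computes, here is a rough JavaScript port of the same value-noise hash. This is for experimentation on the CPU only (GPU float precision in `sin` will differ slightly), but it shows the structure: hash the eight corners of a lattice cell, then trilinearly interpolate with a smoothstep-style fade.

```javascript
// Fractional part, matching GLSL fract(): always in [0, 1).
function fract(v) { return v - Math.floor(v); }

// Linear interpolation, matching GLSL mix().
function lerp(a, b, t) { return a + (b - a) * t; }

// JS port of the shader's value noise for a point (x, y, z).
function noise(x, y, z) {
  const px = Math.floor(x), py = Math.floor(y), pz = Math.floor(z);
  let fx = fract(x), fy = fract(y), fz = fract(z);
  // Smoothstep-style fade: f*f*(3 - 2*f).
  fx = fx * fx * (3 - 2 * fx);
  fy = fy * fy * (3 - 2 * fy);
  fz = fz * fz * (3 - 2 * fz);
  // One scalar index per lattice cell, as in the GLSL version.
  const n = px + py * 157 + pz * 113;
  const hash = (o) => fract(753 * Math.sin(n + o));
  // Hashed values at the eight cell corners (two groups of four).
  const v1 = [hash(0), hash(1), hash(157), hash(158)];
  const v2 = [hash(113), hash(114), hash(270), hash(271)];
  // Interpolate along z, then y, then x.
  const v3 = v1.map((a, i) => lerp(a, v2[i], fz));
  const v4 = [lerp(v3[0], v3[2], fy), lerp(v3[1], v3[3], fy)];
  return lerp(v4[0], v4[1], fx);
}
```

Because every corner hash is a `fract(...)` in [0, 1) and the rest is interpolation, the result always stays in [0, 1) — which is why the shader can safely scale it with factors like `5.0` or `0.5000` to control the displacement and colour contrast.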