# Into Vertex Shaders part 1: The Spaces of WebGL

Jun 18, 2017 · 4 min read

This is the first in a series of articles about advanced WebGL animation using Three.js and Three.bas, my extension for complex and highly performant animation systems.

When I started working in 3D graphics development, one of the initial hurdles I had to overcome was understanding how the different coordinate systems throughout the 3D graphics pipeline fit together.

This brief overview will disambiguate these spaces, helping you not to get lost along the way.

# Pixel Coordinates

The first coordinate system is the one we are all most familiar with: pixel coordinates. It starts in the top left corner of the screen at `{x: 0, y: 0}`, and moves right and down to `{x: viewportWidth, y: viewportHeight}`.

While familiar, you will rarely be working with pixels directly. In fact, conversion to actual pixels on screen is the very last step in the 3D graphics pipeline. If you want a 3D object to show up at a specific location on screen, you will need to use trigonometry to calculate the corresponding position in 3D space first. I may come back to this in a later post, but for now let’s forget that pixels exist at all.

# Normalized Device Coordinates

The second coordinate system is called Normalized Device Coordinates, or NDC for short. It starts in the center of the screen at `{x: 0, y: 0}`, with the x axis moving right and the y axis moving up. The range of values on both axes is -1 to 1. NDC is the same for all screens; whether you are using a tiny phone or an oversized flat screen, the values will not change.

If you have had to add pointer interaction to a WebGL project, you probably used a method like this:

```javascript
window.addEventListener('mousemove', function(e) {
  var x = (e.clientX / window.innerWidth * 2) - 1;
  var y = (e.clientY / window.innerHeight * -2) + 1;

  // something awesome
});
```

This converts the position of the pointer on screen from pixel coordinates to NDC. You need to do this because NDC is an integral part of the math behind 3D graphics.
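The same conversion can be written as a small standalone helper (the function name here is my own, not part of any library):

```javascript
// Convert pixel coordinates to Normalized Device Coordinates.
// (clientX, clientY) is the pointer position in pixels;
// width and height are the viewport dimensions.
function pixelToNdc(clientX, clientY, width, height) {
  return {
    x: (clientX / width) * 2 - 1,   // 0..width  ->  -1..1 (left to right)
    y: (clientY / height) * -2 + 1  // 0..height ->   1..-1 (y axis flips: down becomes up)
  };
}

// The center of an 800x600 viewport maps to the NDC origin.
console.log(pixelToNdc(400, 300, 800, 600)); // { x: 0, y: 0 }
```

These NDC values are exactly what APIs like `THREE.Raycaster.setFromCamera` expect for the pointer position.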

Normalized Device Coordinates are also the final output space for shaders (which we will get to in the next post). Therefore, when WebGL developers refer to screen space, they mean this coordinate space, not pixels.
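The conversion from NDC back to actual pixels (the very last step in the pipeline mentioned earlier) is just the inverse linear remap. A sketch in plain JavaScript, with a hypothetical function name:

```javascript
// Map NDC (-1..1 on both axes, y up) back to pixel coordinates (y down).
// This mirrors the viewport transform the GPU applies as its final step.
function ndcToPixel(ndcX, ndcY, width, height) {
  return {
    x: (ndcX + 1) / 2 * width,
    y: (1 - ndcY) / 2 * height
  };
}

// The NDC origin lands in the center of the screen.
console.log(ndcToPixel(0, 0, 800, 600)); // { x: 400, y: 300 }
```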

# 3D Coordinates

The third coordinate system is the actual 3D coordinates, often called world space or scene space.

By convention, red is used for the x axis, green is used for the y axis, and blue is used for the z axis. The color channels correspond to the axes; RGB is XYZ.

It starts in the center at `{x: 0, y: 0, z: 0}`. In Three.js, the x axis moves to the right (with negative values moving left), the y axis moves up (with negative values moving down), and the z axis moves towards the camera (with negative values moving away). This is called a right-handed coordinate system. While common, this orientation is by no means standard. If you are importing models from 3D software, the orientation may be different. You then need to reorient the model by rotating it around one or two axes.
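As an illustration, much 3D software (Blender and 3ds Max, for instance) uses a Z-up convention. Converting a point to Three.js's Y-up space amounts to a -90° rotation around the x axis, which is also why you often see `model.rotation.x = -Math.PI / 2` applied to imported models. The per-point remap looks like this (a sketch; the function name is my own):

```javascript
// Remap a point from a Z-up, right-handed space to Three.js's
// Y-up, right-handed space. A -90 degree rotation around the x axis
// sends +z (the old up direction) to +y (the new up direction).
function zUpToYUp(p) {
  return { x: p.x, y: p.z, z: -p.y };
}

// "Up" in the source space becomes "up" in Three.js space.
console.log(zUpToYUp({ x: 0, y: 0, z: 1 })); // { x: 0, y: 1, z: 0 }
```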

Other than orientation, you need to consider scale. If, for example, you create a 1 by 1 by 1 cube, what does that really mean?

The values along the axes in 3D space are essentially unitless; they can be whatever you want them to be. It is advisable to consider the scale of your world before you start building it. If you are working on human scale (cars, buildings), a 3D unit may represent a meter. If you are working with landscapes, it may represent a kilometer, and so on.

Being mindful of scale, and choosing and consistently applying the right dimensions will save you a lot of headaches down the road. If you don’t, you may find yourself having to work with values that are too big or too small to be practical. Even worse, you may have to go back and change a bunch of numbers throughout your code when things finally stop making sense.
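One simple way to stay consistent is to define the unit once and express every dimension in terms of it (the constant names here are hypothetical):

```javascript
// Decide up front: one world unit is one meter.
var METER = 1;
var KILOMETER = 1000 * METER;

// Every dimension in the scene is then stated in real-world terms,
// and can be rescaled later by changing a single constant.
var carLength = 4.5 * METER;
var mountainHeight = 2 * KILOMETER;

console.log(carLength, mountainHeight); // 4.5 2000
```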

# UV Space

Finally, I want to mention UV coordinates, used to map textures to 3D models. This coordinate system starts in the bottom left corner at `{u: 0, v: 0}`, and moves right and up to `{u: 1.0, v: 1.0}`.

If you are working with models generated in 3D software, or any of the Three.js primitives, UV coordinates will be generated for you. As such, you will not have to interact with them in code all that often. They may, however, show up if you are working with post-processing, where textures are used to transfer data between subsequent effects.
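One place the bottom-left origin bites is when sampling image data by hand: v runs bottom-to-top, while image rows are typically stored top-to-bottom, so v has to be flipped (Three.js handles this for you via its default `flipY` texture setting). A sketch of that lookup, with a hypothetical function name:

```javascript
// Map UV coordinates (origin bottom-left, range 0..1) to texel indices
// in an image whose rows are stored top-to-bottom.
function uvToTexel(u, v, texWidth, texHeight) {
  return {
    x: Math.min(texWidth - 1, Math.floor(u * texWidth)),
    y: Math.min(texHeight - 1, Math.floor((1 - v) * texHeight))
  };
}

// {u: 0, v: 0} is the bottom-left corner: the LAST row of the image data.
console.log(uvToTexel(0, 0, 256, 256)); // { x: 0, y: 255 }
```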

As mentioned before, Normalized Device Coordinates represent the output space for shaders. 3D coordinates (alongside UV coordinates and a whole bunch of other data) represent the input. In the next post, we will take a detailed look at how the conversion between the two happens.
