VR Diaries 0001: The Shadow of Shaders

A gentle summary of what I have learned during the last few weeks about rendering, GPGPU and the demoscene.

Conceptualizing Unity as my VR Middle-earth, I have spent the last three weeks in Mordor, aka Shader-land.

A shader is a program designed to be run on a renderer as part of the rendering operation. Regardless of the kind of rendering system in use, shaders can only be executed at certain points in that rendering process… Shaders for OpenGL are run on the actual rendering hardware…. This can often free up valuable CPU time for other tasks, or simply perform operations that would be difficult if not impossible without the flexibility of executing arbitrary code. A downside of this is that they must live within certain limits that CPU code would not have to.
OpenGL

BTW, please be aware: I am just an urban data explorer, not a developer :) (I will create a proper logo for this notification.)

Part 1: The Fellowship of the Rendered

SHADER LAND

\chapter{There and back again}

My vain ambitions mistakenly led me to deal with Shaders. Exiled from the peaceful lands where scripting objects into the scene is all you can do, Shaders work buried deep down in your GPU, in charge of instructing your graphics processor to render specific packages of data.

At the moment, the easiest way to program a Shader is a weird language called HLSL (High-Level Shading Language). As you might already have imagined, this kind of sorcery is hard to find across the internet, and only a few manuscripts or tutorials are available for new apprentices.

Before walking into these unexplored lands, I had to study some basics first: the rendering pipeline.

Rendering Pipeline (NTU)

For every shiny 3D object that lives on your screen, the graphics processor needs to deal with a process called rendering. During this procedure, geometry information is translated into screen coordinates and, finally, into a 2D color array that stores each pixel's color value.
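To make this a bit more concrete, here is a minimal sketch of a Unity shader in HLSL/Cg (the shader name and the hard-coded orange are just placeholders, not code from any of my projects). The vertex function moves geometry from object space to clip (screen) space; the fragment function then decides the color of every pixel the geometry covers.

```hlsl
Shader "Unlit/MinimalPipeline"   // illustrative name
{
    SubShader
    {
        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            struct appdata { float4 vertex : POSITION; };
            struct v2f     { float4 pos : SV_POSITION; };

            // vertex stage: object-space position -> clip-space position
            v2f vert (appdata v)
            {
                v2f o;
                o.pos = UnityObjectToClipPos(v.vertex);
                return o;
            }

            // fragment stage: one color value per covered pixel
            fixed4 frag (v2f i) : SV_Target
            {
                return fixed4(1.0, 0.5, 0.0, 1.0); // constant orange
            }
            ENDCG
        }
    }
}
```

Everything between these two stages (rasterization, interpolation) stays fixed-function; the vertex and fragment programs are the two points of the process where your code actually runs.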

Taking a cube as an example, the GPU requires the coordinates of its 8 vertices and the triangles that will form each of its surfaces. The declaration order of each triangle's vertices follows the right-hand thumb rule, controlling the direction that the rendered surface will face.

Declaration of the triangles for the front-facing surface of the cube; the triangle strip for this purpose will be ABCD. (Source of the image and helpful material by Dhaval Rathod)

Another necessary property is the UVs, an array of 2D coordinates that controls how the texture will be applied, or wrapped, around the object.
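The UVs come into play at the fragment stage: the rasterizer interpolates them across each triangle, and the fragment function uses them to look up the texture. A tiny sketch, reusing the conventional Unity _MainTex property name:

```hlsl
sampler2D _MainTex;   // the texture wrapped around the object

struct v2f
{
    float4 pos : SV_POSITION;
    float2 uv  : TEXCOORD0;   // interpolated UV coordinate for this pixel
};

// fragment stage: look up the texture color at this pixel's UV
fixed4 frag (v2f i) : SV_Target
{
    return tex2D(_MainTex, i.uv);
}
```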

Of course your fancy GTA VIIII scene doesn't rely only on these simple steps, and a lot of optimization is needed to produce the final composition. Additional passes let the GPU compute the level of detail of each object according to its distance from the camera, as well as reflections, shadows, lighting, transparency and the texturing of more complex materials. And if you actually want to know more about the GTA, Doom and Deus Ex rendering pipelines, you might find Adrian Courrèges' blog interesting.

Interestingly enough, Shaders can be used not only for visualizing meshes but also for controlling their animation. Procedural tricks like this gave life to the demoscene in the early days of computer graphics: entire music videos could be produced by hard-coding the mesh animations into executables just a few tens of kilobytes in size (the classic "64k intro" category caps the whole file at 64 KB).

\chapter{Footsteps in the dark}

By unlocking the power of the GPU, it is possible to run complex calculations in less time. By combining fragment (aka pixel) and compute Shaders, these calculations can target anything from image recognition to data analysis problems.

While the use of Shaders might significantly improve your processing time, they were not designed for general-purpose work (general-purpose computing on graphics processing units, or GPGPU, is an entire domain of study), and the information they can handle is limited to specific types and sizes of data structures.
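To give a taste of the compute side, here is a hypothetical minimal Unity compute shader; the kernel and buffer names are made up. It squares every value in a buffer, one GPU thread per element:

```hlsl
// Square.compute -- hypothetical example kernel
#pragma kernel Square

RWStructuredBuffer<float> _Values;   // data uploaded from the CPU

[numthreads(64, 1, 1)]               // 64 threads per thread group
void Square (uint3 id : SV_DispatchThreadID)
{
    // each GPU thread handles exactly one element of the buffer
    _Values[id.x] = _Values[id.x] * _Values[id.x];
}
```

On the C# side you would fill a ComputeBuffer and call Dispatch; the limitation above shows up as the fixed, explicitly declared layout of that buffer.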

One way to achieve mesh movement with a Shader is to remap the key frames of a vertex animation into a vector4 color array (RGB for the position and alpha for the time). The animation is then produced by interpolating the values from one key frame to the next for each vertex. The final result: millions of points can be rendered in real time while the frame rate remains high.

We use a texture to store the animation data because the amount of data required for all animation frames is too large to fit into constant memory.
GPU Gems
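Here is a minimal sketch of that idea in Unity-style HLSL, assuming a texture laid out with one column per vertex and one row per key frame; all property names are illustrative, not the actual GPU Gems code:

```hlsl
#pragma vertex vert
#pragma fragment frag
#pragma target 3.5            // SV_VertexID needs shader model 3.5+
#include "UnityCG.cginc"

sampler2D _AnimTex;           // width = vertices, height = key frames
float _VertexCount;           // number of animated vertices
float _Frames;                // number of key frames stored in the texture
float _T;                     // normalized animation time in [0, 1]

struct v2f { float4 pos : SV_POSITION; };

v2f vert (uint id : SV_VertexID)
{
    float u     = (id + 0.5) / _VertexCount;       // this vertex's column
    float frame = _T * (_Frames - 1);              // fractional key frame
    float rowA  = (floor(frame) + 0.5) / _Frames;  // current key frame row
    float rowB  = (ceil(frame)  + 0.5) / _Frames;  // next key frame row

    // the RGB channels hold the encoded position at each key frame
    float3 p0 = tex2Dlod(_AnimTex, float4(u, rowA, 0, 0)).rgb;
    float3 p1 = tex2Dlod(_AnimTex, float4(u, rowB, 0, 0)).rgb;

    v2f o;
    // interpolate between the two key frames, then project to clip space
    o.pos = UnityObjectToClipPos(float4(lerp(p0, p1, frac(frame)), 1.0));
    return o;
}

fixed4 frag (v2f i) : SV_Target { return fixed4(1, 1, 1, 1); }
```

A script only has to update _T once per frame; both key frame fetches and the blend between them happen per vertex, entirely on the GPU.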

So back to Minas Tirith for research.

What looking for Shader tutorials feels like.

Reading the old scrolls, I was unravelling a story of hidden power. A great introduction by Patricio Gonzalez Vivo and Jen Lowe is the website The Book of Shaders. Furthermore, with the help of a YouTube tutorial series by World of Zero, I was able to create my first animated point cloud shader.

\chapter{Finally we are moving}

Keeping in mind that my main goal was to visualize urban data using Shaders, I remembered that I had already used a similar tool, a library called Deck.gl. This handy library, developed by Uber, is able to feed geolocated data from JSON files directly to your shader. In this way, more than 900k Uber trips can be visualized on your screen in real time (for Deck.gl, visit this tutorial in Noah's Mad Lab).

Uber’s Deck.gl

Following some advice from Yi Fei Boon and Jonah Norberg, I produced a quick prototype using Processing (code available here). In the figure below you can see the animation in action. The texture's width equals the number of individual agents, while its height represents the animation frames. A simple lookup updates the positions of the agents, reading the texture row by row at a frameCount % height rate (see the sketch after the figure).

Reading position from texture. Unity (left), fast prototyping in Processing (right).
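In the same HLSL terms as the earlier sketch, the per-frame lookup boils down to a modulo for the row and one column per agent (again, hypothetical names; my actual prototype is the Processing code linked above):

```hlsl
sampler2D _AnimTex;   // width = agents, height = animation frames
float _Frame;         // current frame counter, set from a script
float _Frames;        // texture height (number of animation frames)
float _AgentCount;    // texture width (number of agents)

// inside the vertex function, for a vertex with index agentId:
float row = (fmod(_Frame, _Frames) + 0.5) / _Frames;   // frameCount % height
float col = (agentId + 0.5) / _AgentCount;             // one column per agent
float3 agentPos = tex2Dlod(_AnimTex, float4(col, row, 0, 0)).rgb;
```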

I am leaving you with the trips_to_texture encoding image that I used to create my vertex animation.

After two weeks of dealing with Shaders.

\section{YES BUT WHY?}

Urban stuff coming...

With the latest increase in available urban data, it is important for us to explore new (or old) tool sets that can produce more efficient visualisations. Our ultimate goal is to make it possible for any user to observe and interact with millions of discrete ;) spatial units in real time.

and always stay #.