GLSL Essentials

By: Jacobo Rodriguez

Overview of this book

Shader programming has been the largest revolution in graphics programming. The OpenGL Shading Language (abbreviated GLSL or GLslang) is a high-level shading language based on the syntax of the C programming language. With GLSL you can execute code on your GPU (aka graphics card), and more sophisticated effects can be achieved with this technique. Therefore, knowing how OpenGL works and how each shader type interacts with the others, as well as how they are integrated into the system, is imperative for graphics programmers. This knowledge is crucial in order to be familiar with the mechanisms for rendering 3D objects. GLSL Essentials is the only book on the market that teaches you about shaders from the very beginning. It shows you how graphics programming has evolved, in order to understand why you need each stage in the Graphics Rendering Pipeline, and how to manage it in a simple but concise way. This book explains how shaders work in a step-by-step manner, with an explanation of how they interact with the application assets at each stage. This book will take you through the graphics pipeline and will describe each section in an interactive and clear way. You will learn how the OpenGL state machine works and all its relevant stages. Vertex shaders, fragment shaders, and geometry shaders will be covered, as well as some use cases and an introduction to the math needed for lighting algorithms or transforms. Generic GPU programming (GPGPU) will also be covered. After reading GLSL Essentials you will be ready to generate any rendering effect you need.

The Graphics Rendering Pipeline

Following the programmable pipeline diagram, I'll briefly describe the modules that the data goes through, to explain how it is transformed at every stage.

Geometry stages (per-vertex operations)

This block of stages focuses on the transformation of vertex data from its initial state (model coordinate system) to its final state (viewport coordinate system):

  • Vertex data: This is the input data for the whole process. Here we feed the pipeline with all the vectorial data of our geometry: vertices, normals, indices, tangents, binormals, texture coordinates, and so on.

  • Textures: When shaders showed up, this new input for the vertex stage became possible. In addition to making our renders colorful, textures can serve as an input in vertex and geometry shaders, for example, to displace vertices according to the values stored in a texture (the displacement mapping technique).

  • Vertex shader: This stage is responsible for the transformation of the vertices from their local coordinate system to the clip space, applying the appropriate transform matrices (model, view, and projection).

  • Geometry shader: New primitives can be generated by this module, taking the outcome of the vertex shader as its input.

  • Clipping: Once the primitive's vertices are in the so-called clipping space, it is easier and computationally cheaper to clip and discard the outer triangles here rather than in any other space.

  • Perspective division: This operation converts our visualization volume (a truncated pyramid, usually called a frustum) into a regular and normalized cube.

  • Viewport transform: The near plane of the clipping volume (the normalized cube) is translated and scaled to the viewport coordinates. This means that the coordinates will be mapped to our viewport (usually our screen or our window).

  • Data is passed to the rasterizer: This is the stage that transforms our vectorial data (the primitive's vertices) to a discrete representation (the framebuffer) to be processed in further steps.
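The vertex-stage bullets above can be sketched as a single vertex shader. The following is a minimal, hypothetical example that combines the two ideas just described: transforming vertices with the model, view, and projection matrices, and sampling a texture in the vertex stage to displace vertices along their normals (displacement mapping). All attribute and uniform names (uModel, uHeightMap, and so on) are illustrative assumptions, not names mandated by OpenGL:

```glsl
#version 330 core

layout(location = 0) in vec3 inPosition;   // model-space vertex position
layout(location = 1) in vec3 inNormal;     // model-space normal
layout(location = 2) in vec2 inTexCoord;   // texture coordinates

uniform mat4 uModel;           // model-to-world transform
uniform mat4 uView;            // world-to-eye transform
uniform mat4 uProjection;      // eye-to-clip transform
uniform sampler2D uHeightMap;  // texture read in the vertex stage
uniform float uDisplacement;   // displacement scale factor

out vec2 vTexCoord;

void main()
{
    // Sample the height map (explicit LOD: implicit derivatives are
    // only available in fragment shaders) and displace along the normal
    float height = textureLod(uHeightMap, inTexCoord, 0.0).r;
    vec3 displaced = inPosition + inNormal * height * uDisplacement;

    // Standard model-view-projection transform into clip space
    gl_Position = uProjection * uView * uModel * vec4(displaced, 1.0);
    vTexCoord = inTexCoord;
}
```

After this shader runs, clipping, perspective division, and the viewport transform operate on the clip-space position it wrote to gl_Position.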

Fragment stages (per-fragment operations)

Here is where our vectorial data has been transformed into discrete data (fragments) by the rasterizer. The stages inside this block control how that discrete data will finally be presented:

  • Fragment shader: This is the stage where texture, colors, and lights are calculated, applied, and combined to form a fragment.

  • Post-fragment processing: This is the stage where blending, depth tests, scissor tests, alpha tests, and so on take place. Fragments are combined, tested, and discarded in this stage, and the ones that finally pass are written to the framebuffer.
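The fragment shader bullet above can be illustrated with a short sketch: a shader that combines a texture sample with simple Lambertian lighting to produce a fragment's color. The input and uniform names are hypothetical, and the shader assumes the vertex stage passed down interpolated texture coordinates and a surface normal:

```glsl
#version 330 core

in vec2 vTexCoord;   // interpolated texture coordinates
in vec3 vNormal;     // interpolated surface normal

uniform sampler2D uDiffuseMap;  // color texture
uniform vec3 uLightDir;         // normalized direction the light travels

out vec4 fragColor;

void main()
{
    // Texture, color, and light are calculated and combined here
    vec3 albedo = texture(uDiffuseMap, vTexCoord).rgb;
    float diffuse = max(dot(normalize(vNormal), -uLightDir), 0.0);

    // The outcome is the fragment's RGBA color
    fragColor = vec4(albedo * diffuse, 1.0);
}
```

The resulting fragment then goes through the post-fragment stages (depth test, blending, and so on) before it can become a pixel.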

External stages

Outside the per-vertex and per-fragment big blocks lies the compute shader stage. This stage can be written to affect any other programmable part of the pipeline.
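As a concrete illustration of the compute stage, here is a minimal compute shader that inverts an image. The local group size, bindings, and image formats are assumptions for the sketch; compute shaders require OpenGL 4.3 or the ARB_compute_shader extension:

```glsl
#version 430 core

// One invocation per texel; the 16x16 local group size is an assumption
layout(local_size_x = 16, local_size_y = 16) in;

layout(rgba8, binding = 0) uniform readonly  image2D uInput;
layout(rgba8, binding = 1) uniform writeonly image2D uOutput;

void main()
{
    ivec2 texel = ivec2(gl_GlobalInvocationID.xy);
    vec4 color = imageLoad(uInput, texel);

    // Invert the color as a trivial image transform
    imageStore(uOutput, texel, vec4(1.0 - color.rgb, color.a));
}
```

Unlike the other shader types, this program is not tied to a pipeline stage: the application dispatches it directly (with glDispatchCompute) over an arbitrary grid of work groups.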

Differences between fixed and programmable designs

It is worth understanding the fixed pipeline, because the programmable pipeline is heavily based on it. Shaders only replace a few well-defined modules that previously existed in a fixed form, so the concept of a "pipeline" has not actually changed very much.

Vertex shaders replace the whole transform and lighting module, so now we have to write a program that performs equivalent tasks. Inside your vertex shader, you can perform whatever calculations your purposes require, but there is a minimum requirement: in order not to break the pipeline, the output of your shader must feed the input of the next module. You achieve this by calculating the vertex position in clipping coordinates and writing it out for the next stage.
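That minimum requirement can be shown in just a few lines. The sketch below does nothing but satisfy it: it transforms the incoming vertex into clipping coordinates and writes it to the built-in gl_Position output. The attribute and uniform names are illustrative assumptions:

```glsl
#version 330 core

layout(location = 0) in vec3 inPosition;  // local (model) coordinates

// Hypothetical combined model-view-projection matrix,
// precomputed by the application
uniform mat4 uModelViewProjection;

void main()
{
    // The minimum requirement: output the vertex position
    // in clipping coordinates so the next stage can consume it
    gl_Position = uModelViewProjection * vec4(inPosition, 1.0);
}
```

Anything else the shader computes (colors, texture coordinates, normals) is optional extra output for later stages; writing gl_Position is the part that keeps the pipeline connected.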

Regarding fragment shaders, they replace the fixed texture stages. In the past, this module determined how a fragment was produced by combining textures in a very limited way. Currently, the final outcome of a fragment shader is a fragment. As implied before, a fragment is a candidate to become a pixel, so, in its simplest form, it is just an RGBA color. To connect the fragment shader with the following pipeline modules, you have to output that color, but you can compute it any way you want.

When your fragment shader produces a color, other data is also associated with it, mainly its raster position and depth, so further tests such as the depth or scissor tests can proceed directly. After all the fragments for a given raster position are processed, the color that remains is what is commonly called a pixel.

Optionally, you can specify two additional modules that did not exist in the fixed pipeline:

  • The geometry shader: This module is placed after the vertex shader, but before clipping happens. The responsibility of this module is to emit new primitives (not vertices!) based on the incoming ones.

  • The compute shader: This is a complementary module. In some ways it is quite different from the other shaders, because it affects the whole pipeline globally. Its main purpose is to provide a method for generic GPGPU (General-Purpose computation on GPUs), which is not very graphics-related. It is like OpenCL, but handier for graphics programmers because it is fully integrated with the entire pipeline. As graphics usage examples, it could be used for image transforms or for deferred rendering in a more efficient way than OpenCL.
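The geometry shader bullet above deserves a short sketch of its own, because its defining feature, emitting primitives rather than vertices, is easy to see in code. The following hypothetical example sits between the vertex shader and clipping and simply re-emits each incoming triangle; a real shader could emit additional primitives at this point:

```glsl
#version 330 core

// Consume whole triangles, produce triangle strips
layout(triangles) in;
layout(triangle_strip, max_vertices = 3) out;

void main()
{
    // Pass the incoming triangle through unchanged; EmitVertex adds a
    // vertex to the current output primitive, EndPrimitive closes it
    for (int i = 0; i < 3; ++i)
    {
        gl_Position = gl_in[i].gl_Position;
        EmitVertex();
    }
    EndPrimitive();
}
```

Note that the output is declared as a primitive type (triangle_strip) with an explicit max_vertices bound: the shader's contract with the pipeline is expressed in primitives, not individual vertices.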