
Basic GLSL Shader Example

Using shaders within Aviatrix3D can be quite simple or complex, depending on your needs. Today, fancy shaders use many different capabilities of the graphics pipeline, including multipass rendering and render-to-texture. For this initial example, we'll concentrate just on getting a simple shader set up and running, without using any of the more complex inputs. Throughout this example we assume you already know how to write shaders using GLSLang and want to know how to integrate them with Aviatrix3D. If you don't know anything about them, then we recommend you grab a copy of the new book by Randi Rost called The OpenGL Shading Language. This book is a complement to the traditional Red Book and Blue Book, is published by the same company, and has an orange cover, hence earning the nickname of The Orange Book.

The "hello world" equivalent for shaders seems to be the basic brick shader, so that is what we'll use here. If you would like to see the complete version of this code, it can be found in the examples/shader directory and is named OBBrickDemo.java, because we've used the same code from The Orange Book.


Setting up the Scene Graph

Shaders are applied to any sort of geometry. Frequently applications will want to set up geometry in a specific way, but for this introductory example, we really don't care about what we are rendering onto. So, to keep the code simple, we're just going to set up a very simple scene graph that puts a quad on screen for the shader to render to. There will be a shader to attach to the quad, but that's the only difference from our basic hello world example code presented elsewhere.
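As a reminder, a minimal sketch of that setup is shown below. It follows the structure of the hello world example; the quad coordinates, variable names and constants are chosen for illustration only, so consult that example and the Javadoc for the exact code your Aviatrix3D version expects.

    // Illustrative only: a single quad with an empty Appearance the shader will attach to
    float[] coords = { -1, -1, 0,   1, -1, 0,   1, 1, 0,   -1, 1, 0 };

    QuadArray geom = new QuadArray();
    geom.setVertices(QuadArray.COORDINATE_3, coords, 4);

    Appearance app = new Appearance();

    Shape3D shape = new Shape3D();
    shape.setGeometry(geom);
    shape.setAppearance(app);
    // ...add the shape to the rest of the scene graph as in the hello world example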

Overview of how OpenGL shaders work

In the OpenGL model of shaders, there are (currently) two types of shaders - vertex and fragment. The two types have different requirements for the source code - the built-in variables that are available, required outputs, etc. In both cases, the shader is provided as strings of text, which the OpenGL driver then compiles and links at runtime. Where those strings come from, OGL does not care. Compiling and linking are explicit steps required by OGL, which allows you to build up collections of partial shaders and combine them into a final working shader. Passing a shader source string to the driver does not automatically invoke a compile or link.

Each set of source strings for a specific shader is known as a Shader Object in OpenGL parlance. This roughly resembles a Texture object - it's a set of basic encapsulated state. An object represents one type of shader - vertex or fragment. In fact, it may represent just one part of a shader - for example, if you have a library of partial effects (e.g. physics functions) that you want to share between multiple complete shading effects.

After you have assembled a number of shader objects, you need to pull them all together into a single shader program. The shader program is what is applied to our geometry. The program must have at least one shader object, but does not require any more than that. A simple shader can be just a fragment shader, or just a vertex shader; you are not required to have one of each in the program.

Once you have the assembled program, you are ready to apply it to geometry. Of course, a shader will probably want some customisability, so you will want to provide external values to the shader to control its behaviour. A shader program is separate from the arguments that can be provided to it. A phong shader, for example, just does the lighting effects, while the arguments tell it which base colours to combine together when rendering. So while the one phong shader is used by two objects, you may want to provide two different sets of arguments - say one is red and the other is blue. Alternatively, you may wish to provide information on a per-vertex basis, rather than per object. In the first case the variables are called uniforms, and in the latter, attributes.

Overview of how Aviatrix3D models GLSLang shaders

We separate each set of functionality into a set of classes that isolates and maps to each capability described above. The goal of our design has been to provide as much flexibility as straight OpenGL allows, while still maintaining scene graph-style objects for reuse and resource sharing.

Because there are at least 3 different ways of defining shaders in an OpenGL environment (GL 1.4 Vertex/Fragment programs, GLSLang shaders for GL 1.5/2.0 and nVidia's Cg), we start with a common base class to represent all programmable shader capabilities at the scene graph level. This class, Shader, provides a single hook that can be registered in the scene graph without needing to pollute the container classes with language-specific requirements. The class itself is empty, as there is nothing common between the languages other than a specific place in the scene graph to put them.

For the GLSLang shader we have a class derived from Shader named, appropriately enough, GLSLangShader, which encapsulates all of the concepts described above. This class holds two other classes that represent the actual shader program (ShaderProgram) and the uniform variables that can be assigned to it (ShaderArguments). A distinction is made between the program and its arguments because we expect the common case to be a single program used to shade many objects, where each object uses different values for the arguments. This structure allows us to keep the number of programs to a minimum, while still providing a lot of customisation for individual uses.
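As a concrete illustration of that sharing, here is a hedged sketch that assumes a shader_prog instance assembled and linked as shown later on this page, and an illustrative uniform named SurfaceColor. Two objects reuse the one program, but each carries its own argument set:

    ShaderArguments red_args = new ShaderArguments();
    red_args.setUniform("SurfaceColor", 3, new float[] { 1, 0, 0 }, 1);

    ShaderArguments blue_args = new ShaderArguments();
    blue_args.setUniform("SurfaceColor", 3, new float[] { 0, 0, 1 }, 1);

    // Both shader instances point at the same ShaderProgram, so only one
    // compiled program exists, but each object renders with its own colour
    GLSLangShader red_shader = new GLSLangShader();
    red_shader.setShaderProgram(shader_prog);
    red_shader.setShaderArguments(red_args);

    GLSLangShader blue_shader = new GLSLangShader();
    blue_shader.setShaderProgram(shader_prog);
    blue_shader.setShaderArguments(blue_args);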

Each program is then made up of a collection of objects that each represent a single set of shader source code. This class is the ShaderObject and wraps a single set of source strings that are compiled to make the program. As required by the underlying GL specification, each program requires at least one object before it can be used. Another requirement of the class is that you nominate what type of object the source strings represent. This is performed at construction time and takes one of the predefined constants that correspond to the equivalent OpenGL types.

Shaders are used to control the visuals of the object, and as such can be found under the Appearance node. A single method, setShader(Shader), allows you to register a generic shader object. This same method is used to register any of the other shader language implementations as well.

Assembling the Shader code

This example assumes that we have a scene graph already in place and would like to use a shader. As mentioned earlier, we are going to make use of the simple brick shader to control a quad. There are two source files for this shader - one each for the vertex and fragment shaders. These need to be loaded into the application, so the first bit of code you need to write reads a file into a String. That's a fairly trivial task and looks like this:
private String loadFile(String name) {
    File file = new File(name);
    if(!file.exists()) {
        System.out.println("Cannot find file " + name);
        return null;
    }

    String ret_val = null;

    try {
        FileReader is = new FileReader(file);
        StringBuffer buf = new StringBuffer();
        char[] read_buf = new char[1024];
        int num_read = 0;

        while((num_read = is.read(read_buf, 0, 1024)) != -1)
            buf.append(read_buf, 0, num_read);

        is.close();

        ret_val = buf.toString();
    } catch(IOException ioe) {
        System.out.println("I/O error " + ioe);
    }

    return ret_val;
}

After loading the file into a string, you need to place it into an appropriate ShaderObject instance. Firstly, you need to construct an instance of the appropriate type, nominating it as either a vertex or fragment shader. Since you also need to load a file to go with it, make sure you load the right file for the type of object.

A requirement of the ShaderObject is that the source comes in the form of an array of Strings. This is because OpenGL allows you to define a shader object with a collection of strings all at once. For example, you may want to load a number of different files and use them as a single object instance. In this example we're only loading a single file, so the following code can be used to load the file and pass it to our object instance for a vertex shader:

    String[] vert_shader_txt = { loadFile(VTX_SHADER_FILE) };

    ShaderObject vert_shader = new ShaderObject(true);   // true indicates a vertex shader
    vert_shader.setSourceStrings(vert_shader_txt, 1);
    vert_shader.compile();
The last line here tells the object that it should compile itself when next possible. Since OpenGL requires a GL context to perform the compilation, there is likely to be some significant time between when you request the compilation and when it actually happens. In particular, if the node is not part of a live scene graph, it will never be compiled under the covers. This is just a strong hint to the code that it should be compiled at the next available opportunity.
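The fragment shader object is created in exactly the same way; the only differences are the constructor flag and the source file loaded. Here FRAG_SHADER_FILE is assumed to be a constant naming the fragment shader source, mirroring VTX_SHADER_FILE above (the demo code may use a different constant name):

    String[] frag_shader_txt = { loadFile(FRAG_SHADER_FILE) };

    ShaderObject frag_shader = new ShaderObject(false);   // false indicates a fragment shader
    frag_shader.setSourceStrings(frag_shader_txt, 1);
    frag_shader.compile();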

Once you have your shader objects created, you need to assemble them into a program object. To do that, register each object instance with the ShaderProgram instance using the addShaderObject() method. With the object instances registered, you will then need to link them together to form the completed shader program that is ready for use with the GL pipeline.

    ShaderProgram shader_prog = new ShaderProgram();
    shader_prog.addShaderObject(vert_shader);
    shader_prog.addShaderObject(frag_shader);
    shader_prog.link();
Next on your list of things to do is to set some arguments for the brick shader. To do this, you need to create an instance of ShaderArguments, as well as the values that you want to pass to the shader in the form of uniform variables. For the majority of cases, uniform values are defined as arrays. Even if you are passing a single float or int value to the shader, Aviatrix3D requires you to create an array. Internally, Aviatrix3D copies the arrays that you have given it for the uniform values, so you're free to reuse a single array instance between calls if that suits your application. For clarity, here we are creating one array per value to be set:
    float[] brick_colour = { 1, 0.3f, 0.2f };
    float[] mortar_colour = { 0.85f, 0.86f, 0.84f };
    float[] brick_size = { 0.3f, 0.15f };
    float[] brick_pct = { 0.9f, 0.85f };
    float[] light_pos = { 0, 0, 4 };

    ShaderArguments shader_args = new ShaderArguments();
Next you need to set these values on the object we've just created. There are a number of overloaded setUniform() methods available. Each of these takes an array instance to copy the data from, the size of the data, and the number of items of that size to copy from the array. The size of the data indicates the dimension of the vector that you are passing to GLSLang. For example, you can use a float, vec2, vec3 or vec4 data type for floating point values, and thus the size value would be 1, 2, 3 or 4, respectively. The count parameter then just specifies how many of these you want to set. For this shader, we are only setting single vector values, so that parameter is always 1.

    shader_args.setUniform("BrickColor", 3, brick_colour, 1);
    shader_args.setUniform("MortarColor", 3, mortar_colour, 1);
    shader_args.setUniform("BrickSize", 2, brick_size, 1);
    shader_args.setUniform("BrickPct", 2, brick_pct, 1);
    shader_args.setUniform("LightPosition", 3, light_pos, 1);
For details on what each of these values represents, please consult the relevant chapter of the Orange Book.

Just as a note, there are specialised versions of setUniform() for dealing with matrix and certain other uniform types. If you are passing these values through to your shader, make sure that you use the appropriate method for the data, otherwise odd things will happen.

All the hard work is now done. Just create a GLSLangShader instance to hold the program and arguments, register it with your Appearance node instance, and your shader will be running.

    GLSLangShader shader = new GLSLangShader();
    shader.setShaderProgram(shader_prog);
    shader.setShaderArguments(shader_args);

    Appearance app = new Appearance();
    app.setShader(shader);
That's it. The basic shader setup is now complete. When running the demo, you should see a brick-textured quad appear on screen.

Interacting with shaders

After getting the basic shaders running, the next task is to interact with them by changing values on the fly. You are already familiar with the basic structure for this: using a scene graph observer, registering node update callbacks, and then sending through the changes at the appropriate time.

Reading log information

When developing shaders, debugging information is crucial. Once you've finished using standalone tools like 3DLabs' GLSLValidator to verify that the basic syntax is fine, you'll need to start using the shaders within the scene graph. Of course you'll find even more errors there, so you'll need to make use of OpenGL's logging capabilities to find out what the drivers think of your code.

Our logging API seems a little odd, but this is driven by the way OpenGL has designed its logging. To understand why, we need to drop back into the OGL world quickly. OpenGL generates logging information that follows the normal state-based rules. The log is provided after something has executed. If nothing changes the log information in a given frame, then it remains available. What this also requires is that we have an OpenGL context to work with for fetching the log. Since the only place that the underlying JOGL API allows us access to a GL context is during the rendering callback, we have to use a deferred logging system. That is, the user asks Aviatrix3D for the most current log, and it will queue up the query for processing the next time we have a rendering callback. The log string is fetched and placed in the requesting shader class for viewing at any time after that. Finally, in keeping with the philosophy of the OpenGL API, we don't delete the log message until the next time the user requests the log for that object.

Because of this requirement forced on us, fetching the log from the shader is a two-frame operation; the first frame queues up the log request, and on the second the string containing the log message is available for display. Requesting that the latest log be fetched is done through the requestLastInfoLog() method, which is available on both ShaderProgram and ShaderObject. This method may be called during the main update callback and would look something like this:

public class MyApp implements SceneUpdateObserver
{
    private ShaderObject vertexShader;

    private boolean needToFetchLog = true;
    private boolean logFetched = false;

    public void updateSceneGraph()
    {
        if(needToFetchLog)
        {
            vertexShader.requestLastInfoLog();
            needToFetchLog = false;
            logFetched = true;
        }
        else if(logFetched)
        {
            System.out.println("Last message: " + vertexShader.getLastInfoLog());
            logFetched = false;
        }
    }
}
Getting the last log that was created is handled through the getLastInfoLog() method, which is also available on both ShaderProgram and ShaderObject. Note how in the example we use a flag to indicate whether we should be requesting the log or whether it was requested last frame. This gives us the one-frame delay described above.

Changing Uniform Variables

Uniform variables can be changed through the normal listener callbacks. Since uniform values just represent data, you need to register a data changed callback when you wish to change a variable. For example, in our brick shader from earlier, we would like to change the brick colour. Firstly, register a callback with the ShaderArguments instance that was originally created, which we have now promoted to a class-level variable.
public class MyApp implements SceneUpdateObserver, NodeUpdateObserver
{
    private ShaderArguments brickArgs;

    public void updateSceneGraph()
    {
        brickArgs.dataChanged(this);
    }
}
In the callback method, updateNodeDataChanges(), just make your changes as though you were setting the value for the first time.
    public void updateNodeDataChanges(Object src)
    {
        float[] new_brick_colour = ....

        brickArgs.setUniform("BrickColor", 3, new_brick_colour, 1);
    }
Now, each frame your brick colour can be updated with new values as needed.

Setting and Changing Attribute Variables

Attribute values are variables that change on a per-vertex basis, compared to uniform values, which change per object. Setting and changing attributes is via methods on VertexGeometry, rather than any of the shader classes. Since these represent data, rather than bounding information, you need to use a data changed callback.

To set or change the vertex attribute values, you make use of one of the forms of setAttributes(). Each method takes an index of the attribute, a size of the attribute data (same definition as for uniform variables), the array of data and a normalisation flag. There are separate overloaded methods for each data type that OpenGL allows - bytes, shorts, ints, floats and doubles. Each of the integer types can also be marked as signed or unsigned. The normalisation flag defines whether OpenGL should normalise the length of each vector in the provided array to a unit length. There's no ability to provide a separate length field, because the length is assumed to match that of the number of currently valid vertices provided.

Determining the correct index value to use comes from the way that the GLSLang interfaces work. OpenGL allows us to tell it what index should be used for a specific attribute name. Binding this index to a name is performed through the bindAttributeName() method on the ShaderProgram class. For example, suppose we have a segment of GLSLang code representing a per-vertex temperature used for colour shading of the output:

attribute vec3 temperature;

varying vec3 temperatureColor;

void main() {
    temperatureColor = temperature;
    gl_Position = ftransform();
}
Now we want to set these values through our geometry:
public class MyApp
{
    private ShaderProgram temperatureShader;
    private VertexGeometry heatedObject;

    ...

    private void setupSceneGraph()
    {
        ...
        temperatureShader.bindAttributeName("temperature", 1);
    }

    private void calculateTemps()
    {
        float[] temps = new float[heatedObject.getValidVertexCount() * 3];

        // set some temperature values here...

        heatedObject.setAttributes(1, 3, temps, false);
    }
}
You only need to bind an attribute name to an index once in the life of the shader; there's no need to do it each time you want to change values. If you need to pass matrix values to the shader, we follow the same process as OpenGL requires - namely, you use 2, 3 or 4 consecutive index values to pass in the individual rows of the 2x2, 3x3 or 4x4 matrix values.
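As a rough sketch of that process, reusing the temperatureShader and heatedObject fields from the example above and assuming a hypothetical per-vertex mat3 attribute named heatTransform starting at an arbitrary index of 4, each row is set at its own consecutive index:

    // Hypothetical mat3 attribute: it occupies three consecutive indices starting at 4
    temperatureShader.bindAttributeName("heatTransform", 4);

    int num_verts = heatedObject.getValidVertexCount();
    float[] row0 = new float[num_verts * 3];
    float[] row1 = new float[num_verts * 3];
    float[] row2 = new float[num_verts * 3];

    // fill each row array with per-vertex values here...

    heatedObject.setAttributes(4, 3, row0, false);
    heatedObject.setAttributes(5, 3, row1, false);
    heatedObject.setAttributes(6, 3, row2, false);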

Gotchas for using Shaders in Scene Graphs

One of the more interesting aspects of how OGL works with shaders is that the compile and link process is asynchronous. When you pass the source strings to the driver to compile and/or link, it does not happen immediately. In a scene graph environment this can cause quite a number of headaches, as one of the fundamental assumptions of a scene graph is that anything added or changed is available immediately. You will need to keep this in mind as you work with more complex examples and how they interact with the scene graph and rendering architecture. We have made some allowances for this, as you'll see in the Javadoc, but it's something to be mindful of when writing large-scale applications.

Another issue to be very careful of is the interaction between the basic geometry, the vertex shader and any view frustum culling. Vertex shaders can, and often do, shift the basic vertices of the geometry. However, the view frustum culling has to work with the raw data from before the vertex shader has messed with it. For example, say you start with a flat grid of points and use the vertex shader to generate a fractal landscape. The implicit bounding box used by the input geometry is almost a flat plane. While that implicit bounding box stays within the view frustum, you'll see everything correctly. However, the moment that implicit bound is entirely outside the view frustum, the whole lot will be removed from further processing, and the shader code will never get executed. That is, your fractal terrain suddenly disappears for no apparent reason. To prevent this from happening, make sure that you set explicit bounds for your geometry that represent the maximal extents that the vertex shader is likely to perturb the vertices to. In this way, nothing gets culled from view while any part of that volume intersects the view frustum.
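A minimal sketch of that workaround is shown below. It assumes a BoundingBox class constructed from minimum and maximum extents and a setBounds() method on the node holding the geometry (terrainShape here stands for the Shape3D containing the flat grid); check the Javadoc for the exact constructor and method signatures before relying on it.

    // Assumed API: give the node explicit bounds large enough to cover the maximum
    // displacement the vertex shader will ever apply (here +/- 50 units vertically)
    float[] min_extent = { -100, -50, -100 };
    float[] max_extent = {  100,  50,  100 };

    BoundingBox terrain_bounds = new BoundingBox(min_extent, max_extent);
    terrainShape.setBounds(terrain_bounds);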

Vertex attributes have a special index of zero. If you set an attribute at index 0, then OpenGL assumes this to be actual vertex data rather than just generic attribute values. If you set both an attribute at index 0 and the vertex values, one of the two will be ignored. Since one of the assumptions of Aviatrix3D's geometry is that you will always set vertex values to validate the amount of data you're setting for everything else, also providing a set of attributes for index 0 will most likely cause problems.