Texturing Basics Example

Textures are a core part of every graphics application. They can be used to provide direct visual effects, indirect effects (such as environment mapping), or even inputs to programmable shaders. Thus, it is important to be able to use textures efficiently in any graphics API and to understand the myriad of options available. Aviatrix3D provides such an extensive range of texturing capabilities that it can't all be covered in a single tutorial. This one is aimed at introducing just the basic concepts, while the advanced capabilities are given their own treatment. If you would like to see the complete version of this code, it can be found in the examples/basic directory and is named TextureDemo.java.


Setting up the application

The texture demo starts with just a slight variation of the Hello World demo. Instead of a triangle for the geometry, we will create a square so that you can see the complete texture.
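For reference, here is a minimal sketch of that square's geometry, including the texture coordinates that the rest of this tutorial relies on. It assumes the QuadArray class and the coordinate constants from VertexGeometry behave as in the Hello World demo, so treat it as illustrative rather than the demo's exact code:

// Four vertices of a flat square facing the viewer
float[] coords =
{
    -0.25f, -0.25f, 0,     0.25f, -0.25f, 0,
     0.25f,  0.25f, 0,    -0.25f,  0.25f, 0
};

// One set of 2D texture coordinates mapping the full image onto the quad
float[][] tex_coords = { { 0, 0,   1, 0,   1, 1,   0, 1 } };
int[] tex_types = { VertexGeometry.TEXTURE_COORDINATE_2 };

QuadArray geom = new QuadArray();
geom.setVertices(VertexGeometry.COORDINATE_3, coords, 4);
geom.setTextureCoordinates(tex_types, tex_coords, 1);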

101 Varieties of Textures

One of the nice things about texturing is that there are so many options to choose from. At the C level, OpenGL provides just one way of handing texture data to the rendering pipeline - through an array of ints, floats or bytes. At the Java level, there are many different ways of providing texture data, based on how it was loaded. A user could load an image through the ImageIO libraries, URL content handlers or the AWT Toolkit.createImage() calls, or even dynamically generate one and place it in an array of bytes. The one thing that you can count on is that the way you want to load image data is not the one that an API provides.

With Aviatrix3D, we've tried to cover as many bases as possible. One of the things that frustrated us most with Java3D was that textures were required to use the Java Image system. A lot of the time we wanted to generate dynamic textures, and the overheads associated with the AWT image system were prohibitive - particularly for offscreen rendering techniques. At the same time, we wanted a consistent set of APIs regardless of how you created the texture or where you are using it.

Textures are divided into two sets of classes - data holders, which hold the source for the pixels, and rendering management objects. By making this separation we can separately control how to provide the raw pixels, yet keep a single set of objects that are responsible for rendering. Classes that hold pixel data extend the TextureComponent object, while the renderable objects are derived from Texture.

For each texture, you have the option of defining 1D, 2D or 3D textures, along with cubic environment maps and offscreen (pBuffer) textures. For the source data, you can provide it as either Java AWT BufferedImages or as a raw byte array. The extensibility of the system would allow you to define other types if you desire, too; these are just the stock options available. The names of the TextureComponent-derived classes make it easy to tell what type they handle: ByteTextureComponent1D defines a 1D texture source that provides its data as a byte[], while ImageTextureComponent3D defines a 3D texture source that uses images as the input source.
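The byte-array variants follow the same pattern. As a hedged sketch - the constructor parameter order and the FORMAT_SINGLE_COMPONENT constant are assumptions made for illustration, so check the ByteTextureComponent2D javadoc for the exact form:

// A 2x2 greyscale checker pattern supplied as raw bytes, with no AWT
// image involved. The constructor signature is assumed for illustration.
byte[] pixels = { (byte)0x00, (byte)0xFF, (byte)0xFF, (byte)0x00 };

ByteTextureComponent2D byte_comp =
    new ByteTextureComponent2D(TextureComponent.FORMAT_SINGLE_COMPONENT,
                               2, 2, pixels);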

Loading a Texture

There are many different ways to get a texture into the system from the raw data, so this example will show just one option. The first thing that you need to do is acquire the raw data that you want to render. It does not matter whether that source is a procedurally generated texture using some Perlin noise function or an image loaded from an external file.

Here we are going to make use of the ImageIO libraries from JDK 1.4. Start by loading the image into the application using the normal routines:

// Imports needed by this fragment
import java.awt.image.BufferedImage;
import java.io.BufferedInputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;

import javax.imageio.ImageIO;

import org.j3d.aviatrix3d.TextureComponent2D;

TextureComponent2D img_comp = null;

try
{
    File f = new File("textures/test_image.png");
    if(!f.exists())
    {
        System.out.println("Can't find texture source file");
        return;
    }

    // Buffer the file stream for ImageIO to read from
    FileInputStream is = new FileInputStream(f);
    BufferedInputStream stream = new BufferedInputStream(is);
    BufferedImage img = ImageIO.read(stream);

    ....
}
catch(IOException ioe)
{
    System.out.println("Error reading image: " + ioe);
    return;
}

Note that the image that is read is an instance of BufferedImage. Aviatrix3D is actually a little more flexible than this - so long as the source is an instance of RenderedImage it will be fine (though right now the implementation only handles BufferedImage).

After the image has been loaded, you need to create the wrapper for it with the appropriate TextureComponent-derived type. Since we have an Image instance, we want to make use of ImageTextureComponent2D. Creating an instance of this class gives you a wide range of options. If you already have the image loaded, as we do in this example, the simplest option is to use the constructor that takes just the format and the image reference:

img_comp = new ImageTextureComponent2D(format, img);

Working out the format is a somewhat optional exercise, since for the most part it is automatically overridden based on the information from the supplied image source. If the source is an instance of BufferedImage, then the ColorModel is extracted and an appropriate format is set. The one time where your supplied format value is honoured is with an image that has only a single colour component (i.e. greyscale with no alpha channel). Because the texture application may want to treat this as either an intensity map or an alpha map, the scene graph needs you to tell it which way to interpret the supplied pixel values.

With the TextureComponent instance in hand, it's time to create the Texture object to contain it. As we have a 2D texture component, that means we need to create a matching texture object, thus requiring Texture2D. Start by creating an instance of the texture object:

Texture2D texture = new Texture2D();

And then fill it in with the source data using the setImages() call. Aviatrix3D always takes arrays of source objects in its method calls, as this allows a single method to be used in many different ways. The parameters required are the texture mapping mode (whether to use mipmaps or just a single image), a texture format, an array of the source texture components to use, and a length identifier. The texture format is used to tell OpenGL how to interpret the converted format bytes provided by the TextureComponent if your texture component is using a single-component colour model. For the rest, it is just ignored. Since we only have a single image loaded, we won't be running mipmaps, so the final piece of code looks like this:

texture.setImages(Texture.MODE_BASE_LEVEL,
                  Texture.FORMAT_RGB,
                  new TextureComponent[] { img_comp },
                  1);
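If your source image had only a single colour component, this format parameter is where you make the intensity-versus-alpha choice described earlier. A hedged sketch - the FORMAT_ALPHA and FORMAT_INTENSITY constant names are assumptions, so check the Texture javadoc for the names your version defines:

// grey_comp is a hypothetical single-component TextureComponent2D.
// Here the format matters: it tells OpenGL how to interpret the bytes.
texture.setImages(Texture.MODE_BASE_LEVEL,
                  Texture.FORMAT_ALPHA,    // or Texture.FORMAT_INTENSITY
                  new TextureComponent[] { grey_comp },
                  1);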

Note that setImages() is a generic method and how the values are interpreted is dependent on the derived class' implementation of that method. Behaviour for Texture2D is different from CubicEnvironmentMap.
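As an illustration of that difference, a cubic environment map is handed six components through the same call, one per cube face. A hedged sketch - the face ordering shown is an assumption, so consult the CubicEnvironmentMap javadoc for the order it actually requires:

// face_components is a hypothetical TextureComponent[6], one image per
// cube face (assumed order: +x, -x, +y, -y, +z, -z).
CubicEnvironmentMap env_map = new CubicEnvironmentMap();
env_map.setImages(Texture.MODE_BASE_LEVEL,
                  Texture.FORMAT_RGB,
                  face_components,
                  6);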

Applying textures to an object

The last step of the texturing setup is to connect the Texture object instance to the geometry. This is done through an intermediary object known as TextureUnit, which is in turn added to the Appearance instance.

TextureUnit wraps up all the details that can be applied to a single texture object into a convenient transportable object. While this may seem like overkill, particularly compared to other scene graph APIs, its usefulness becomes more evident later on when you use the more advanced texture capabilities. For now, there is only one method that you need to use - the one to register the Texture instance with it. Applying the TextureUnit to the appearance requires an array of them again (this is a fairly common theme in Aviatrix3D). The final code looks like this:

TextureUnit[] tu = new TextureUnit[1];
tu[0] = new TextureUnit();
tu[0].setTexture(texture);

Appearance app = new Appearance();
app.setTextureUnits(tu, 1);
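All that remains is to hand the Appearance to the shape holding the square's geometry, just as in the Hello World demo. A minimal sketch, assuming the geometry from the start of this tutorial is in a variable named geom:

// Attach the textured appearance and the square's geometry to the shape
Shape3D shape = new Shape3D();
shape.setGeometry(geom);
shape.setAppearance(app);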

Finally, if you run the demo application you should see the square from the start of this tutorial with the test image mapped across it.