
This tutorial introduces texture mapping.
It's the first in a series of tutorials about texturing in GLSL shaders in Blender. In this tutorial, we start with a single texture map on a sphere. More specifically, we map an image of the Earth's surface onto a sphere. Based on this, further tutorials cover topics such as lighting of textured surfaces, transparent textures, multitexturing, gloss mapping, etc.


The basic idea of “texture mapping” (or “texturing”) is to map an image (i.e. a “texture” or a “texture map”) onto a triangle mesh; in other words, to put a flat image onto the surface of a three-dimensional shape.
To this end, “texture coordinates” are defined, which simply specify a position in the texture (i.e. the image). The horizontal coordinate is officially called S and the vertical coordinate T. However, it is very common to refer to them as x and y. In animation and modeling tools, texture coordinates are usually called U and V.
In order to map the texture image to a mesh, every vertex of the mesh is given a pair of texture coordinates. (This process (and the result) is sometimes called “UV mapping” since each vertex is mapped to a point in UV space.) Thus, every vertex is mapped to a point in the texture image. For any point inside a triangle, the texture coordinates of the three triangle vertices can then be interpolated; thus, every point of every triangle of the mesh has a pair of (interpolated) texture coordinates. These texture coordinates map each point of the mesh to a specific position in the texture map and therefore to the color at this position. Thus, rendering a texture-mapped mesh consists of two steps for each visible point: interpolation of the texture coordinates and a look-up of the color of the texture image at the position specified by the interpolated texture coordinates.
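As a minimal sketch of this interpolation (it is performed automatically by the rasterizer; the notation here is mine, and perspective correction, which GPUs also apply, is ignored): if a point inside a triangle has barycentric weights α, β and γ with respect to the triangle's three vertices, its interpolated texture coordinates (s, t) are the weighted average of the vertices' texture coordinates (s₁, t₁), (s₂, t₂) and (s₃, t₃):

$(s, t) = \alpha\,(s_1, t_1) + \beta\,(s_2, t_2) + \gamma\,(s_3, t_3) \quad \text{with} \quad \alpha + \beta + \gamma = 1$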
In OpenGL, any valid floating-point number is a valid texture coordinate. However, when the GPU is asked to look up a pixel (or “texel”) of a texture image (e.g. with the “texture2D” instruction described below), it will internally map the texture coordinates to the range between 0 and 1 in a way that depends on the “wrap mode”. For example, the wrap mode “repeat” basically uses the fractional part of the texture coordinates to determine texture coordinates in the range between 0 and 1, while the wrap mode “clamp” clamps the texture coordinates to this range. These internal texture coordinates in the range between 0 and 1 are then used to determine the position in the texture image: (0, 0) specifies the lower, left corner of the texture image; (1, 0) the lower, right corner; (0, 1) the upper, left corner; etc. OpenGL's wrap mode corresponds to Blender's settings under Properties > Texture tab > Image Mapping. Unfortunately, Blender doesn't appear to set the OpenGL wrap mode; it is always set to “repeat”.
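To make the two wrap modes concrete, here is a small GLSL sketch that mimics their effect on a single coordinate; the function names mimicRepeat and mimicClamp are my own for illustration, since in practice the texture unit applies the wrap mode internally during the lookup:

   float mimicRepeat(float coord) // wrap mode "repeat"
   {
      return fract(coord); // fractional part, e.g. 2.3 becomes 0.3
   }

   float mimicClamp(float coord) // wrap mode "clamp"
   {
      return clamp(coord, 0.0, 1.0); // e.g. 2.3 becomes 1.0, -0.5 becomes 0.0
   }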
To map the image of the Earth's surface to the left onto a sphere in Blender, you first have to download this image to your computer: click the image to the left until you get to a larger version and save it (usually with a right-click) to your computer (remember where you saved it). Then switch to Blender and add a sphere (in an Info window choose Add > Mesh > UV Sphere), select it in the 3D View (by right-clicking), activate smooth shading (in the Tool Shelf of the 3D View, press t if it is not active), make sure that Display > Shading: GLSL is set in the Properties of the 3D View (press n if they aren't displayed), and switch the Viewport Shading of the 3D View to Textured (the second icon to the right of the main menu in the 3D View). Now (with the sphere still being selected) add a material (in a Properties window > Material tab > New). Then add a new texture (in the Properties window > Textures tab > New), select Image or Movie for the Type, and click Image > Open. Select your file in the file browser and click on Open Image (or double-click it in the file browser). The image should now appear in the preview section of the Textures tab, and Blender should put it onto the sphere in the 3D View.
Now you should make sure that the Coordinates in the Properties window > Textures tab > Mapping are set to Generated. This means that our texture coordinates will be set to the coordinates in object space. Specifying or generating texture coordinates (i.e. UVs) in a modeling tool is a whole different topic, which is well beyond the scope of this tutorial.
With these settings, Blender will also send texture coordinates to the vertex shader. (Actually, we could also use the object coordinates in gl_Vertex because they are the same in this case.) Thus, we can write a vertex shader that receives the texture coordinates and hands them through to the fragment shader. The fragment shader then does some computation on the four-dimensional texture coordinates to compute the longitude and latitude (scaled to the range from 0 to 1), which are used as texture coordinates here. Usually, this step would be unnecessary since the texture coordinates should already correctly specify where to look up the texture image. (In fact, any such processing of texture coordinates in the fragment shader should be avoided for performance reasons; here I'm only using this trick to avoid setting up appropriate UV texture coordinates.) The Python script to set up the shader could be:
import bge

cont = bge.logic.getCurrentController()

VertexShader = """
   varying vec4 texCoords; // texture coordinates at this vertex

   void main()
   {
      texCoords = gl_MultiTexCoord0; // in this case equal to gl_Vertex
      gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
   }
"""

FragmentShader = """
   varying vec4 texCoords; // interpolated texture coordinates for this fragment

   uniform sampler2D textureUnit; // a small integer identifying a texture image

   void main()
   {
      vec2 longitudeLatitude = vec2(
         (atan(texCoords.y, texCoords.x) / 3.1415926 + 1.0) * 0.5,
         1.0 - acos(texCoords.z) / 3.1415926);
         // processing of the texture coordinates;
         // this is unnecessary if correct texture coordinates
         // are specified within Blender

      gl_FragColor = texture2D(textureUnit, longitudeLatitude);
         // look up the color of the texture image specified
         // by the uniform "textureUnit" at the position
         // specified by "longitudeLatitude.x" and
         // "longitudeLatitude.y" and return it in "gl_FragColor"
   }
"""

mesh = cont.owner.meshes[0]
for mat in mesh.materials:
    shader = mat.getShader()
    if shader is not None:
        if not shader.isValid():
            shader.setSource(VertexShader, FragmentShader, 1)
        shader.setSampler('textureUnit', 0)
Note the last line
shader.setSampler('textureUnit', 0)
in the Python script: it sets the uniform variable textureUnit to 0. This specifies that the first texture in the list in the Properties window > Textures tab should be used. A value of 1 would select the second in the list, etc. In fact, for each sampler2D variable that you use in a fragment shader, you have to set its value with a call to setSampler in the Python script as shown above. Actually, a sampler2D uniform specifies the texture unit of the GPU. (A texture unit is a part of the hardware that is responsible for the lookup and interpolation of colors in texture images.) The number of texture units of a GPU is available in the built-in constant gl_MaxTextureUnits, which is usually 4 or 8. Thus, the number of different texture images available in a fragment shader is limited to this number.
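For instance, a fragment shader that declared two samplers, say uniform sampler2D textureA; and uniform sampler2D textureB; (hypothetical names, not from this tutorial), would need one setSampler call per sampler; a minimal sketch:

   # hypothetical sketch: binding two sampler2D uniforms to the
   # first and second texture in the Properties window > Textures tab
   shader.setSampler('textureA', 0)  # first texture in the list
   shader.setSampler('textureB', 1)  # second texture in the list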
If everything went right, the texture image should now appear correctly mapped onto the sphere when you start the game engine by pressing p. (Otherwise, Blender maps it differently onto the sphere.) Congratulations!
Since many techniques use texture mapping, it pays off very well to understand what is happening here. Therefore, let's review the shader code:
The vertices of Blender's sphere object come with attribute data in gl_MultiTexCoord0 for each vertex, which specifies texture coordinates that, in our particular example, have the same values as the attribute gl_Vertex, which specifies a position in object space.
The vertex shader then writes the texture coordinates of each vertex to the varying variable texCoords. For each fragment of a triangle (i.e. each covered pixel), the values of this varying at the three triangle vertices are interpolated (see the description in “Rasterization”) and the interpolated texture coordinates are given to the fragment shader. In this particular example, the fragment shader computes new texture coordinates in longitudeLatitude. Usually, this wouldn't be necessary because correct texture coordinates should be specified within Blender using UV mapping. The fragment shader then uses the texture coordinates to look up a color in the texture image specified by the uniform textureUnit at the interpolated position in texture space, and it returns this color in gl_FragColor, which is then written to the framebuffer and displayed on the screen.
It is crucial that you gain a good idea of these steps in order to understand the more complicated texture mapping techniques presented in other tutorials.
You have reached the end of one of the most important tutorials. We have looked at:
- how to apply a texture image to a sphere in Blender,
- how texture coordinates are interpolated and used to look up a color in a texture image, and
- how to set uniform sampler2D variables with setSampler in a Python script.
If you want to know more