Cg Programming/Unity/Lighting of Bumpy Surfaces

From Wikibooks, open books for an open world
“The Incredulity of Saint Thomas” by Caravaggio, 1601–1603.

This tutorial covers normal mapping.

It's the first in a series of tutorials about texturing techniques that go beyond two-dimensional surfaces (or layers of surfaces). Note that these tutorials are meant to teach you how these techniques work. If you want to actually use one of them in Unity, you should instead use a built-in shader or a Surface Shader.

In this tutorial, we start with normal mapping, which is a well-established technique to fake the lighting of small bumps and dents, even on coarse polygon meshes. The code of this tutorial is based on Section “Smooth Specular Highlights” and Section “Textured Spheres”.

Perceiving Shapes Based on Lighting


The painting by Caravaggio depicted to the left is about the incredulity of Saint Thomas, who did not believe in Christ's resurrection until he put his finger in Christ's side. The furrowed brows of the apostles not only symbolize this incredulity but clearly convey it by means of a common facial expression. However, why do we know that their foreheads are actually furrowed instead of being painted with some light and dark lines? After all, this is just a flat painting. In fact, viewers intuitively make the assumption that these are furrowed instead of painted brows — even though the painting itself allows for both interpretations. The lesson is: bumps on smooth surfaces can often be convincingly conveyed by the lighting alone without any other cues (shadows, occlusions, parallax effects, stereo, etc.).

Normal Mapping


Normal mapping tries to convey bumps on smooth surfaces (i.e. coarse triangle meshes with interpolated normals) by changing the surface normal vectors according to some virtual bumps. When the lighting is computed with these modified normal vectors, viewers will often perceive the virtual bumps — even though a perfectly flat triangle has been rendered. The illusion can certainly break down (in particular at silhouettes) but in many cases it is very convincing.

More specifically, the normal vectors that represent the virtual bumps are first encoded in a texture image (i.e. a normal map). A fragment shader then looks up these vectors in the texture image and computes the lighting based on them. That's about it. The problem, of course, is the encoding of the normal vectors in a texture image. There are different possibilities, and the fragment shader has to be adapted to the specific encoding that was used to generate the normal map.

A typical example for the appearance of an encoded normal map.

Normal Mapping in Unity


The very good news is that you can easily create normal maps from gray-scale images with Unity: create a gray-scale image in your favorite paint program and use a specific gray for the regular height of the surface, lighter grays for bumps, and darker grays for dents. Make sure that the transitions between different grays are smooth, e.g. by blurring the image. When you import the image with Assets > Import New Asset, change the Texture Type in the Inspector Window to Normal map and check Create from Grayscale. After clicking Apply, the preview should show a bluish image with reddish and greenish edges. As an alternative to generating a normal map, the encoded normal map to the left can be imported. (In this case, don't forget to uncheck the Create from Grayscale box.)
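To see conceptually what such a grayscale-to-normal-map conversion does, here is a small sketch in plain Python (an illustration, not Unity's actual implementation; the function name and the strength parameter are made up): it derives a unit normal for each texel of a gray-scale height map from finite differences of neighboring heights.

```python
def height_to_normal(height, x, y, strength=1.0):
    """Return a unit normal (nx, ny, nz) for texel (x, y) of a 2D
    height grid, using central differences clamped at the borders."""
    h = len(height)       # rows
    w = len(height[0])    # columns
    # height differences along x and y (clamped at the edges)
    dx = height[y][min(x + 1, w - 1)] - height[y][max(x - 1, 0)]
    dy = height[min(y + 1, h - 1)][x] - height[max(y - 1, 0)][x]
    # unnormalized normal of the height field z = f(x, y)
    nx, ny, nz = -strength * dx, -strength * dy, 1.0
    length = (nx * nx + ny * ny + nz * nz) ** 0.5
    return (nx / length, ny / length, nz / length)

# a tiny height map: a single bump in the center
bump = [[0.0, 0.0, 0.0],
        [0.0, 1.0, 0.0],
        [0.0, 0.0, 0.0]]
flat_normal = height_to_normal(bump, 0, 0)   # flat corner
slope_normal = height_to_normal(bump, 0, 1)  # texel left of the bump
```

On flat regions the normal points straight up; next to the bump it tilts away from it, which is exactly the information a normal map stores.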

The not so good news is that the fragment shader has to do some computations to decode the normals. First of all, the texture color is stored in a two-component texture image, i.e. only an alpha component A and one color component are available. The color component can be accessed as the red, green, or blue component; in all cases the same value is returned. Here, we use the green component G since Unity also uses it. The two components, G and A, are stored as numbers between 0 and 1; however, they represent coordinates n_x and n_y between -1 and 1. The mapping is:

n_x = 2A − 1   and   n_y = 2G − 1

From these two components, the third component n_z of the three-dimensional normal vector n = (n_x, n_y, n_z) can be calculated because of the normalization to unit length:

√(n_x² + n_y² + n_z²) = 1   ⇒   n_z = ±√(1 − n_x² − n_y²)

Only the “+” solution is necessary if we choose the z axis along the axis of the smooth normal vector (interpolated from the normal vectors that were set in the vertex shader), since we aren't able to render surfaces with an inwards-pointing normal vector anyway. The code snippet from the fragment shader could look like this:

float4 encodedNormal = tex2D(_BumpMap,
   _BumpMap_ST.xy * input.tex.xy + _BumpMap_ST.zw);
float3 localCoords = float3(2.0 * encodedNormal.a - 1.0,
   2.0 * encodedNormal.g - 1.0, 0.0);
localCoords.z = sqrt(1.0 - dot(localCoords, localCoords));
   // approximation without sqrt:  localCoords.z =
   // 1.0 - 0.5 * dot(localCoords, localCoords);
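The decoding math can be checked with a short sketch in plain Python (an illustration only, not shader code; the function name is made up):

```python
def decode_normal(a, g):
    # map the stored values from [0, 1] to [-1, 1]
    nx = 2.0 * a - 1.0
    ny = 2.0 * g - 1.0
    # choose the "+" solution of n_z = ±sqrt(1 - n_x² - n_y²);
    # clamp the argument at 0 to guard against rounding errors
    nz = max(0.0, 1.0 - nx * nx - ny * ny) ** 0.5
    return (nx, ny, nz)

# mid-gray in both components encodes the unperturbed normal (0, 0, 1)
n = decode_normal(0.5, 0.5)
```

Any valid (A, G) pair decodes to a vector of unit length, which is what the lighting computation relies on.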

The decoding for devices that use OpenGL ES is actually simpler since Unity doesn't use a two-component texture in this case. Thus, for mobile platforms the decoding becomes:

float4 encodedNormal = tex2D(_BumpMap,
   _BumpMap_ST.xy * input.tex.xy + _BumpMap_ST.zw);
float3 localCoords = 2.0 * encodedNormal.rgb - float3(1.0, 1.0, 1.0);

However, the rest of this tutorial (and also Section “Projection of Bumpy Surfaces”) will cover only desktop platforms.

Tangent plane to a point on a sphere.

Unity uses a local surface coordinate system for each point of the surface to specify normal vectors in the normal map. The z axis of this local coordinate system is given by the smooth, interpolated normal vector N in world space, and the x-y plane is a tangent plane to the surface, as illustrated in the image to the left. Specifically, the x axis is specified by the tangent parameter T that Unity provides to vertices (see the discussion of vertex input parameters in Section “Debugging of Shaders”). Given the x and z axes, the y axis can be computed by a cross product in the vertex shader, e.g. B = N × T. (The letter B refers to the traditional name “binormal” for this vector.)

Note that the normal vector N is transformed with the transpose of the inverse model matrix from object space to world space (because it is orthogonal to a surface; see Section “Applying Matrix Transformations”), while the tangent vector T specifies a direction between points on a surface and is therefore transformed with the model matrix. The binormal vector B represents a third class of vectors, which are transformed differently. (If you really want to know: the skew-symmetric matrix corresponding to “B ×” is transformed like a ternary quadratic form.) Thus, the best choice is to first transform N and T to world space, and then to compute B in world space using the cross product of the transformed vectors.
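Why normals need the transpose of the inverse can be demonstrated with a small plain-Python sketch (the 3×3 helpers are written just for this example): under a non-uniform scale, transforming the normal with the model matrix itself destroys its orthogonality to the tangent, while the transpose of the inverse preserves it.

```python
def mat_vec(m, v):
    # multiply a 3x3 matrix (list of rows) by a 3-vector
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# model matrix: non-uniform scale by (2, 1, 1)
model = [[2.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
# transpose of its inverse (diagonal, so just reciprocal entries)
inv_transpose = [[0.5, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]

tangent = [1.0, 1.0, 0.0]   # lies in the surface
normal = [1.0, -1.0, 0.0]   # orthogonal to the tangent

t_world = mat_vec(model, tangent)         # tangent: model matrix
n_wrong = mat_vec(model, normal)          # normal transformed naively
n_world = mat_vec(inv_transpose, normal)  # normal: inverse transpose
```

Only n_world stays orthogonal to the transformed tangent; n_wrong does not, i.e. the naively transformed vector is no longer a surface normal.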

These computations are performed by the vertex shader, for example this way:

struct vertexInput {
   float4 vertex : POSITION;
   float4 texcoord : TEXCOORD0;
   float3 normal : NORMAL;
   float4 tangent : TANGENT;
};
struct vertexOutput {
   float4 pos : SV_POSITION;
   float4 posWorld : TEXCOORD0;
      // position of the vertex (and fragment) in world space
   float4 tex : TEXCOORD1;
   float3 tangentWorld : TEXCOORD2;
   float3 normalWorld : TEXCOORD3;
   float3 binormalWorld : TEXCOORD4;
};

vertexOutput vert(vertexInput input)
{
   vertexOutput output;

   float4x4 modelMatrix = unity_ObjectToWorld;
   float4x4 modelMatrixInverse = unity_WorldToObject;

   output.tangentWorld = normalize(
      mul(modelMatrix, float4(input.tangent.xyz, 0.0)).xyz);
   output.normalWorld = normalize(
      mul(float4(input.normal, 0.0), modelMatrixInverse).xyz);
   output.binormalWorld = normalize(
      cross(output.normalWorld, output.tangentWorld)
      * input.tangent.w); // tangent.w is specific to Unity

   output.posWorld = mul(modelMatrix, input.vertex);
   output.tex = input.texcoord;
   output.pos = UnityObjectToClipPos(input.vertex);
   return output;
}

The factor input.tangent.w in the computation of binormalWorld is specific to Unity, i.e. Unity provides tangent vectors and normal maps such that we have to do this multiplication.

With the normalized directions T, B, and N in world space, we can easily form a matrix that maps any normal vector n of the normal map from the local surface coordinate system to world space, because the columns of such a matrix are just the vectors of the axes; thus, the 3×3 matrix for the mapping of n to world space is:

M_surface→world =
   | T_x  B_x  N_x |
   | T_y  B_y  N_y |
   | T_z  B_z  N_z |

In Cg, it is actually easier to construct the transposed matrix, since matrices are constructed row by row:

(M_surface→world)^T =
   | T_x  T_y  T_z |
   | B_x  B_y  B_z |
   | N_x  N_y  N_z |

The construction is done in the fragment shader, e.g.:

float3x3 local2WorldTranspose = float3x3(
   input.tangentWorld,
   input.binormalWorld,
   input.normalWorld);

We want to transform n with the transpose of local2WorldTranspose (i.e. the untransposed original matrix); therefore, we multiply n from the left with the matrix. For example, with this line:

float3 normalDirection = normalize(mul(localCoords, local2WorldTranspose));
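The matrix algebra can be checked with a small plain-Python sketch (hypothetical helper names, an orthonormal example basis): the matrix with columns T, B, N maps a local normal to world space, and multiplying a row vector from the left by the transposed matrix (rows T, B, N), as the mul call above does, gives the same result.

```python
# example orthonormal basis in world space
T = [1.0, 0.0, 0.0]
B = [0.0, 0.0, 1.0]
N = [0.0, -1.0, 0.0]

def to_world(n_local):
    nx, ny, nz = n_local
    # linear combination n_x*T + n_y*B + n_z*N (columns of M)
    return [nx * T[i] + ny * B[i] + nz * N[i] for i in range(3)]

def row_times_transpose(n_local):
    rows = [T, B, N]  # the transposed matrix, built row by row
    return [sum(n_local[k] * rows[k][i] for k in range(3))
            for i in range(3)]

# the unperturbed local normal (0, 0, 1) maps to the smooth normal N
world_normal = to_world([0.0, 0.0, 1.0])
```

In particular, a flat region of the normal map reproduces the interpolated surface normal, as one would expect.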

With the new normal vector in world space, we can compute the lighting as in Section “Smooth Specular Highlights”.

Complete Shader Code


This shader code simply integrates all the snippets and uses our standard two-pass approach for pixel lights. It also demonstrates the use of a CGINCLUDE ... ENDCG block, which is implicitly shared by all passes of all subshaders.

Shader "Cg normal mapping" {
   Properties {
      _BumpMap ("Normal Map", 2D) = "bump" {}
      _Color ("Diffuse Material Color", Color) = (1,1,1,1)
      _SpecColor ("Specular Material Color", Color) = (1,1,1,1)
      _Shininess ("Shininess", Float) = 10
   }
   CGINCLUDE // common code for all passes of all subshaders
      #include "UnityCG.cginc"
      uniform float4 _LightColor0;
         // color of light source (from "Lighting.cginc")

      // User-specified properties
      uniform sampler2D _BumpMap;
      uniform float4 _BumpMap_ST;
      uniform float4 _Color;
      uniform float4 _SpecColor;
      uniform float _Shininess;

      struct vertexInput {
         float4 vertex : POSITION;
         float4 texcoord : TEXCOORD0;
         float3 normal : NORMAL;
         float4 tangent : TANGENT;
      };
      struct vertexOutput {
         float4 pos : SV_POSITION;
         float4 posWorld : TEXCOORD0;
            // position of the vertex (and fragment) in world space
         float4 tex : TEXCOORD1;
         float3 tangentWorld : TEXCOORD2;
         float3 normalWorld : TEXCOORD3;
         float3 binormalWorld : TEXCOORD4;
      };

      vertexOutput vert(vertexInput input)
      {
         vertexOutput output;

         float4x4 modelMatrix = unity_ObjectToWorld;
         float4x4 modelMatrixInverse = unity_WorldToObject;

         output.tangentWorld = normalize(
            mul(modelMatrix, float4(input.tangent.xyz, 0.0)).xyz);
         output.normalWorld = normalize(
            mul(float4(input.normal, 0.0), modelMatrixInverse).xyz);
         output.binormalWorld = normalize(
            cross(output.normalWorld, output.tangentWorld)
            * input.tangent.w); // tangent.w is specific to Unity

         output.posWorld = mul(modelMatrix, input.vertex);
         output.tex = input.texcoord;
         output.pos = UnityObjectToClipPos(input.vertex);
         return output;
      }

      // fragment shader with ambient lighting
      float4 fragWithAmbient(vertexOutput input) : COLOR
      {
         // in principle we have to normalize tangentWorld,
         // binormalWorld, and normalWorld again; however, the
         // potential problems are small since we use this
         // matrix only to compute "normalDirection",
         // which we normalize anyways

         float4 encodedNormal = tex2D(_BumpMap,
            _BumpMap_ST.xy * input.tex.xy + _BumpMap_ST.zw);
         float3 localCoords = float3(2.0 * encodedNormal.a - 1.0,
            2.0 * encodedNormal.g - 1.0, 0.0);
         localCoords.z = sqrt(1.0 - dot(localCoords, localCoords));
            // approximation without sqrt:  localCoords.z =
            // 1.0 - 0.5 * dot(localCoords, localCoords);
         float3x3 local2WorldTranspose = float3x3(
            input.tangentWorld,
            input.binormalWorld,
            input.normalWorld);
         float3 normalDirection =
            normalize(mul(localCoords, local2WorldTranspose));

         float3 viewDirection = normalize(
            _WorldSpaceCameraPos - input.posWorld.xyz);
         float3 lightDirection;
         float attenuation;

         if (0.0 == _WorldSpaceLightPos0.w) // directional light?
         {
            attenuation = 1.0; // no attenuation
            lightDirection = normalize(_WorldSpaceLightPos0.xyz);
         }
         else // point or spot light
         {
            float3 vertexToLightSource =
               _WorldSpaceLightPos0.xyz - input.posWorld.xyz;
            float distance = length(vertexToLightSource);
            attenuation = 1.0 / distance; // linear attenuation
            lightDirection = normalize(vertexToLightSource);
         }

         float3 ambientLighting =
            UNITY_LIGHTMODEL_AMBIENT.rgb * _Color.rgb;

         float3 diffuseReflection =
            attenuation * _LightColor0.rgb * _Color.rgb
            * max(0.0, dot(normalDirection, lightDirection));

         float3 specularReflection;
         if (dot(normalDirection, lightDirection) < 0.0)
            // light source on the wrong side?
         {
            specularReflection = float3(0.0, 0.0, 0.0);
               // no specular reflection
         }
         else // light source on the right side
         {
            specularReflection = attenuation * _LightColor0.rgb
               * _SpecColor.rgb * pow(max(0.0, dot(
               reflect(-lightDirection, normalDirection),
               viewDirection)), _Shininess);
         }

         return float4(ambientLighting + diffuseReflection
            + specularReflection, 1.0);
      }

      // fragment shader for pass 2 without ambient lighting
      float4 fragWithoutAmbient(vertexOutput input) : COLOR
      {
         // in principle we have to normalize tangentWorld,
         // binormalWorld, and normalWorld again; however, the
         // potential problems are small since we use this
         // matrix only to compute "normalDirection",
         // which we normalize anyways

         float4 encodedNormal = tex2D(_BumpMap,
            _BumpMap_ST.xy * input.tex.xy + _BumpMap_ST.zw);
         float3 localCoords = float3(2.0 * encodedNormal.a - 1.0,
            2.0 * encodedNormal.g - 1.0, 0.0);
         localCoords.z = sqrt(1.0 - dot(localCoords, localCoords));
            // approximation without sqrt:  localCoords.z =
            // 1.0 - 0.5 * dot(localCoords, localCoords);
         float3x3 local2WorldTranspose = float3x3(
            input.tangentWorld,
            input.binormalWorld,
            input.normalWorld);
         float3 normalDirection =
            normalize(mul(localCoords, local2WorldTranspose));

         float3 viewDirection = normalize(
            _WorldSpaceCameraPos - input.posWorld.xyz);
         float3 lightDirection;
         float attenuation;

         if (0.0 == _WorldSpaceLightPos0.w) // directional light?
         {
            attenuation = 1.0; // no attenuation
            lightDirection = normalize(_WorldSpaceLightPos0.xyz);
         }
         else // point or spot light
         {
            float3 vertexToLightSource =
               _WorldSpaceLightPos0.xyz - input.posWorld.xyz;
            float distance = length(vertexToLightSource);
            attenuation = 1.0 / distance; // linear attenuation
            lightDirection = normalize(vertexToLightSource);
         }

         float3 diffuseReflection =
            attenuation * _LightColor0.rgb * _Color.rgb
            * max(0.0, dot(normalDirection, lightDirection));

         float3 specularReflection;
         if (dot(normalDirection, lightDirection) < 0.0)
            // light source on the wrong side?
         {
            specularReflection = float3(0.0, 0.0, 0.0);
               // no specular reflection
         }
         else // light source on the right side
         {
            specularReflection = attenuation * _LightColor0.rgb
               * _SpecColor.rgb * pow(max(0.0, dot(
               reflect(-lightDirection, normalDirection),
               viewDirection)), _Shininess);
         }

         return float4(diffuseReflection + specularReflection, 1.0);
      }
   ENDCG

   SubShader {
      Pass {
         Tags { "LightMode" = "ForwardBase" }
            // pass for ambient light and first light source
         CGPROGRAM
         #pragma vertex vert
         #pragma fragment fragWithAmbient
            // the functions are defined in the CGINCLUDE part
         ENDCG
      }
      Pass {
         Tags { "LightMode" = "ForwardAdd" }
            // pass for additional light sources
         Blend One One // additive blending
         CGPROGRAM
         #pragma vertex vert
         #pragma fragment fragWithoutAmbient
            // the functions are defined in the CGINCLUDE part
         ENDCG
      }
   }
}

Note that we have used the tiling and offset uniform _BumpMap_ST as explained in Section “Textured Spheres”, since this option is often particularly useful for bump maps.
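The tiling-and-offset convention used in the texture lookup can be sketched in plain Python (the function name is made up for illustration): _BumpMap_ST stores the tiling factors in its xy components and the offset in its zw components, and the lookup coordinates are ST.xy * uv + ST.zw.

```python
def apply_tiling_offset(st, uv):
    # st = (tiling_x, tiling_y, offset_x, offset_y), uv = (u, v)
    tiling_x, tiling_y, offset_x, offset_y = st
    u, v = uv
    return (tiling_x * u + offset_x, tiling_y * v + offset_y)

# tiling (2, 2) repeats the map twice per direction; offset shifts it
uv = apply_tiling_offset((2.0, 2.0, 0.25, 0.0), (0.5, 0.5))
```

With the default tiling (1, 1) and offset (0, 0), the texture coordinates pass through unchanged.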

Summary


Congratulations! You finished this tutorial! We have looked at:

  • How human perception of shapes often relies on lighting.
  • What normal mapping is.
  • How Unity encodes normal maps.
  • How a fragment shader can decode Unity's normal maps and use them for per-pixel lighting.

Further reading


If you still want to know more


Unless stated otherwise, all example source code on this page is granted to the public domain.
Retrieved from "https://en.wikibooks.org/w/index.php?title=Cg_Programming/Unity/Lighting_of_Bumpy_Surfaces&oldid=4109792"