[Originally from onemanmmo.com.]
I've spent quite a bit of time over the last couple of weeks piecing together bits of different articles to figure out how to implement normal mapping in the Lair Engine. The problem I encountered is that there is huge variation in the methods and terminology used in articles on normal mapping, which makes it a very confusing topic for non-math-lovers like myself. So here I'm going to explain the three common techniques for normal mapping for the mathematically uninclined.
I've talked about the setup of my renderer before, so if you want to compare it to what you're doing, read this.
Originally I started by implementing Normal Mapping Without Precomputed Tangents which sounded like the easiest way to go, but was probably not actually the best place to start. I wasn't able to get that working. Even with some help from the author of the paper, no matter what I did it appeared that the light would rotate when I rotated the model.
After a couple days of debugging frustration I moved on to other things for a couple of weeks, reading articles on normal mapping in my spare time. When I thought I had an understanding of it, I integrated this code to calculate tangent vectors for an arbitrary mesh as well as shader code to do normal mapping in View Space (which by general consensus is the best coordinate space to do normal mapping.) To my disappointment, this implementation had the exact same problem as the original one: the lighting rotated with the model. At this point I was a little baffled so I put a question to stackoverflow.com which was kind of frustrating, but I did get two helpful things from there: 1) Nobody pointed out any issues with my math and 2) the suggestion to implement normal mapping in World Space so you can render the different elements to the screen during debugging and have a hope of figuring out what's going on.
When you do 3D rendering, there are a lot of different coordinate spaces you have to deal with. I'm only going to explain the first one here, so if you aren't sure what the others are, go look them up. With normal mapping, the chain of transformations looks like: Tangent Space -> Model Space -> World Space -> View Space -> Clip Space.
(You might hear View Space called Eye Space or Camera Space. Some people call Clip Space Homogeneous Space. So confusing.)
Tangent Space is the one we are interested in today. It is the coordinate space that the normals in a normal map are in.
Any article you read about normal mapping will talk about the TBN matrix. It's so named because of the elements that make it up: the Tangent, Bitangent and Normal vectors. (Some people call the Bitangent the Binormal; don't do that.) What the TBN matrix does is allow you to convert normals from the normal map (in Tangent Space) to Model Space. That's all it does.
To build a TBN matrix from the normal of a face and the tangent as calculated with this code, you need GLSL code something like this:
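Something like this minimal vertex-shader sketch (the attribute and variable names here are my own assumptions; `tangent.w` carries the handedness):

```glsl
// Per-vertex attributes; names are assumptions.
in vec3 normal;    // face/vertex normal, in Model Space
in vec4 tangent;   // xyz = tangent, w = handedness (+1.0 or -1.0)

// Inside the vertex shader's main():
vec3 n = normalize(normal);
vec3 t = normalize(tangent.xyz);
// The bitangent completes the basis; tangent.w flips it for mirrored UVs.
vec3 b = cross(n, t) * tangent.w;
mat3 tbn = mat3(t, b, n);  // Tangent Space -> Model Space
```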
The normal you've seen a million times before: it is a vector perpendicular to the face, in Model Space. The tangent points along the positive U texture-coordinate axis of the face. To calculate the bitangent, we take the cross product of the normal and tangent vectors, then multiply it by a constant in tangent.w, which is the handedness of the tangent space. The bitangent points along the V texture-coordinate axis of the face. I made a texture-mapped cube to debug the normal mapping, and on it, the TBN vectors looked like this:
This TBN matrix isn't particularly helpful for us right now, because we are trying to do lighting, and all this does is convert normals from Tangent Space to Model Space. To make it more useful, let's build it like this:
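A sketch of what this might look like, assuming modelMatrix is the model-to-world mat4 (names are assumptions):

```glsl
// w = 0.0 drops any translation in modelMatrix, since these are directions.
vec3 bitangent = cross(normal, tangent.xyz) * tangent.w;
vec3 n = normalize((modelMatrix * vec4(normal,      0.0)).xyz);
vec3 t = normalize((modelMatrix * vec4(tangent.xyz, 0.0)).xyz);
vec3 b = normalize((modelMatrix * vec4(bitangent,   0.0)).xyz);
mat3 tbn = mat3(t, b, n);  // Tangent Space -> World Space
```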
By multiplying each vector by the model matrix we get a TBN which converts from Tangent Space to World Space. You'll notice that when we make a vec4 out of the normal, tangent and bitangent we use 0.0 for the w value, not 1.0 which you usually see. The reason for this is to eliminate any translation which might be present in the model matrix, since it doesn't make sense to translate direction vectors.
This brings us naturally to World Space normal mapping.
The idea with World Space normal mapping is to convert a normal taken from the normal map, and the direction vector to the light source, into World Space so that we can take the dot product of the normal map normal and the light direction to get the magnitude of the diffusely reflected light, or lambert value. You can read about why it has this name on Wikipedia.
So, in the full World Space normal mapping example code below, I generate a TBN matrix which converts from Tangent Space to World Space, plus a lightDirection vector in World Space, in the vertex shader. In the fragment shader I use the TBN matrix to convert the normal from the normal map from Tangent Space to World Space and dot it with the normalized lightDirection, also in World Space.
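The fragment-shader half might be sketched like this (the uniform and varying names are my own assumptions):

```glsl
in mat3 tbn;            // Tangent Space -> World Space, from the vertex shader
in vec3 lightDirection; // World Space
in vec2 uv;
uniform sampler2D normalMap;
out vec4 fragColor;

void main() {
    // Unpack the [0,1] texel back to a [-1,1] direction, then to World Space.
    vec3 mapNormal = normalize(texture(normalMap, uv).xyz * 2.0 - 1.0);
    vec3 n = normalize(tbn * mapNormal);
    // Lambert term, clamped so faces pointing away from the light go black.
    float lambert = max(dot(n, normalize(lightDirection)), 0.0);
    fragColor = vec4(vec3(lambert), 1.0); // greyscale diffuse term only
}
```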
You may have noticed this
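The line being referred to is the normal-unpacking step, something like (names assumed):

```glsl
// Rescale the stored [0.0, 1.0] texel back to a [-1.0, 1.0] normal.
vec3 mapNormal = normalize(texture(normalMap, uv).xyz * 2.0 - 1.0);
```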
in the fragment shader. Since the data in the normal map has to be stored in the range [0.0, 1.0], we need to rescale it back to its original [-1.0, 1.0] range. If we were confident about the accuracy of our data, we could get rid of the normalize here.
In all of the example code here I only generate the lambert element of the full ADS (Ambient/Diffuse/Specular) lighting pipeline. I then render that value as greyscale so that you can see what the contribution of the normal map would be to the final image. I'm hoping by the time I'm done you'll understand the lambert term well enough to plug it into the full ADS lighting setup yourself. viewMatrix, modelMatrix, normalMatrix, modelViewMatrix etc. are all equivalent to the deprecated OpenGL built-ins gl_NormalMatrix, gl_ModelViewProjectionMatrix etc.
So the great thing about working entirely in World Space is that you can check the tangent, bitangent and normal values by passing them to the fragment shader, rendering them to the screen, and looking at what color they are. If a face ends up red, the vector you are rendering for it points along the positive X axis. (You get green for the Y axis and blue for the Z axis.) You can also do the same thing for the normal map values. Note that these vectors can also have negative values, so unless you rescale them back into the [0.0, 1.0] range, you'll see some black polygons.
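The rescaling for debug display might look like this one-liner in the fragment shader (variable name assumed):

```glsl
// Remap a [-1.0, 1.0] direction into displayable [0.0, 1.0] colors,
// so e.g. a vector pointing down -X shows as dark red instead of black.
fragColor = vec4(worldNormal * 0.5 + 0.5, 1.0);
```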
I built a cube model then spent most of an afternoon checking every single input into the shader to try to figure out why the light was turning with the camera. In the end I figured out that that wasn't really what was happening at all.
If you search for "normal map" on Google Images you'll see a lot of powder-blue images. These are tangent space normal maps. The reason they're bluish is that the up vector for a normal map is on the positive Z axis, which is stored in the blue channel of the bitmap; a perfectly flat normal of (0.0, 0.0, 1.0) encodes to RGB (0.5, 0.5, 1.0). So when viewing a tangent space normal map, flat is powder blue, normals pointing upwards are cyan and ones pointing downwards are magenta.
There isn't any sort of standardization for normal maps. Some will only have two channels of data (it is up to you to reconstruct the third yourself, but you can use the freed-up channel for some other rendering info, like a specular map), and some have the normals oriented in the opposite direction vertically, so look very closely at the normal maps you use to make sure they are in the format you expect. There are also bump maps, which are often green; those are completely different, so don't try to use them with this.
Once I had a handle on World Space normal mapping and had figured out my problem, I was ready to give View Space normal mapping a try. The idea here is the same as World Space normal mapping, except that this time we convert the vectors to View Space instead. The reason for doing this is that you can shift more work to the vertex shader and simplify some calculations as well.
So let's calculate our TBN again:
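A sketch of the View Space version, assuming normalMatrix is the usual 3x3 normal matrix (names are assumptions):

```glsl
// normalMatrix is already 3x3, so no w component is needed.
vec3 bitangent = cross(normal, tangent.xyz) * tangent.w;
vec3 t = normalize(normalMatrix * tangent.xyz);
vec3 b = normalize(normalMatrix * bitangent);
vec3 n = normalize(normalMatrix * normal);
// Transposed: this TBN now maps View Space -> Tangent Space.
mat3 tbn = transpose(mat3(t, b, n));
```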
This one is a little different. First we multiply by the normalMatrix to convert the normal, tangent and bitangent to View Space. Since the normalMatrix is already 3x3, we don't need the 0.0 w trick we used in World Space. Next we make a mat3 out of t, b and n, but this time we do a transpose on it. The transpose reverses the action of the TBN matrix, so instead of converting from Tangent Space to View Space it now converts from View Space to Tangent Space. This works here because the TBN is an orthogonal matrix (its columns are perpendicular unit vectors), and for an orthogonal matrix the transpose equals the inverse. This trick does not work on all matrices.
What we do with that backwards TBN matrix is convert the direction to the light source vector from World Space to View Space and then use the TBN matrix to convert it back to Tangent Space.
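In the vertex shader, that conversion might be sketched like this (worldLightDir is an assumed uniform holding the World Space direction to the light):

```glsl
// w = 0.0 ignores the view matrix's translation; directions don't translate.
vec3 viewLightDir = (viewMatrix * vec4(worldLightDir, 0.0)).xyz;
vec3 lightDirection = tbn * viewLightDir; // now in Tangent Space
```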
Tricky! Now our lightDirection vector is in Tangent Space, the same space as our normal map vectors.
Now you'll notice that the TBN construction code above is commented out in the shader below. That's because there's a mathy way to make this calculation a little simpler:
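The usual simplification (my reconstruction, names assumed) is to skip building the matrix entirely and project the View Space light direction onto each basis vector with dot products, which computes exactly what multiplying by the transposed TBN would:

```glsl
// Each component is the light direction's length along one basis vector.
vec3 lightDirection = vec3(dot(viewLightDir, t),
                           dot(viewLightDir, b),
                           dot(viewLightDir, n));
```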
So with our tricky lightDirection vector in Tangent Space, the fragment shader is super simple and fast.
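Something like this sketch (names assumed):

```glsl
// Both vectors are already in Tangent Space, so no matrix math
// remains in the fragment shader.
vec3 n = normalize(texture(normalMap, uv).xyz * 2.0 - 1.0);
float lambert = max(dot(n, normalize(lightDirection)), 0.0);
fragColor = vec4(vec3(lambert), 1.0);
```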
Once I had View Space normal mapping working, it was no effort at all to get the precomputed-tangent-free normal mapping working. This does normal mapping in World Space like the first example, but it computes the Tangent and Bitangent in the fragment shader. I can't see any significant visual difference between its results and those of the precomputed tangents, but your mileage may vary. You might want to look at the original article and its comments if you are thinking about using this.
There is a lot more GPU calculation in the precomputed-tangent-free implementation, but you save transferring a 12-byte vertex attribute to the GPU, so which one you choose really depends on your platform and other rendering load. Apparently on some mobile platforms the precomputed-tangent-free implementation is significantly slower. I'm going to continue calculating the tangent offline and passing it as a vertex attribute to the vertex shader, because some of my other shaders already put quite a heavy load on the GPU. I'm keeping this implementation though, for the case where I have a model with a small normal-mapped element while the rest is not normal mapped, as I don't currently support enabling precomputed tangent generation on a per-material basis.
I found the solution to my rotating light problem when I replaced the code to get the surface normal from the map with
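The replacement would have been a hardcoded "flat" normal, something like:

```glsl
// A hardcoded flat tangent-space normal in place of the texture fetch,
// taking the normal map data out of the equation entirely.
vec3 mapNormal = vec3(0.0, 0.0, 1.0);
```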
All of a sudden I started getting flat-shaded lighting which looked as expected. The light was no longer rotating.
The normal maps I was using were PNG files. The first I made in GIMP for debugging, and it was just a completely flat surface which should give me flat shading if everything is working correctly. The second was of a carving that I downloaded from the internet as a sanity check. It turns out both images had the same problem! I've written before about how the Lair Engine is gamma correct. Well, both images had a gamma value of 2.2 stored in their PNG files, but the data in the files was actually gamma 1.0. When OpenGL transferred the normal maps to the video card, it automatically converted them from sRGB space to linear space, thus mangling all the normals contained within. This isn't the first time I've run into this issue with PNG files, so it was time to make a tool. I wrote a little utility to load a PNG, change the gamma value without modifying the data, then write out a new PNG.
Here's the test cube with the normal map I downloaded from the web mapped onto it. The light is above and to the right of the camera.
Kelly Kleider 
26 Nov 2013 at 7:53 am PST

Don't take this as a jerky challenge, but your article seems to have a lot of math for the "uninclined".
Maybe uninitiated? I don't know if you have seen this: http://www.bencloward.com/tutorials_normal_maps1.shtml It isn't really a functional programming approach, but it is a great description from a tech-art/art perspective. In any case, nice writeup even if it was mathy. ;)

Terry Matthes 
Great breakdown. I bookmarked this to share with a few friends of mine :)

nicholas ralabate 
i don't understand the "mystery solved" part at all... can you explain what was wrong with Lengyel's method and how hardcoding your surface normal fixed it?

Sandor Domokos 
The "mystery solved" part... This is very interesting to me too, please explain if it is possible. Thank you.