Shading a model using an environment map.

I’m attempting to simulate environment lighting without actually using environment lighting, by applying an environment map as a texture on an object. First off, my environment map is blurred enough that it doesn’t look like an image of the environment; it simply represents the lighting of the environment.

For example, if a face points up (Z+), then that ENTIRE face (unless smooth shading is set) should be shaded one color: the color of the pixel representing straight up in the environment map. This holds regardless of camera angle, so this is not a reflection but a shading method.

Knowing what I wanted, I assumed the faces’ normal vectors would decide what color each face is shaded, so I mapped the environment map to normal coordinates. However, this did not work. At first it appeared to work properly (one side of a cube was shaded a single color rather than many colors, etc.), but when the camera moved, the texture mapping changed. So apparently the normal coordinates depend on the camera angle.
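If the normal coordinates really are given in camera space (which would explain exactly what I’m seeing), then the lookup direction has to be rotated back into world space with the camera’s rotation before it is used to sample the map. Here is a minimal numpy sketch of that idea outside Blender, with made-up names, just to show why the world-space normal stays put while the camera-space one swims:

```python
import numpy as np

def world_normal(n_cam, cam_to_world_rot):
    """Rotate a camera-space normal back into world space.

    n_cam            -- the normal as the camera sees it
    cam_to_world_rot -- 3x3 camera-to-world rotation matrix
    """
    n = cam_to_world_rot @ np.asarray(n_cam, dtype=float)
    return n / np.linalg.norm(n)

n_world_true = np.array([0.0, 0.0, 1.0])   # a face that points straight up

R_a = np.eye(3)                            # camera orientation A
R_b = np.array([[0.0, 0.0, 1.0],           # camera orientation B (90 deg about Y)
                [0.0, 1.0, 0.0],
                [-1.0, 0.0, 0.0]])

for R in (R_a, R_b):
    n_cam = R.T @ n_world_true             # what the camera sees; changes per camera
    print(world_normal(n_cam, R))          # always [0. 0. 1.] -> the lookup stays fixed
```

In other words, the normal-coordinate lookup only becomes camera-independent once it has been multiplied by the camera’s rotation; used raw, the shading will always follow the view.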

Now I’m stumped how to do this. What do I need to use?

I’m doing this because I’m planning to eventually create an extremely advanced lighting system in a game I’m making, which would depend on using environment maps like this to light my character instead of lamps, giving an effect similar to GI in-game. I’m going to use the environment mapping example I found HERE.

For example, this is what I have now:

This image has no environment lighting, no ambient occlusion, no lighting whatsoever: just the envmap texture and the material set to shadeless. The lighting looks great; it’s just that the orientation of the lighting is unfortunately tied to the camera angle, and I don’t know how to fix that. The orientation of the lighting should be fixed and directly relative to the angmap that was used to generate the envmap.
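To illustrate what I mean by a fixed orientation: the pixel I want for a given face is the one its world-space normal points at in the angmap. The standard light-probe (angular map) formula for that lookup is below; the axis convention here is the usual probe one, and I’m only guessing that it matches how Blender orients an AngMap, so the axes may need swapping or flipping.

```python
import numpy as np

def angmap_uv(direction):
    """Map a world-space direction to (u, v) in [0, 1] on an angular map.

    Standard light-probe convention: the image centre looks along +Z.
    Whether that matches the angmap's actual orientation (and whether v
    needs flipping for the image's row order) is an assumption.
    """
    d = np.asarray(direction, dtype=float)
    d = d / np.linalg.norm(d)
    dx, dy, dz = d
    denom = np.sqrt(dx * dx + dy * dy)
    if denom < 1e-8:
        # Return the centre for both poles; -Z is strictly degenerate
        # (it maps to the entire outer rim of the map).
        return 0.5, 0.5
    r = np.arccos(dz) / (np.pi * denom)
    u, v = dx * r, dy * r                  # both in [-1, 1]
    return 0.5 * (u + 1.0), 0.5 * (v + 1.0)

print(angmap_uv((0.0, 0.0, 1.0)))          # +Z -> centre of the map in this convention
print(angmap_uv((1.0, 0.0, 0.0)))          # sideways -> (0.75, 0.5)
```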

Blender reads env maps as a 3 × 2 tile, with each square of the tile representing front, back, top, bottom, left, or right. I’m not sure in what order, though. Somewhere I think there is a script that will do it.
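If it helps, the mechanics of a lookup against that 3 × 2 tile would be roughly this: pick the face from the dominant axis of the direction, then project the other two components into that tile. The tile order and the per-face flips below are guesses rather than Blender’s actual layout, so they would have to be checked against a test render:

```python
import numpy as np

# Guessed tile order for the 3 x 2 layout; Blender's real order may differ.
TILE = {'right': (0, 0), 'left': (1, 0), 'top': (2, 0),
        'bottom': (0, 1), 'front': (1, 1), 'back': (2, 1)}

def cube_lookup(direction, env, face_size):
    """Return the env-map pixel a world-space direction points at.

    env is an image of shape (2*face_size, 3*face_size, channels)
    laid out as a 3 x 2 tile of cube faces.
    """
    x, y, z = np.asarray(direction, dtype=float)
    ax, ay, az = abs(x), abs(y), abs(z)

    # Pick the face from the dominant axis.  The per-face (s, t)
    # orientations are placeholders; the real flips must match the
    # layout Blender writes.
    if az >= ax and az >= ay:
        face, s, t = ('top' if z > 0 else 'bottom'), x / az, y / az
    elif ay >= ax:
        face, s, t = ('back' if y > 0 else 'front'), x / ay, z / ay
    else:
        face, s, t = ('right' if x > 0 else 'left'), y / ax, z / ax

    col, row = TILE[face]
    u = col * face_size + int((s * 0.5 + 0.5) * (face_size - 1))
    v = row * face_size + int((t * 0.5 + 0.5) * (face_size - 1))
    return env[v, u]

# A face pointing straight up always samples the same pixel of the 'top'
# tile, no matter where the camera is.
env = np.zeros((128, 192, 3))
print(cube_lookup((0.0, 0.0, 1.0), env, 64))
```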

Yes, I know how envmaps work, and I know about the script (which I linked in my first post), but I still can’t get them to render the way I want. Normally they’re used with reflection coordinates, which makes them easily simulate reflections, but that’s not what I want. I want the envmap to shade my model based on each face’s normal vector. However, when I set it up to do that, the camera’s angle still seems to affect the lookup. It would work perfectly if my camera were locked to one exact direction, but it doesn’t work from any other angle. The envmap should be shading the model so that the faces appear the same colors regardless of which direction you’re looking at them from.