All you never wanted to know about normals.

This will be a short, but detailed tutorial about normals, what they are and what they do.

For someone with a 3D programming background or a math major, surface normals are clearly understood. However, many 3D programs in the past hid this detail from their users, and it cannot be assumed that everyone using Blender knows what they are. Since they are so important to proper modeling and cause so many headaches, I thought it would be a good idea to write this.

I’ll skip the math, since Blender does the calculations internally. This tutorial will just explain what normals are, what they do, and why we should care. Note that when I use the term “plane”, I’m talking about the mathematical construct, not the Blender object.

A surface normal is a vector that is perpendicular to a plane as shown:

Each face in Blender has a plane equation attached. The normal vector of this plane describes the “direction” the face points. This is used to determine which faces are visible and how much lighting each face receives.
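To make this concrete, here is a minimal standalone sketch (plain Python for illustration, not Blender’s internal code) of how a face normal can be computed from three of the face’s vertices with a cross product:

```python
# Sketch: a face normal from three vertices, via the cross product.
# Plain Python for illustration -- not Blender's internal code.

def sub(a, b):
    """Vector difference a - b."""
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def cross(a, b):
    """Cross product: perpendicular to both a and b."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def normalize(v):
    """Scale a vector to unit length."""
    length = (v[0] ** 2 + v[1] ** 2 + v[2] ** 2) ** 0.5
    return (v[0] / length, v[1] / length, v[2] / length)

def face_normal(v0, v1, v2):
    # Two edge vectors leaving v0 span the face's plane;
    # their cross product is perpendicular to that plane.
    return normalize(cross(sub(v1, v0), sub(v2, v0)))

# A triangle lying in the XY plane has a normal along +Z.
print(face_normal((0, 0, 0), (1, 0, 0), (0, 1, 0)))  # (0.0, 0.0, 1.0)
```

Notice that swapping the vertex order flips the normal, which is exactly how a face ends up “pointing the wrong way.”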

To determine visibility and lighting, Blender calculates the angle between the face’s normal and the direction to the viewpoint (camera) or light.

If the angle is less than 90°, the face is facing the viewer and possibly visible. If the angle equals 90°, the face is being viewed on edge. If the angle is greater than 90°, the face is facing away from the viewer and not visible. This information is used internally by Blender’s 3D view and not typically needed by the user.
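The 90° test above boils down to the sign of a dot product. A sketch of the idea (hypothetical helper names, plain Python):

```python
def dot(a, b):
    """Dot product; its sign encodes the angle test."""
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def facing_viewer(normal, to_viewer):
    # dot > 0  <=>  angle < 90 degrees  (possibly visible)
    # dot == 0 <=>  angle = 90 degrees  (viewed edge-on)
    # dot < 0  <=>  angle > 90 degrees  (facing away)
    return dot(normal, to_viewer) > 0

print(facing_viewer((0, 0, 1), (0, 0, 1)))   # True  -- facing the viewer
print(facing_viewer((0, 0, -1), (0, 0, 1)))  # False -- facing away
```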

However, where normals tend to mess people up is with the lighting calculations. With lighting, the angle between the light and the normal is calculated just as with the viewpoint. How this determines lighting is beyond the scope of this tutorial, but there is one key point to keep in mind: if the angle is greater than 90°, the face is pointing away from the light and is not lit. When your model has a face with weird lighting, it is usually because the normal is pointing the wrong way. Thus, the angle between it and the light is greater than 90° and the face is not lit. This happens due to the way a plane is described (math stuff). Blender typically does a good job of guessing which way a face should point, but it is extremely difficult to get right without knowing the mathematical equation of the object. As such, errors tend to crop up.
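One common model of this effect is Lambertian diffuse shading, where a negative dot product (angle greater than 90°) clamps to zero, meaning unlit. This is a sketch of the general technique with unit vectors assumed, not necessarily Blender’s exact shading code:

```python
def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def diffuse(normal, to_light):
    # Lambertian diffuse: brightness falls off as the angle to the light grows;
    # past 90 degrees the dot product goes negative and clamps to 0.0 (unlit).
    # Both vectors are assumed to be unit length.
    return max(0.0, dot(normal, to_light))

print(diffuse((0, 0, 1), (0, 0, 1)))   # 1.0 -- light hits head-on
print(diffuse((0, 0, -1), (0, 0, 1)))  # 0.0 -- flipped normal, face goes dark
```

That last line is the “weird lighting” case: flip the normal and an otherwise well-lit face renders black.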

Here is an example with a smoothed sphere:

To view the normals, go into edit mode and click the “Draw Normals” in the “Mesh Tools More” panel:

Doing so will give you this:
The short, blue lines are the normals. Notice how they all point away from the sphere.

Now, here is a sphere with bad normals:
This is what you’re probably seeing and why you’re pulling your hair out.

Viewing the normals shows the problem:
Notice how the normals in the incorrectly shaded areas are pointing in? This is the problem. The lighting calculations are wrong in these areas and you get a weirdly lit object.

How do you correct this? Simple! Go into edit mode, press ‘A’ to select all of the vertices, and then press Ctrl+‘N’. This causes Blender to recalculate the normals so they all point out. This SHOULD fix the problem. If it doesn’t, there are bigger issues with your model. Most likely you have an internal face (more than two faces sharing the same edge), which throws off the normal calculation so it will never look correct. The other possibility is that your model has double vertices; removing doubles will fix that.
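As a rough illustration of what “recalculate so they all point out” means: for a simple convex object, you could flip any face normal that points toward the object’s center. This is only a sketch of the idea with hypothetical names; Blender’s actual recalculation is more sophisticated (it reasons about face adjacency, which is exactly why internal faces and doubled vertices break it):

```python
def sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def recalc_outside(face_centers, face_normals, object_center):
    """Flip any normal pointing toward the object's center.

    Only valid for convex meshes -- a sketch of the idea, not Blender's method.
    """
    fixed = []
    for center, n in zip(face_centers, face_normals):
        outward = sub(center, object_center)
        if dot(n, outward) < 0:           # normal points inward...
            n = (-n[0], -n[1], -n[2])     # ...so flip it
        fixed.append(n)
    return fixed

# One cube face with a flipped normal gets corrected:
print(recalc_outside([(0, 0, 1)], [(0, 0, -1)], (0, 0, 0)))  # [(0, 0, 1)]
```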

If you were paying attention, you noticed a “Draw VNormals” button in the “Mesh Tools More” panel. Vertex normals are more detail than this tutorial needs, but they are what Blender actually uses to calculate lighting when you turn on smoothing. They are calculated by averaging the surrounding surface normals and are perpendicular to the tangent plane of the surface. It is a good idea to view both, and if your model will be viewed with smoothing on, you can just look at these.
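The averaging can be sketched like this: sum the normals of the faces sharing the vertex, then renormalize (again plain Python for illustration):

```python
def vertex_normal(adjacent_face_normals):
    """Average the normals of the faces sharing a vertex, then renormalize."""
    sx = sy = sz = 0.0
    for nx, ny, nz in adjacent_face_normals:
        sx += nx
        sy += ny
        sz += nz
    length = (sx * sx + sy * sy + sz * sz) ** 0.5
    return (sx / length, sy / length, sz / length)

# A vertex where two faces meet at 90 degrees gets a normal halfway between:
print(vertex_normal([(1, 0, 0), (0, 0, 1)]))  # roughly (0.707, 0.0, 0.707)
```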

Are there ever times when you’ll want the normals pointing in? Absolutely. If you want to view the inside of an object by placing the camera inside it, you’ll need to recalculate the normals so they face in (Ctrl+Shift+‘N’). Also, if you are modeling a hallway or room for the game engine, the outside of the model will never be visible, so you’ll want the normals facing in.

Hopefully this helps some people. This is a little sparse, but should be sufficient. If you have any questions, please ask.

Bad normals are G-Men!

to see UV mapping you must have the normals pointing out, or you won’t see the mapping

also, if some faces are going inside, it is called a non-manifold mesh

and you did not say anything about vertex normals!

also, sometimes when you extrude things around you may have to Ctrl+A the object to get the normals properly perpendicular to the surface. it does not happen all the time, but when it does you have to do it!

when smoothing does not render properly, you may end up having to use the Edge Split modifier to get faces with a good outward normal that render correctly

happy blendering

Thank you! I could not remember the correct term

I did, but I did not go into any amount of detail. To fully explain them and how they work, I’d have to go into detail about how lighting is computed, how the vertex normals are interpolated during scan conversion (then I’d have to go into scan conversion…). :eek: It would get messy. If there is a true need for this much detail, I can do this. I just don’t see the need. :no:

I guess the simple answer is that a vertex normal is similar to a surface normal, except it points away from a vertex and is defined by the tangent plane of the surface. It is used in interpolated shading to smooth the polygon.
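To give a taste of “interpolated shading” without going into scan conversion: between two vertices, the normal is blended and renormalized, so lighting varies smoothly across the polygon. A sketch with hypothetical names:

```python
def lerp_normal(n0, n1, t):
    """Blend two vertex normals at parameter t (0..1) and renormalize --
    the essence of smooth (interpolated) shading across a polygon."""
    x = n0[0] * (1 - t) + n1[0] * t
    y = n0[1] * (1 - t) + n1[1] * t
    z = n0[2] * (1 - t) + n1[2] * t
    length = (x * x + y * y + z * z) ** 0.5
    return (x / length, y / length, z / length)

# Halfway between two vertex normals 90 degrees apart:
print(lerp_normal((1, 0, 0), (0, 0, 1), 0.5))  # roughly (0.707, 0.0, 0.707)
```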

Good to know.

hey thank you for the good tutorial :slight_smile:

this also may help people

and it will explain why smoothed faces meeting at a 90° angle do not look very good,
so you must use the Edge Split modifier

good job :slight_smile: