Vertex normals

Hi,

Can anybody tell me what the relation is between vertex normals and face normals?

Vertex normals seem to be calculated automatically. Is that correct?

My actual problem is this:
I have a 3D model, defined by all its triangles. Each triangle is defined by 3 vertices (3 xyz coordinates), and the vertex normal at each of these vertices.
The vertex normals allow for smooth rendering of round surfaces, even with relatively few triangles.

I’m trying to import this model into Blender with a Python script. The model gets imported nicely, but the surfaces that should be smooth aren’t.

So I wrote another script which retrieves the mesh, re-reads the triangle file and sets the vertex normals.

If I run this second script and then click on ‘set smooth’, the surfaces are rendered correctly.

However, if I e.g. go in and back out of edit mode, the vertex normals seem to be recalculated and my triangles are rendered flat again.

Is there a way to keep these vertex normals from being recalculated?

Thanks for advice!

Joey

I think the problem is that none of your vertices (between faces) are joined after you load the mesh - try moving any of the vertices and they don’t stay together. You don’t need the second script. After loading with the first script, press remove doubles and this will join up all your vertices; then set smooth works. You could probably remove doubles automatically in your script too, but I don’t know what the Python code is.
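Something along these lines might do it - a rough, untested sketch against the current bpy/bmesh API (the Python API in the Blender version discussed here is older and different):

```python
import bpy
import bmesh

# assumes the freshly imported mesh object is the active object
obj = bpy.context.active_object

bm = bmesh.new()
bm.from_mesh(obj.data)

# merge vertices that lie within 'dist' of each other ("remove doubles"),
# so neighbouring triangles share vertices and set smooth can blend across them
bmesh.ops.remove_doubles(bm, verts=bm.verts, dist=0.0001)

bm.to_mesh(obj.data)
bm.free()
obj.data.update()
```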

Well indeed, the ‘set smooth’ works after I delete the extra vertices.

But it doesn’t look good at all…

Are vertex normals always calculated automatically from surface normals? Or is there a way to set them and not have them recalculated?

I’m not sure what you mean when you say it doesn’t look good because I’m not entirely sure what it’s supposed to be :). It sort of looks like a clam shell. If you turn on subdivisions level 2, it looks a bit more like a shell.

If it’s because it’s smoothing the perimeter of the shell and you don’t want that, you can set the top faces to be solid. Or, if you have subdiv on, you can set a crease factor for certain edges - that’s a new feature in 2.34.

As for vertex normals, they are usually calculated by averaging the adjacent face normals, because that is how they work best. Likewise, a face normal is defined as the vector perpendicular to its face. I think you can change them (since there are Python functions to let you), but you shouldn’t, because normals (vertex or face) are used for lighting calculations, so if you change them the lighting gets messed up. Also, I think that Blender quite rightly recalculates them during editing. In fact, if you remove the Python code in your script that sets the vertex normals, Blender still shades your model the same.
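Roughly, that averaging looks like this (plain Python, no Blender API - the function name and data layout are just for illustration):

```python
def vertex_normals(vertices, triangles):
    """vertices: list of (x, y, z) tuples; triangles: list of (i0, i1, i2) index tuples."""
    def sub(a, b):
        return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

    def cross(a, b):
        return (a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0])

    def normalize(v):
        length = (v[0] ** 2 + v[1] ** 2 + v[2] ** 2) ** 0.5
        return (v[0] / length, v[1] / length, v[2] / length) if length > 0 else (0.0, 0.0, 0.0)

    normals = [(0.0, 0.0, 0.0)] * len(vertices)
    for i0, i1, i2 in triangles:
        # face normal: perpendicular to the triangle (cross product of two edge vectors)
        n = normalize(cross(sub(vertices[i1], vertices[i0]),
                            sub(vertices[i2], vertices[i0])))
        for i in (i0, i1, i2):
            # accumulate the face normal on each of the triangle's vertices
            normals[i] = (normals[i][0] + n[0], normals[i][1] + n[1], normals[i][2] + n[2])

    # renormalize the accumulated sums to get unit vertex normals
    return [normalize(n) for n in normals]
```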

Surface smoothness/lighting is determined by how much interpolation happens between normals - thus you never change the normals themselves, you just change the interpolation. You do this by the methods I described, i.e. set solid, set smooth, or geometrically with subdiv crease values.
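From a script, set smooth is just a per-face flag; with the current bpy API (not the API of the Blender version in this thread) it would be something like this untested sketch:

```python
import bpy

mesh = bpy.context.active_object.data  # assumes a mesh object is active
for poly in mesh.polygons:
    poly.use_smooth = True             # same effect as the 'set smooth' button
mesh.update()
```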

If you tell me what you are trying to model, I can give you better hints as to how to make it look like what you want, but I assure you, adjusting vertex normals manually is never done, and for that reason there isn’t an easy way to do it in Blender or any other 3D app. Adjusting them might be useful in gaming if you write your own game engine, but not in 3D apps. Even in games, you would still do the same as Blender.

Check here for more info on vertex normals:

http://www.flipcode.com/tutorials/tut_vnormals.shtml

First of all, thanks for the replies!

It’s not a clam shell though. It’s the base plate of a Kuka 361 robot. It’s supposed to be a cylinder, not a clam shell :slight_smile:

A bit more background:
I’m a PhD student doing research in robotics, and I would like to use Blender e.g. for visualisation of my simulation results, before I try everything on the real robot.

Until now, I have been doing this visualisation in my own OpenGL software, but that is too cumbersome (e.g. I have to recompile to add objects).

I have a VRML model of the robot I use, which renders just fine in a VRML viewer. In this VRML file, the model is defined by triangles, quads and polygons with more than 4 vertices, and by the normals at each of the vertices (so that e.g. the side of a cylinder gets rendered smooth, even if it consists of only a few faces).

I don’t succeed in opening this VRML file in Blender, as far as I know because it contains polygons with more than 4 vertices.
I don’t know of a way to triangulate VRML files.

However, a while ago I converted the VRML file to a POV-Ray file with vrml2pov. In that POV-Ray file, the mesh is triangulated (so basically you have triangle vertices and the normal at each vertex).
This mapped perfectly onto OpenGL, so I could render the model nicely.

But now I would like to use Blender, so I wrote the Python script, which does the same as my OpenGL code: defining triangles and defining the vertex normals. But this is clearly not the way to go.
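Roughly, that style of import looks like this (a sketch against the current bpy API with made-up placeholder data, not the actual script used here; every triangle brings its own three vertices, OpenGL-style):

```python
import bpy

# hypothetical triangle data, as it might come out of the POV-Ray/VRML file:
# each triangle carries its own copies of the corner coordinates
triangles = [
    ((0, 0, 0), (1, 0, 0), (1, 1, 0)),
    ((0, 0, 0), (1, 1, 0), (0, 1, 0)),
]

verts = []
faces = []
for tri in triangles:
    base = len(verts)
    verts.extend(tri)                         # vertices get duplicated per triangle,
    faces.append((base, base + 1, base + 2))  # which is why 'remove doubles' is needed afterwards

mesh = bpy.data.meshes.new("robot_part")
mesh.from_pydata(verts, [], faces)
mesh.update()

obj = bpy.data.objects.new("robot_part", mesh)
bpy.context.collection.objects.link(obj)
```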

Maybe i should just stick with the flat shaded model…

Unless you’d have an idea…

Thanks and greetings,

Joey

Oh, that was going to be my second guess :wink:

stuuuudeeeent :smiley:

Yeah, I would advise sticking with Blender.

I did the conversion to POV-Ray, but there’s not much point, as POV-Ray’s more of a render format, so few converters convert from it.

That’s right, Blender doesn’t support ngons, so it won’t load. I wish it would load them and then autotessellate them or something.
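For convex faces (which VRML IndexedFaceSets usually are), a simple triangle fan would be enough - a rough sketch, function name made up:

```python
def fan_triangulate(polygon_indices):
    """Split one convex n-sided face (a list of vertex indices, in order) into triangles."""
    first = polygon_indices[0]
    return [(first, polygon_indices[k], polygon_indices[k + 1])
            for k in range(1, len(polygon_indices) - 1)]

# e.g. a pentagon (0, 1, 2, 3, 4) becomes [(0, 1, 2), (0, 2, 3), (0, 3, 4)]
```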

I sent an email to you (I hope) that contains a Blender scene of the robot. I discovered that Maya has a wrl2ma tool that converts VRML2 files into Maya ASCII, and because Maya supports ngons, it was able to load it. I then tessellated this, exported it as OBJ and imported it into Blender. Don’t use the file I sent yet; I have cleaned up the tessellation and set certain parts smooth, so I’ll send that later.