Calculating normals

Hi there,

The problem: I don’t understand how Blender calculates the normals of a vertex.
I attached a picture (look at it now :). You see the normals calculated by Blender. The red lines were added by me; these are the normals I would expect Blender to calculate. (The thick line is for one interpolated vertex, the thin lines are for two overlapping vertices.)

Questions:

  • Is it possible to influence the calculation of the normals? How? If not…
  • Is it possible to edit the normals myself? How?

Greetings,
Bernhard

Attachments


I think the best way to see normals is to set your Transform Orientation to Normal. I am not sure what those blue lines are, but the thick red line is pretty close to what it actually would be.

In LightWave, for instance, some operations can change how this normal is calculated for translation, but this is limited to a specific tool. In Blender, Alt+S does try to make an “average normal” calculation for scale. The Solidify Selection script does a better job of the classic inset you see with a bevel operation in other packages. LightWave has a bevel tool called Multishift that has about 4-5 settings for the normal.

But in Blender, all standard scale, rotate, and translate operations use the same normal setting you get with that particular Transform Orientation, depending on what is selected, of course.

AFAIK, you cannot customise the normal calculation in Blender except by scripting, and I don’t know anything about Python scripting in Blender.

What do you need access to the normal information for?

Just curious.

Tybalt,

I’m having a little trouble making out the geometry based on that screenshot since it’s so close in. It might help if you could provide a few screens from further out, and at different angles.

The blue lines are the actual surface and vertex normals as calculated by Blender. You can enable their display in the Mesh Tools More box.

Attachments


Cool, thanks. I was looking for that but could not find it. I figured that was what they were, but forgot where to turn them on. I am having trouble with the image too, then. The normals displayed are consistent with the Transform Orientation, so amend my statement above. So two votes for a better image. :yes:

I’ll provide you with a better screenshot tomorrow.

I want to export the model to a .x file for DirectX. The normals get exported too, but with these crappy normals the lighting isn’t calculated correctly :frowning:

Have you tried Recalculate Outside (Ctrl+N) on the selected geometry? I am not really up on DirectX, so anything else that is going on there might be related to issues with the plugin or script that exports. Visually the normals look normal to me, so I am not sure where the problem might be coming from, because I have not really done that before.

Yes, I tried recalculating them, inside and outside… If I use the inside recalculation, Blender just flips them 180 degrees.

The current .x export works fine. I viewed the meshes in the MS DirectX viewer with the normals displayed, and they looked exactly like in Blender, so I’m sure the export works fine.

Visually the normals look normal to me

My problem is: the red lines in the picture were painted by ->myself<-. This is - of course - the way they should look.

Recently I found out that the normals change strangely if I rotate the object. Try it. I always thought the only way to calculate a normal for a vertex is to cross the adjacent edges of the vertex (a rough sketch of what I mean follows below). But it seems Blender also uses the orientation of the object. I’ll provide more pictures tomorrow. I think I will also post the blend file.
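
Just to show what I mean, here is a minimal sketch in plain Python (not Blender’s actual code and not the Blender API) of the rule I had in mind: the normal of a vertex is the normalized sum of the normals of the faces around it, and each face normal comes from a cross product of two of its edges. Nothing about the object’s rotation enters into it.

import math

def face_normal(a, b, c):
    # normal of triangle (a, b, c) = cross(b - a, c - a), normalized
    u = (b[0] - a[0], b[1] - a[1], b[2] - a[2])
    v = (c[0] - a[0], c[1] - a[1], c[2] - a[2])
    n = (u[1] * v[2] - u[2] * v[1],
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0])
    length = math.sqrt(n[0] ** 2 + n[1] ** 2 + n[2] ** 2)
    return (n[0] / length, n[1] / length, n[2] / length)

def vertex_normal(adjacent_faces):
    # adjacent_faces: list of (a, b, c) triangles sharing the vertex
    total = [0.0, 0.0, 0.0]
    for a, b, c in adjacent_faces:
        n = face_normal(a, b, c)
        total = [total[i] + n[i] for i in range(3)]
    length = math.sqrt(sum(x * x for x in total))
    return tuple(x / length for x in total)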

OK, got it. Looking forward to seeing the pic; you have piqued my curiosity. BTW, I was referring to how they look to me in Blender; they definitely look off in your picture, but that is not how they look here. I rotated the object (both in Object and Edit Mode) and all the normals look fine. Nothing changes for me. Interested to see what you have going. I am probably just not looking for the same thing you are.

So, here are the promised screenshots.

  • normals1.jpg: shows the whole mesh.
  • correctnormals.jpg: this part of the conveyor belt is displayed with correct normals
  • wrongnormals.jpg: this part of the conveyor belt is displayed with INcorrect normals

If I rotate it, the normals slowly change their direction.

I would like to see a shot from another user displaying the normals. Best would be a sphere, not a box or something like that.

Attachments




Hmm, just tested something. If I create a new scene, the normals of a sphere are displayed correctly. If I add a sphere in the conveyor belt scene, they are not… So obviously I screwed up some values I didn’t understand :rolleyes:.
Time to find out which values ^^

I made several models, but they all have correct normals. So the problem must lie within this mesh. Attached…

Attachments

rollconveyorrounded_joined.blend (185 KB)

SOLUTION:

I really don’t know what it means, but in Object Mode just press Ctrl + A and select “Scale and Rotate to ObData”. After that the normals should be OK.

Found at http://www.jmonkeyengine.com/jmeforum/index.php?topic=7859.0
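
If I had to guess what is going on (this is only an assumption, I have not checked Blender’s or the exporter’s code): the object still carried an unapplied rotation/scale in its object matrix, and somewhere along the way the normals were transformed with that plain matrix. Normals have to be transformed with the inverse transpose of the matrix, otherwise a non-uniform scale skews them. Ctrl+A bakes the transform into the mesh data, so the problem disappears. A rough Python sketch of the difference, with made-up numbers:

import numpy as np

# Hypothetical object with an unapplied non-uniform scale on Z.
object_matrix = np.diag([1.0, 1.0, 3.0])
normal = np.array([0.0, 0.70710678, 0.70710678])   # a 45-degree surface normal

# Wrong: transform the normal with the plain object matrix (skews it).
wrong = object_matrix @ normal
wrong /= np.linalg.norm(wrong)

# Right: use the inverse transpose of the object matrix.
right = np.linalg.inv(object_matrix).T @ normal
right /= np.linalg.norm(right)

print(wrong)   # roughly (0, 0.32, 0.95): leans towards Z, lighting looks off
print(right)   # roughly (0, 0.95, 0.32): the stretched surface's true normal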

That’s interesting. I am curious as to what caused it. I have never seen anything like that. Glad you were able to figure it out anyway.

I’m also curious about this; maybe it leads to a bug report. I will try to replay the modelling of this mesh with an eye on the normals. Maybe I’ll find the reason for this behaviour.

Hi, just found this thread.

I also have a question regarding vertex normals. It seems to me that Blender always recalculates vertex normals when importing a mesh. This is no problem when you are working with quad models created in standard tools (Blender, Max, Maya, etc.).
You will get into problems, though, when working with tessellated (triangle) data originally created in CAD systems. The shading of these geometries relies heavily on the vertex normals stored in the file.
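
To make that concrete, here is a toy example in plain Python (not importer code, and the geometry is made up): two triangles meeting at a 90-degree fold. A CAD export would typically ship each corner with the normal of its own face, so the crease shades sharp; if the importer throws those normals away and recomputes vertex normals by averaging the adjacent face normals, the shared vertices get a 45-degree normal and the crease shades as if it were rounded.

import numpy as np

def face_normal(a, b, c):
    n = np.cross(b - a, c - a)
    return n / np.linalg.norm(n)

# Two triangles meeting at the edge x = 0: one lying in the XY plane,
# one standing in the YZ plane (a hard 90-degree crease).
flat = [np.array(p, dtype=float) for p in [(-1, 0, 0), (0, 0, 0), (0, 1, 0)]]
wall = [np.array(p, dtype=float) for p in [(0, 0, 0), (0, 0, 1), (0, 1, 0)]]

n_flat = face_normal(*flat)   # (0, 0, 1)
n_wall = face_normal(*wall)   # (-1, 0, 0)

# What a recalculating importer does at a vertex shared by both faces:
# average the face normals, which smooths the crease away.
recomputed = n_flat + n_wall
recomputed /= np.linalg.norm(recomputed)
print(recomputed)             # roughly (-0.71, 0, 0.71): the hard edge is gone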

I never had any luck importing these models into Blender for UVing or rendering.
Does anyone know why Blender always overwrites imported vertex normals? Or whether there is a way to change that?

Thanks

I haven’t the foggiest idea, but just as a tip, you’re usually better off asking a question in an entirely new thread. Cheers!