The problem: I don’t understand how Blender calculates the normals of a vertex.
I attached a picture (look at it now :). It shows the normals calculated by Blender. The red lines were added by me; these are the normals I would expect Blender to calculate (the thick line for one vertex, interpolated; the thin lines for two overlapping vertices).
Questions:
Is it possible to influence the calculation of the normals? How? If not…
Is it possible to edit the normals by myself? How?
I think the best way to see normals is to set your Transform Orientation to Normal. I am not sure what those blue lines are, but the thick red line is pretty close to what it actually would be.
In LightWave, for instance, some operations can change how this normal is calculated for translation, but this is limited to a specific tool. In Blender, Alt+S does try to make an “average normal” calculation for scaling. The Solidify Selection script does a better job of the classic inset you see with a bevel operation in other packages. LightWave has a bevel tool called Multishift that has about 4-5 settings for the normal.
But in Blender, all standard scale, rotate, and translate operations use the same normal setting you get with that particular Transform Orientation, depending on what is selected, of course.
AFAIK, you cannot customise the normal calculation in Blender except by scripting, and I don’t know anything about Python scripting in Blender.
What do you need access to the normal information for?
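For what it’s worth, the normals Blender computes can at least be read from a script. Here is a minimal sketch, assuming a recent bpy-based build (the old 2.4x API uses a different Blender.Mesh module, so take the names as illustrative):

```python
# Minimal sketch: print the vertex normals Blender has computed for the
# active mesh object. Assumes a recent bpy-based build.
import bpy

obj = bpy.context.active_object   # assumes a mesh object is active
mesh = obj.data

for v in mesh.vertices:
    # v.normal is the averaged normal of the faces that touch this vertex,
    # expressed in the object's local coordinates
    print(v.index, v.normal)
```

Note that these per-vertex normals are derived from the face data, so Blender recomputes them whenever the mesh changes; edits written directly to them will not stick.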
I’m having a little trouble making out the geometry based on that screenshot since it’s so close in. It might help if you could provide a few screens from further out, and at different angles.
The blue lines are the actual surface and vertex normals as calculated by Blender. You can enable their display in the Mesh Tools More box.
Cool. Thanks. I was looking for that but could not find it. I figured that was what they were but forgot where to turn them on. I am having trouble with the image too, then. The displayed normals are consistent with the Transform Orientation, so amend my statement above. So two votes for a better image. :yes:
Have you tried Recalculate Normals Outside (Ctrl+N) on the selected geometry? I am not really up on DirectX, so anything else going on there might be related to issues with the plugin or script that exports. Visually the normals look normal to me, so I am not sure where the problem might be coming from, because I have not really done that before.
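For a scripted pipeline, the equivalent of that step looks roughly like this, assuming a recent bpy-based build (operator names differ in older releases):

```python
# Rough script equivalent of Ctrl+N (Recalculate Normals Outside),
# assuming a recent bpy build; older releases expose a different API.
import bpy

bpy.ops.object.mode_set(mode='EDIT')                 # recalculation works on selected geometry in Edit Mode
bpy.ops.mesh.select_all(action='SELECT')             # operate on the whole mesh
bpy.ops.mesh.normals_make_consistent(inside=False)   # make normals point outward
bpy.ops.object.mode_set(mode='OBJECT')
```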
Yes, I tried recalculating them, inside and outside… If I use the inside recalculation, Blender just flips them 180 degrees.
The current (.x) export works fine. I viewed the meshes with the MS DirectX viewer with normals displayed, and they looked exactly like in Blender, so I’m sure the export works fine.
Visually the normals look normal to me
My problem is: the red lines in the picture were painted by ->myself<-. This is, of course, the way they should look.
Recently I found out that the normals change strangely if I rotate the object. Try it. I always thought the only way to calculate a normal for a vertex is to cross the adjacent edges of the vertex, but it seems Blender also uses the orientation of the object. I’ll provide more pictures tomorrow. I think I will also post the blend file.
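For reference, this is the textbook calculation being described: each face normal is the cross product of two of its edges, and a vertex normal is the normalized sum of the normals of the faces that share the vertex. The sketch below shows the general technique in plain Python with mathutils, not Blender’s internal code, and it works purely in local (ObData) coordinates:

```python
# Sketch of the textbook vertex-normal calculation: a face normal is the
# cross product of two of its edges, and a vertex normal is the normalized
# sum of the normals of the faces sharing that vertex. General technique,
# not Blender's internal code.
from mathutils import Vector  # ships with Blender; any 3D vector type works

def face_normal(a, b, c):
    """Normal of triangle (a, b, c) from the cross product of two edges."""
    return (b - a).cross(c - a).normalized()

def vertex_normal(vertex_index, vertices, triangles):
    """Average the normals of all triangles touching the given vertex."""
    n = Vector((0.0, 0.0, 0.0))
    for tri in triangles:
        if vertex_index in tri:
            a, b, c = (vertices[i] for i in tri)
            n += face_normal(a, b, c)
    return n.normalized()

# Example: two triangles meeting along the X axis
verts = [Vector((0, 0, 0)), Vector((1, 0, 0)), Vector((0, 1, 0)), Vector((0, 0, 1))]
tris = [(0, 1, 2), (0, 3, 1)]
print(vertex_normal(0, verts, tris))   # roughly (0, 0.707, 0.707)
```

Anything stored at the object level (rotation, non-uniform scale) is applied on top of these local coordinates, which can make the displayed normals disagree with what this calculation gives when the object carries a non-uniform scale.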
OK, got it. I look forward to seeing the pic; you have my curiosity. BTW, I was referring to how they look to me in Blender; they definitely look off in your picture, but that is not how they look here. I rotated the object (both in Object and Edit Mode) and all the normals look fine. Nothing changes for me. Interested to see what you have going. I am probably just not looking for the same thing you are.
Hmm, just tested something. If I create a new scene, the normals of a sphere are displayed correctly. If I add a sphere in the conveyor belt scene, they are not… So obviously I screwed up some values I didn’t understand :rolleyes:.
Time to find out which values ^^
I made several models, but they all have correct normals. So the problem must lie within this mesh. Attached…
I really don’t know what it means, but in Object Mode just press Ctrl+A and select “Scale and Rotate to ObData”. After that the normals should be OK.
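The likely reason this helps: normals transform with the inverse transpose of the object matrix rather than with the matrix itself, so a non-uniform scale left on the object skews the directions you see. A small numerical sketch with illustrative values (mathutils as shipped with recent Blender builds):

```python
# Why a leftover object scale skews normals: normals transform with the
# inverse transpose of the object matrix, not with the matrix itself.
# Illustrative values only.
from mathutils import Matrix, Vector

scale = Matrix.Diagonal((1.0, 1.0, 3.0))     # object scaled 3x along Z
n_local = Vector((0.0, 0.7071, 0.7071))      # a 45-degree local normal

# Transforming the normal like a point gives the wrong direction...
wrong = (scale @ n_local).normalized()

# ...the correct world-space normal uses the inverse transpose.
right = (scale.inverted().transposed() @ n_local).normalized()

print(wrong)   # leans toward Z: roughly (0, 0.32, 0.95)
print(right)   # leans away from Z: roughly (0, 0.95, 0.32)
```

“Scale and Rotate to ObData” bakes the object transform into the mesh data itself, so the stored normals and the displayed ones agree again.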
I’m also curious about this; maybe it leads to a bug report. I will try to replay the modelling of this mesh with an eye on the normals. Maybe I’ll find the reason for this behaviour.
I also have a question regarding vertex normals. It seems to me that Blender always recalculates vertex normals when importing a mesh. This is no problem when you are working with quad models created in standard tools (Blender, Max, Maya, etc.).
You will get into problems, though, when working with tessellated (triangle) data originally created by CAD systems. The shading of these geometries relies heavily on vertex normals.
I never had any luck importing these models into Blender for UVing or rendering.
Does anyone know why Blender always overwrites imported vertex normals, or if there is a way to change that?
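In case it helps: newer Blender builds expose “custom split normals” through the Python API, which would let an importer write the CAD normals back after loading the mesh. A minimal sketch, assuming such a build; imported_normals is a hypothetical list the importer would fill:

```python
# Sketch of restoring imported per-vertex normals, assuming a Blender build
# whose Python API exposes custom split normals. "imported_normals" is a
# hypothetical list the importer would fill; placeholder data used here.
import bpy

mesh = bpy.context.active_object.data
imported_normals = [(0.0, 0.0, 1.0)] * len(mesh.vertices)  # placeholder data

# Older builds need auto smooth enabled for custom normals to take effect.
if hasattr(mesh, "use_auto_smooth"):
    mesh.use_auto_smooth = True

mesh.normals_split_custom_set_from_vertices(imported_normals)
```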