I just want to chime in and say I also really need normal editing. I can use the transfer normals add-on, but it still won’t let me export those normals in any way. Cutting environments into chunks is a very common way to deal with large environments, ensuring only certain areas are loaded at a time, which really helps keep a game running well; but if I do it without editing normals, it causes lighting seams in the mesh. The same happens with modular models like walls, sci-fi hallways, etc.
On top of that, it’s also important for foliage, grass, and the fluffier hair styles used in certain visual styles.
FYI, Split Normals Stage I has been in master since yesterday. It adds an option to auto-compute split normals (normals per vertex per face) in Blender and use them in the 3D View and renderers (BI and Cycles), which basically means you no longer need the Edge Split modifier in most cases (exporting them was already implemented in some exporters since 2.69).
Some level of normal editing (as well as no longer losing normals when importing from other formats) is the next step (probably through modifiers at first). The main difficulty here is that we need some kind of “diff” storage for custom normals, so that they remain meaningful when the geometry is deformed.
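For anyone unclear on the terminology: a split normal is a normal stored per face corner (“loop”) rather than per vertex, so a vertex on a sharp edge can shade differently on each side. Here is a rough, self-contained sketch of the idea in Python; the function names and data layout are illustrative only, not Blender’s actual implementation:

```python
# Minimal sketch of "split normals": one normal per face corner (loop),
# so a vertex can carry different normals on different faces.

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def normalize(v):
    l = sum(x * x for x in v) ** 0.5
    return tuple(x / l for x in v)

def face_normal(verts, face):
    a, b, c = (verts[i] for i in face[:3])
    return normalize(cross(sub(b, a), sub(c, a)))

def split_normals(verts, faces, smooth):
    """Return per-face lists of loop normals.

    smooth[i] is True if face i is shaded smooth; smooth corners get the
    averaged normal of the smooth faces around the vertex, while flat
    corners keep the face normal (this is the "split").
    """
    fnos = [face_normal(verts, f) for f in faces]
    # Accumulate normals of smooth faces per vertex.
    acc = {i: (0.0, 0.0, 0.0) for i in range(len(verts))}
    for f, n, s in zip(faces, fnos, smooth):
        if s:
            for vi in f:
                acc[vi] = tuple(a + b for a, b in zip(acc[vi], n))
    loops = []
    for f, n, s in zip(faces, fnos, smooth):
        loops.append([normalize(acc[vi]) if s and any(acc[vi]) else n
                      for vi in f])
    return loops
```

On a flat quad split into two smooth triangles, every loop normal comes out as (0, 0, 1); mark one triangle flat and its corners keep the face normal instead, even at shared vertices.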
Hey mont29. So far I like the edge split additions and the import work, and I can get mostly good results using the transfer normals add-on someone made a while ago.
To me, the biggest thing personally holding me back when it comes to custom normals is actually exporting models with the custom normal information. If you plan to use Blender to make models for any external programs or game engines, this feature can be important in certain cases. Using custom normals in internal renders simply isn’t as big a concern, I think; of the many people asking for custom normal features, most are working on games rather than doing renders with BI or Cycles.
So what is the status of being able to edit vertex normals in Blender? FYI, I use Blender in professional 3D game development, and I cannot use the Edge Split modifier on my models because it is inefficient and adds a lot more edges and verts to the final model. I had to obtain a licensed copy of Maya just so I could import my Blender FBX, set the normals to face, specify hard and soft edges, and re-export the model to get a lean, optimized, correct-looking model. I can’t tell you how horrible it is to have to rely on an expensive piece of software like Maya just to do something that Blender should, in my opinion, be able to do. I am praying that one day Blender will be able to compete in this area, but I hear so few complaints about the problem, as if there were no problem at all, that I don’t have a lot of hope it will be fixed.
I too would like to know what the status of this is. It’s the last big hurdle I have to figure out before I can use Blender for our games pipeline at my indie game dev studio. Anything we can do to help? Pizza, Beer? ^.^
The real question: When will the loop normals exist at all times regardless of the enabling of automatic edge splitting or not? If an edge is not split the loop normals should still exist, they would just be the same. I have a number of tools that use bmesh that would like to have access to loop normals in addition to face normals and vertex normals. Face normals and vertex normals always exist so why not loop normals? I hope this hack of requiring “Auto Smooth” to use loop normals is not the permanent strategy (loop normals appear to not be used at all, see below).
Just trying to understand what the master plan is here. In the current build I am playing with, loop normals in the object data always show (0.0, 0.0, 0.0) regardless of splitting or not. It’s essentially the edge split modifier, without using the modifier. The loop normals are not even used. I would assume this is why they are calling it “split normals” and not “loop normals” yet?
I should note edge splitting is not my biggest concern. Achieving this simple shading setup is:
I) Loop normals only exist when Auto Smooth is ON. This won’t change in the future (though the option might be renamed once we can do more with them), since we do not want loop normals by default: HD meshes do not need them, and they add quite a bit of data and processing.
II) It has nothing to do with edge splitting. Edge splitting gives each face its own copy of a shared edge; historically, that was the only way to handle sharp edges.
III) Loop normals, when enabled through Auto Smooth, are always produced for all face corners (so-called ‘loops’ in Blender slang), and indeed have the same values across all faces sharing a vertex when that vertex is smooth.
IV) Future plans are to add a ‘diff’ layer to split normals, whose values would be editable (at the very least indirectly through modifiers, later perhaps directly in Edit mode). The idea of storing custom split normals as a diff against the automatically generated ones, instead of simply replacing them, is that they would remain (generally) consistent even through deforming geometry modifiers (armature…) or shape keys.
Point IV would also finally allow importing those split normals from formats that support them (we can already export them). Unfortunately, FBX is eating nearly all of my dev time currently, so I can’t give an ETA for that Stage II of split normals.
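The ‘diff’ idea from point IV can be sketched in plain Python. This is purely illustrative (the names are made up, and the actual design may store the diff differently): record each custom normal as an axis/angle rotation away from the auto-computed normal, and re-apply that rotation whenever the auto normal is recomputed after a deformation.

```python
# Hypothetical sketch: store a custom normal as a rotation ("diff")
# relative to the auto-computed normal, so it survives recomputation.
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def make_diff(auto_n, custom_n):
    """Axis/angle rotation taking unit vector auto_n onto custom_n."""
    axis = cross(auto_n, custom_n)
    s = sum(x * x for x in axis) ** 0.5
    if s < 1e-8:
        return (1.0, 0.0, 0.0), 0.0   # parallel: identity diff
    angle = math.atan2(s, dot(auto_n, custom_n))
    return tuple(x / s for x in axis), angle

def apply_diff(auto_n, diff):
    """Rodrigues rotation of auto_n by the stored axis/angle."""
    axis, angle = diff
    c, s = math.cos(angle), math.sin(angle)
    k = cross(axis, auto_n)
    d = dot(axis, auto_n)
    return tuple(auto_n[i] * c + k[i] * s + axis[i] * d * (1.0 - c)
                 for i in range(3))
```

The round trip is exact: `apply_diff(auto, make_diff(auto, custom))` reproduces `custom`, and when the mesh deforms, the same diff applied to the new auto normal gives a custom normal that has “followed” the surface.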
Just to update on the progress of the modifier: I’ve added vertex group support and also changed the search method for finding the nearest vertices; it’s now much faster and works a lot better for real-time tweaking.
Here’s a quick screenshot of a normal-edited nose for an anime-type effect (using vertex groups to isolate the nose):
Building from source is pretty painless on Windows in my experience, using CMake and Visual Studio Express. I don’t have experience on other platforms; however, instructions for building on all platforms are available here: http://wiki.blender.org/index.php/Dev:Doc/Building_Blender
The diff files tell git what changed / what to add.
To skip the hard part, here is a pre-built version for Blender 2.71 Windows 64 bit (note the 2.71, you may need to import your 2.70 preferences):
I have been busy lately, so I haven’t worked on this much. I did a quick test and everything seems to work as I expected.
When I try to export, the modifier’s output is discarded and the normals are recalculated to their original vectors. Any suggestions on how to export with the changes?
Having two object fields is confusing, and the confusion is compounded by the fact that each enabled option relates to the object reference on the opposite side.
I think it would be more useful to make the ellipsoid option relate to the modifier object’s bounding box rather than the modified object’s.
I’d like to see/experiment with a version where the split normals can be transferred from the modifier object.
I’d need an explanation of how the vertex group weighting works before I could comment on that, but I personally found the slerping in the version I posted very interesting for naturally flowing normals.
Yep, the UI of the modifier is far from final and needs more work (as does pretty much everything in that patch).
Using the target’s bounding box dimensions can indeed be more useful… I guess with objects like empties, using their scale instead would work (in any case, real dimensions are irrelevant here; only proportions matter).
Weighting with vgroups does use slerp interpolation from the custom normal to the default (auto) normal (BLI’s interp_v3_v3v3_slerp()), so it should behave as ‘naturally’ as possible, I think.
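For anyone curious what that means in practice, here is a tiny stand-alone Python sketch of the same slerp idea (not Blender’s actual C code): interpolating along the arc between two unit normals keeps the result unit-length, unlike a plain linear blend, which is what makes the transition feel natural.

```python
# Spherical linear interpolation between two unit vectors.
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def slerp(a, b, t):
    """Interpolate from unit vector a to unit vector b, t in [0, 1]."""
    d = max(-1.0, min(1.0, dot(a, b)))
    ang = math.acos(d)
    if ang < 1e-8:               # (nearly) identical: nothing to blend
        return a
    s = math.sin(ang)
    fa = math.sin((1.0 - t) * ang) / s
    fb = math.sin(t * ang) / s
    return tuple(fa * x + fb * y for x, y in zip(a, b))
```

So a vertex-group weight of 0.5 blends halfway along the arc; for example, `slerp((1, 0, 0), (0, 1, 0), 0.5)` gives roughly (0.707, 0.707, 0), which is still a unit vector.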
As for ‘transferring’ normals, if I understand it correctly, I already have it in mind, but it’s slightly more complicated than the current two modes, so I would keep it as ‘candy’ for once the main background work is finalized (it would involve first finding the closest target vertex, as in the current Object mode, and then the ‘best matching’ polygon for a given source loop, perhaps based on poly normals?).
Anyway, thanks for that first feedback! I plan to get a first quick review (at a general level) from Campbell asap, to make sure there are no big issues with the current code/design, and then I’ll publish it as a temp branch on Blender’s git repo, so that everyone can follow progress through the review/integration process.