Divergence of Normal Vector Field

I’ve been working on a slope-dependent shader recently; early tests are looking good, except that I’d like to be able to assign a certain material to regions of the mesh that protrude locally (think small rocky areas on mountains or bark on a tree trunk). I was figuring the best way to assign this would be to check the divergence of the normal vectors; in regions with high divergence, perhaps determined by some threshold, this material could be assigned.

My question is this: is it at all possible to approximate the partial derivatives found in the definition of the divergence using math/vector nodes to accomplish this? In other words, can I represent the following (the z component would be neglected in this method):

$$\nabla \cdot \vec{N} \;=\; \frac{\partial N_x}{\partial x} + \frac{\partial N_y}{\partial y} + \frac{\partial N_z}{\partial z}$$

Using math/vector nodes? My main issue right now is that the shader appears to work on a face-by-face basis (meaning, it looks at the normal vector of a given face and uses that information to determine what color to apply); is there any way to look at the vector field as a whole (or at least other neighboring vectors) somehow?
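
For concreteness, the kind of discrete approximation I have in mind is a central difference over some small step $h$ (just to illustrate the idea; I don’t yet know how to express this with nodes):

$$\frac{\partial N_x}{\partial x} \approx \frac{N_x(x+h,\,y) - N_x(x-h,\,y)}{2h}, \qquad \frac{\partial N_y}{\partial y} \approx \frac{N_y(x,\,y+h) - N_y(x,\,y-h)}{2h}$$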

Hopefully this comes off with some level of clarity!

What field would you be using?

I mean, you can get the face’s normal, and then what?

There are other ways to get the height for a mountain as a function of Z.

happy bl

Only if you’re using the true normal. The regular normal (from the Texture Coordinate or Geometry node) will give you the smoothed normal (if it is smoothed, of course).

I’m not sure if I understand this correctly, but I don’t think it’s possible to extract “information density” from a pixel that is being shaded. We could do this in other renderers way back, for certain anti-aliasing tricks, but tracing tech was very young back then and I’m not sure the concept would even be applicable today.

If I understand it correctly, it would require knowing what the neighbors are. If you prerender and blur it, you have influence from neighboring pixels. And since we don’t have image filtering (such as blur), I’m guessing it can’t be done. Maybe OSL? Secrop might be able to pull one of his numerous magic tricks out of the OSL bag :smiley:

I’ve seen an attempt here at actually blurring arbitrary output (excluding the “blurred coordinates” trick, which would be useless here), but it looked extremely heavy and unusable in a production environment.

Actually, not quite! :thinking:
There’s a small limitation in Cycles’ OSL (and probably in other engines too) regarding derivatives. They work with almost every value present in the shader tree… except normals! One can get derivatives of the geometric normal, but not of the smoothed normals.

Of course, one can always trace() around P to get some values, but for that we already have the AO and Bevel nodes… unless @SSimpossible has some other equation for dealing with surface curvatures from point data… then yeah, he can go with the trace option. I think we can now get ‘shaded’ values from trace, but I’m not sure.

@SSimpossible, although that’s a neat formula in 2D, it’s a bit difficult to make it work when all your vectors are in 3D (that’s why there’s an equivalent for three and more dimensions)… What is x? Or y? We know they should lie on the surface plane… but where do x and y point? And are we working in screen space, world space, or tangent space? And even if we use x and y as the UV tangent and co-tangent, all we get is the tangent normal map (we can bake it; same result).

For working in tangent space you can read this BSE thread, where Rich Sedman uses a Python script to derive curvature maps from tangent normal maps, though this is very heavy to do inside Cycles. It might be useful, though. :wink:
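
Just to illustrate the principle (this is not Rich’s actual script, only a rough sketch, and the image name “normal_map” is an assumption), the curvature map is essentially the 2D divergence of the normal map’s x/y channels, which numpy can approximate with central differences:

```python
# rough sketch: curvature from a tangent normal map, computed as the 2D
# divergence of its x/y channels ("normal_map" is an assumed image name)
import bpy
import numpy as np

img = bpy.data.images["normal_map"]
w, h = img.size
px = np.array(img.pixels[:]).reshape(h, w, img.channels)

# remap the 0..1 channels back to -1..1 tangent-space components
# (green-channel sign conventions vary between baking setups)
nx = px[:, :, 0] * 2.0 - 1.0
ny = px[:, :, 1] * 2.0 - 1.0

# central differences via np.gradient: curvature ~ dNx/dx + dNy/dy
dny_dy, _ = np.gradient(ny)   # axis 0 runs along y (rows)
_, dnx_dx = np.gradient(nx)   # axis 1 runs along x (columns)
curvature = dnx_dx + dny_dy

# write the result back as a grayscale image for inspection
c = (curvature - curvature.min()) / (np.ptp(curvature) or 1.0)
out = bpy.data.images.new("curvature", width=w, height=h)
out.pixels = np.dstack([c, c, c, np.ones_like(c)]).ravel().tolist()
```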

Thanks for all your responses! I’m likely just not understanding fundamental concepts about how normals work in Blender (or maybe any 3D software). I guess I should have specified the space, for sure; I was talking about tangent space (but then again, I may be misunderstanding something). In tangent space, wouldn’t the divergence only have two components (assuming the mesh can then be treated as a surface whose height ‘z’ is defined by two independent coordinates, ‘x’ and ‘y’, which would then be locally defined)?

Basically, I want to set something up that acts kind of like the ‘Pointiness’ output of the Geometry node, but with a bit more control. I was thinking that if a mesh could be represented as a vector field through its normal vectors (as in, the ones shown in the image below), I could approximate its divergence at every point and assign a certain shader to points with a divergence higher than a certain threshold. Of course, exact partial derivatives couldn’t be used, since this vector field would clearly be discrete and not continuous, but I would assume I could get a rough approximation using central difference formulas, or something.

Ultimately, I’m wondering this: is it possible to work with the normal vectors of the mesh as a whole vector field when assigning shaders, rather than vector by vector (which appears to be how it works)? Or am I still just misunderstanding the entire shading process?

Thanks again for your responses (and patience)!

[image: normal vectors displayed on a mesh]

Edit: My bad, yes, the vector field would indeed have 3 components (I was switching to a surface in my mind, which would only have two independent variables, x and y; its normal vector field, however, would depend on x, y, and z, as Secrop pointed out). Though I’m not sure how drastically that would change the process, other than adding one more term to the divergence.
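
To make the central-difference idea concrete, here’s a rough Python sketch of the kind of approximation I mean, taken over each vertex’s edge neighbourhood (the per-edge averaging scheme is just my own assumption, not an established formula):

```python
# rough per-vertex divergence estimate: average the change of the normal
# along each incident edge, projected onto that edge's direction
import bpy
import bmesh

obj = bpy.context.object          # assumes the mesh object is active
bm = bmesh.new()
bm.from_mesh(obj.data)
bm.normal_update()
bm.verts.ensure_lookup_table()

divergence = [0.0] * len(bm.verts)
for v in bm.verts:
    total = 0.0
    for e in v.link_edges:
        other = e.other_vert(v)
        d = other.co - v.co
        length = d.length
        if length > 0.0:
            # directional derivative of N along the edge, dotted with the
            # edge direction; summed over edges this behaves like a
            # discrete divergence (it also relates to mean curvature)
            total += (other.normal - v.normal).dot(d / length) / length
    if v.link_edges:
        divergence[v.index] = total / len(v.link_edges)

bm.free()
print("divergence range:", min(divergence), max(divergence))
```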

Geometry > Pointiness in Cycles nodes could be used to pick out edges above your_threshold_angle_value. On its own it would be unreliable for discerning and masking rock from ground, for example. I bring it up to save you reinventing a faulty wheel for your use case.

Edit: I see you’ve already mentioned pointiness. My bad. Consider the case where a rock boundary protrudes at an angle > your_threshold but is rounded (i.e. < your_threshold) on top, yet is still a rocky surface. Would your algorithm / implementation be able to deal with that?

A naïve answer would be ‘no’, but there are ways to get something done… baking data into textures, for example, or using vertex-color layers (be aware that Cycles stores these values with color transformations).
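
For instance, a minimal sketch of the vertex-color route (the stand-in values here are placeholders; you’d swap in whatever per-vertex scalar you actually computed):

```python
# write a per-vertex scalar into a vertex color layer, readable in the
# shader through an Attribute node
import bpy

mesh = bpy.context.object.data    # assumes the mesh object is active

# stand-in per-vertex values in 0..1; replace with your own data
values = [v.normal.z * 0.5 + 0.5 for v in mesh.vertices]

vcol = mesh.vertex_colors.new(name="Divergence")
for loop in mesh.loops:
    t = values[loop.vertex_index]
    # remember that Cycles may apply color transforms to these values
    vcol.data[loop.index].color = (t, t, t, 1.0)  # drop alpha on 2.7x builds
```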

PS: I’ll come back to this thread later, as this is a topic I’ve been interested in for quite a while.

How about these kinds of maps?

Stress mapping cycles

Tension map addon

But as mentioned before, forget about the directional derivative in Blender!
Too complicated.
It might be possible using some advanced Python module, or SciPy!

have fun

happy bl

LoboTommy, I was thinking that there could also be a radius value for testing the divergence, such that the size of the protrusion could be taken into account (i.e. for a small radius, large rocky outcrops with angles on top < threshold may still be unaffected).
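
On the scripting side, maybe something like mathutils’ KDTree could provide that radius control, e.g. averaging a per-vertex value over everything within the radius (a rough sketch; the stand-in value and radius are placeholders):

```python
# average a per-vertex scalar over all vertices within a given radius,
# so the size of a protrusion can be taken into account
import bpy
from mathutils.kdtree import KDTree

mesh = bpy.context.object.data    # assumes the mesh object is active

kd = KDTree(len(mesh.vertices))
for v in mesh.vertices:
    kd.insert(v.co, v.index)
kd.balance()

values = [v.normal.z for v in mesh.vertices]  # stand-in per-vertex scalar
radius = 0.25                                 # placeholder, in object units

smoothed = []
for v in mesh.vertices:
    hits = kd.find_range(v.co, radius)        # includes the vertex itself
    smoothed.append(sum(values[i] for _, i, _ in hits) / len(hits))
```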

But it sounds like this may just remain hypothetical, based on all your responses. I’ll keep looking into it I suppose, but my limited knowledge of OSL and normals puts a bit of a constraint on my progress, haha.