Control Voronoi distribution (cell size) with second texture

Thank you for your insight. I have an ignorant question for you regarding your second point.

Does the currently implemented Worley algorithm know which point is mapped to a surface? I know that it performs a check within a 3x3 kernel, but is the calculation performed for the whole domain, or only for the parts that are projected onto the mesh?

Currently, the calculation is based on hashing the coordinates of each grid cell.
Here’s a short step by step:

1. Say we want to sample the point (4.15, 5.65, 7.89) (with the coordinates already scaled by the ‘scale’ factor). That point belongs to grid cell (4, 5, 7), i.e. (floor(x), floor(y), floor(z)).
2. Hashing those cell coordinates produces a pseudo-random xyz offset whose components are in the range [0, 1[. Adding this offset to the grid cell’s origin gives us the feature point of that cell.
3. We calculate the distance from our sampling point to that cell point and store it.
4. We repeat the process for all neighbouring cells and pick the lowest of the distances.

Since hashing produces the same output for a given input, every time a sample needs the cell point for (4, 5, 7) it gets the same random coordinate. So there’s no need to store anything, and we avoid memory lookups (which are quite expensive on GPUs).
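
For reference, here is a minimal CPU-side sketch of that scheme in Python (the real implementation lives in shader code; the hash constants, the bit mixing, and names like hash3/worley_f1 are placeholders, any decent integer hash would do):

```python
import math

def hash3(ix, iy, iz):
    # Placeholder integer hash: mixes the three cell coordinates into
    # one deterministic 32-bit value. Any good 3D hash works here.
    n = ix * 73856093 ^ iy * 19349663 ^ iz * 83492791
    return (n ^ (n >> 13)) * 1274126177 & 0xFFFFFFFF

def cell_point(ix, iy, iz):
    # Deterministic pseudo-random offset in [0, 1) per axis, added to the
    # cell origin; the same cell always yields the same feature point.
    return (
        ix + (hash3(ix, iy, iz) & 0xFFFF) / 65536.0,
        iy + (hash3(iy, iz, ix) & 0xFFFF) / 65536.0,
        iz + (hash3(iz, ix, iy) & 0xFFFF) / 65536.0,
    )

def worley_f1(x, y, z):
    # Distance to the nearest feature point, checking the sample's cell
    # plus its 26 neighbours (the 3x3x3 kernel mentioned above).
    cx, cy, cz = math.floor(x), math.floor(y), math.floor(z)
    best = float("inf")
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            for dz in (-1, 0, 1):
                p = cell_point(cx + dx, cy + dy, cz + dz)
                best = min(best, math.dist((x, y, z), p))
    return best
```

Notice that nothing is stored: worley_f1 recomputes every feature point on the fly, which is exactly the design point above.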

Boy, that’s a really efficient design. Now I see why it’s so hard to beat in terms of performance.

And just for the sake of brainstorming, since I’m a theoretician right now:

Then how about not touching the point data at all? Leave Worley’s random distribution of the final points intact, and the same goes for the proximity check at the end.
But dynamically change the kernel size and placement instead. The kernel data would need to be stored and calculated, but that could take much less time than calculating all the points, and the caches for it could be relatively small.

The simplest idea is to divide the kernel space not only into a grid, but into recursive levels:


The hardest part, I assume, would be implementing cross-checking between kernels of various sizes.

The kernel-space data could be calculated on the basis of vertex weight or vertex color. Since verts have xyz coordinates, they could be translated to the kernel grid easily, and the weight/color value could determine the kernel level (size).
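
A rough Python sketch of the idea, just to pin it down (the level count, the halving rule, and all the names are hypothetical; the cross-level neighbour check is exactly the part it leaves out):

```python
import math

def kernel_level(weight, max_level=3):
    # Hypothetical: a painted vertex weight in [0, 1] picks the kernel level.
    return min(int(weight * (max_level + 1)), max_level)

def kernel_cell(x, y, z, weight, base_size=1.0):
    # Each level halves the cell size, so higher weights give smaller cells.
    size = base_size / (2 ** kernel_level(weight))
    cell = (math.floor(x / size), math.floor(y / size), math.floor(z / size))
    return cell, size
```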

You should try it with OSL to see if it works… But I suspect that dealing with samples at Level 0 whose neighbours are at higher levels might be a bit tricky (not to mention that it breaks parallelism a bit).

The node tree I posted above is a bit similar to this, with 4 levels of detail…
Basically the algorithm stays the same, but it is executed 4 times with different scales.

Just out of curiosity - what map ranges did you use on the node setup?

I’m playing with it right now, and the most interesting effect is when you multiply the voronoi distance by negative values.

Below is roughly the effect I’m after. Think natural moss growth patterns. But right now it’s unusable: the pattern only works when directly connected as the base color of the Principled BSDF. Plugged into any other node it gives pure black, due to the negative values.
Both texture scaling and detail work like a charm. Using it as displacement gives trash though.

Each one converts a fraction of the input to the [0,1] interval.
Something like -> from[min:0.6, max:0.7] to [min:0.0, max:1.0].

In that case I used the following intervals ([0.6-0.7], [0.5-0.6], [0.4-0.5] and [0.3-0.4])…
Though these values might vary depending on the ‘Control Texture’ ranges (the Noise texture has its values around 0.5, so the fractions are near that).
Also, my ‘Add nodes’ are clamping the values (we only need the distances that are in [0, 1]). So I wouldn’t expect any results when multiplying with negative values.
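
Put together, a hedged Python sketch of what this node setup computes (map_range mirrors the Map Range node with clamping; the per-level scales are my assumption, and worley stands in for any F1 distance function, such as the sketch earlier in the thread):

```python
def map_range(value, from_min, from_max, to_min=0.0, to_max=1.0, clamp=True):
    # Remap one interval onto another, like Blender's Map Range node.
    t = (value - from_min) / (from_max - from_min)
    if clamp:
        # With clamping on, negative results are pinned to 0, which is why
        # multiplying by negative values should theoretically yield nothing.
        t = min(max(t, 0.0), 1.0)
    return to_min + t * (to_max - to_min)

def multi_scale_voronoi(p, control, worley):
    # The four fractions of the control noise listed above, each gating one
    # Voronoi evaluation at a different (assumed) scale.
    intervals = [(0.6, 0.7), (0.5, 0.6), (0.4, 0.5), (0.3, 0.4)]
    scales = [1.0, 2.0, 4.0, 8.0]
    result = 0.0
    for (lo, hi), scale in zip(intervals, scales):
        mask = map_range(control, lo, hi)
        d = worley(p[0] * scale, p[1] * scale, p[2] * scale)
        result = min(result + mask * d, 1.0)  # the clamped 'Add' nodes
    return result
```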

In your case, I can’t figure out why you get that result (no Blender atm)… I might need to check the source code of the MapRange node to see what it does when clamping a negative output (it should be clamped to 0, theoretically). :confused:

Well, it might not be true Voronoi, but this approach is dangerously close to a naturalistic effect. I’m changing the solution for this topic, since your node setup is the fastest and most flexible approach. My apologies @Tarby

That’s what I was after:


No problem at all and glad you found a solution. Thanks for the node tree, too.


I really like that clay look!! Very nice. :wink:


Yeah that’s kinda cool actually.


First post on Blender Artists, so that’s fun.


This style gives cool moss.

Close-up of the nodes:

The only other real-world thing I can think of that random Voronoi would apply to is alien spots / pimples.


Transitioning between different Voronoi sizes is one step. Next is to do it between completely different noises/patterns!

If it works, that means building biome and natural-environment shaders could be automated very easily.
From my understanding, the key is to properly interpolate between masks of different sizes and shapes.
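
In sketch form, that interpolation could be as simple as a mask-driven mix with a softened transition band (the names and the smoothstep choice are mine, not a recipe from this thread):

```python
def smoothstep(t):
    # Soften the edge of the mask so the two patterns cross-fade.
    t = min(max(t, 0.0), 1.0)
    return t * t * (3.0 - 2.0 * t)

def blend_patterns(pattern_a, pattern_b, mask, p):
    # pattern_a / pattern_b / mask are any scalar fields over a 3D point p,
    # e.g. two different noises and a biome mask texture.
    t = smoothstep(mask(p))
    return pattern_a(p) * (1.0 - t) + pattern_b(p) * t
```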


Looks awesome. Love nature stuff.

I was playing with a maths equation called the Quincunx flip and made a random Voronoi pattern with it:
quincuiz flip random voronoi pattern.PNG
And the nodes. Pretty trippy but cool.


The creepiest alien Suzannes I could come up with using this thread’s Voronoi setup.

You know, the type of aliens that burst out of your chest in the movies :alien:

Blend file of 4 quick attempts:
alien suzannes.blend (1.7 MB)

Hello!
I know that this is an old thread, but after facing a similar problem and searching far and wide for a solution, I thought I’d post what I came up with, in case somebody else ends up here :slight_smile:

I completely abandoned the Voronoi texture and used the proximity of a point distribution in Blender 3.0’s Geometry Nodes to create (I’d say) the exact effect in the reference image. The downside is that your mesh has to be at a pretty high resolution for it to look smooth. I addressed this problem with a Multires modifier.
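
For anyone reading along outside Blender, the core of the idea reduces to a nearest-point distance. A brute-force Python stand-in (the scattered points play the role of the point distribution, which in Blender would come from nodes like Distribute Points on Faces feeding Geometry Proximity; the counts and names here are illustrative):

```python
import math
import random

def nearest_distance(sample, points):
    # How far a surface sample is from its nearest scattered point.
    return min(math.dist(sample, p) for p in points)

random.seed(42)
points = [(random.random(), random.random(), random.random()) for _ in range(200)]
print(nearest_distance((0.5, 0.5, 0.5), points))
```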

Since I don’t know exactly what you’re trying to achieve, this might not be the way for you, but for baking the texture it should work. Anyway, I hope this somehow helps! :slight_smile:

Thanks.
Yeah, I also figured it out some time ago:

This approach can give good results, but as you said, it has a downside in the form of dense geometry.
I think Geometry Nodes will eventually have something better. I’m patient, I can wait.

You would need to find a way to fill the volume with points instead of the surface. Or perhaps the bounding box.

Oh yeah. There are definitely new ways of expanding this feature. Point clouds and attribute fields, for example.

How about generating the points on the surface like you did, but afterwards moving them along the normal by a random value? Perhaps this would make it smoother even on less dense meshes.
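
Something like this, in rough Python (point and normal are plain 3-tuples here, and the offset range is a guess):

```python
import random

def offset_along_normal(point, normal, max_offset=0.1):
    # Nudge a scattered point along its surface normal by a random amount,
    # so the proximity field gains some depth off the surface.
    t = random.uniform(0.0, max_offset)
    return tuple(p + n * t for p, n in zip(point, normal))

print(offset_along_normal((1.0, 0.0, 0.0), (0.0, 0.0, 1.0)))
```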