Voronoi Cells gradient or pattern

Can’t figure this one out.

I want to take the Voronoi cells procedural and have either a randomly rotated gradient or a random pattern within each cell.

It’s easy to overlay a pattern over the Voronoi cells - however, that pattern is continuous across cells. I would like my pattern/gradient to be discontinuous.

Can this be done?

Sometimes you can feed one texture into the vector input of another texture to distort it. Maybe it’s possible with Voronoi?
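As a rough illustration of the idea outside Blender, here’s a toy Python sketch of a per-cell random rotation: a value that is flat within each cell drives the rotation angle of a gradient’s coordinates, so the gradient breaks at cell borders. All the function names are hypothetical, and the grid-based “Voronoi” is just a stand-in for the real jittered-feature-point version:

```python
import math

def voronoi_cell_value(x, y, seed=0.0):
    """Toy Voronoi 'cell value': a pseudo-random value that is constant
    within each grid cell. (Cycles' Voronoi uses jittered feature points;
    a plain grid keeps the sketch short but has the same flat-per-cell
    property.)"""
    cx, cy = math.floor(x), math.floor(y)
    h = math.sin(cx * 127.1 + cy * 311.7 + seed) * 43758.5453
    return h - math.floor(h)  # fractional part, in [0, 1)

def gradient_tex(x, y):
    """A simple repeating linear gradient along X, like a Gradient texture."""
    return x - math.floor(x)

def per_cell_gradient(x, y):
    """Rotate the gradient's local coordinates by a per-cell random angle,
    making the pattern discontinuous at cell borders."""
    angle = voronoi_cell_value(x, y) * 2.0 * math.pi
    # local (in-cell) coordinates, rotated around the cell origin
    lx, ly = x - math.floor(x), y - math.floor(y)
    rx = lx * math.cos(angle) - ly * math.sin(angle)
    return gradient_tex(rx, 0.0)
```

Any per-cell-constant output (a cell “colour” channel, for instance) can stand in for `voronoi_cell_value` - the only requirement is that it doesn’t vary inside a cell.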

Perhaps something like this?


Using a similar approach to my world2tangent setup:


This is supposed to work with 2D textures, but by using the cell value as the Z coordinate it can easily be used with procedurals. The object should be unwrapped, for the tangent vector (using the generated tangent will rotate the resulting vector around the object’s Z axis).


Thanks - i’ll take a look.

Oooh nice solution!

Worked great - thanks.

Only problem is (and I don’t know if it’s because of the complexity of the node setup) that it doesn’t seem to work with microdisplacement.

I had intended to feed the result of this into a gradient texture and drive microdisplacement in Blender 2.8RC2. Although the node setup does create the correct result when plugged into the colour slot of, say, a diffuse shader, when I feed the result into the displacement slot it does nothing.

It’s strange - I would have expected the displacement to simply take the greyscale texture and apply the displacement - but it doesn’t work, even when run through a math node to multiply it.

This is something I had noticed before. It’s quite odd: it seems to be outputting a numerical range, but I also couldn’t get that output to act as a displacement, whatever I did to it with ramps, math nodes etc. Yet you can do the same with a noise texture and it displaces fine.

You might have to scale the output before using it as displacement. I’ve tried manual linstep/smoothstep (math nodes) with success, but not a color ramp yet.
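For reference, the linstep/smoothstep built from math nodes corresponds to these standard formulas (a Python sketch; the names are just the usual GLSL-style ones, not Blender nodes):

```python
def linstep(edge0, edge1, x):
    """Linearly remap x from [edge0, edge1] to [0, 1], clamped at both ends."""
    t = (x - edge0) / (edge1 - edge0)
    return max(0.0, min(1.0, t))

def smoothstep(edge0, edge1, x):
    """Hermite interpolation: same clamped remap, but with zero slope
    at both edges (3t^2 - 2t^3)."""
    t = linstep(edge0, edge1, x)
    return t * t * (3.0 - 2.0 * t)
```

In nodes, that’s a Subtract, a Divide and a clamp for linstep, plus two Multiplies and a Subtract for the Hermite part.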

I’ve come across similar situations before (at least when using the bump node to find derivatives for later use). It turns out there are some limitations in how the SVM deals with this kind of logic, and the only way around it is to bake, or to do it through OSL (in the particular case of this thread, OSL is the best option).

What does SVM mean? Any more info on this limitation?

Anyhow, I tested this and I have to concur with 3pointEdit - nice solution. Although I didn’t quite figure out what the original problem was :) It’s nice and brilliant, but I can’t wrap my head around what is actually going on. If the idea was to scramble the UVs based on the coloured output from the Voronoi, I’ve set it up a completely different way. I’m getting 6 “different” outputs (enough - the last 3 are just an HSV modifier applied to the same colour output) plugged into a custom mapping node, since the vanilla mapping node doesn’t allow anything to be plugged in. See Bartek Skorupa’s tutorial on how to set up your own (a nice utility, and I use it or parts of it constantly).

You may not need all of it, or need to scale the used values differently depending on your needs, but the flow of it should be easy enough to follow:



You can also take the fac output of the voronoi and separate effects using a series of greater than and less than nodes and just add up the effects in the end.
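A minimal sketch of that greater-than/less-than banding, with hypothetical stand-in values for what each branch would compute:

```python
def band_mask(fac, lo, hi):
    """Equivalent of a Greater Than node (fac > lo) multiplied by a
    Less Than node (fac < hi): 1.0 inside the band, 0.0 outside."""
    return 1.0 if (fac > lo and fac < hi) else 0.0

def combined(fac):
    """Gate each effect by its own band of the Voronoi fac, then add
    them up at the end, as described above."""
    effect_a = 0.3  # stand-in for whatever the first branch computes
    effect_b = 0.8  # stand-in for the second branch
    return (band_mask(fac, 0.0, 0.5) * effect_a
            + band_mask(fac, 0.5, 1.0) * effect_b)
```

Because each mask is 0 or 1 and the bands don’t overlap, the final Add only ever passes one effect through per cell.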

SVM stands for ‘Shader Virtual Machine’, and it’s the part of Cycles that deals with shaders.

The limitation exists because when we use the bump node, Cycles (or the SVM) probes the texture at least two more times to get the derivatives at the coordinate being sampled. What I suspect it does is change P a bit and sample the texture again… (at least I know that if you have an OSL texture and want to use it with a bump node, you need to use P from the Geometry node and not the P accessible from OSL::Globals). This limits things a bit when we want to hack the bump node to produce things other than normal vectors (like producing coordinates from the normal, as in the node setup I posted).
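The multiple-probe behaviour described above can be sketched as plain finite differences - this is an assumption about the general idea, not Cycles’ exact internals, and `bump_derivatives` is a made-up name:

```python
def bump_derivatives(tex, x, y, eps=1e-4):
    """Sketch of what a bump node does internally: sample the height
    texture once at P, then again at slightly offset coordinates, and
    estimate dH/dx and dH/dy by forward differences. (The exact offsets
    and derivative scheme inside the SVM may differ.)"""
    h = tex(x, y)
    dhdx = (tex(x + eps, y) - h) / eps
    dhdy = (tex(x, y + eps) - h) / eps
    return dhdx, dhdy
```

The point is that `tex` gets evaluated three times with different inputs, which is why a node tree whose “texture” isn’t a pure function of the incoming coordinate can behave unexpectedly under bump/displacement.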

What’s going on is just a projection of the bumped normal vector into the tangent plane of the surface, using the coordinates of that projection as a UV map. Imagine the bumped normal as a unit vector placed on the surface, with its own inclination: Dot(Normal, Tangent) gives the cosine of the angle between the normal and the tangent vector. We do the same with the cotangent, but because we don’t have a cotangent, we need to build one with Cross(Original Normal, Tangent), which returns a new vector perpendicular to both inputs. We can use these cosines as U and V coordinates for other textures. I added 0.5 to them just to keep the origin of the coordinates in a corner; otherwise it would sit in the centre of each Voronoi cell.
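The projection described above can be written out directly (a Python sketch with hypothetical names; Dot and Cross are as in the node setup):

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def normal_to_uv(bumped_normal, surface_normal, tangent):
    """Project the (unit) bumped normal into the tangent plane:
    U = Dot(bumped normal, tangent), V = Dot(bumped normal, cotangent),
    where the cotangent is built as Cross(original normal, tangent).
    0.5 is added to move the coordinate origin off the cell centre."""
    cotangent = cross(surface_normal, tangent)
    u = dot(bumped_normal, tangent) + 0.5
    v = dot(bumped_normal, cotangent) + 0.5
    return u, v
```

With no bump at all the normal is perpendicular to both tangent and cotangent, so both dot products are zero and the UV sits at (0.5, 0.5); any tilt of the bumped normal moves the UV away from that point.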


Ok, thanks for the explanation. Man, I wish I had stayed awake during math class back in the day :)