Blender-based "Substance Designer"?

Thanks, guys, for lots of interesting links. About Slope Blur in Substance Designer: again, it’s not actually a blur. It’s a warp, basically a UV shift / 2D displacement in the direction of the grayscale “slope” of the second input, from its bright pixels down to its dark ones, done in a series (a loop) of small shifts, each one fading after the other.

I only wonder how to get this down-slope vector in Blender shader nodes. The rest is easy. My guess is it should be the same as turning a grayscale height map into a normal map.

It’s super helpful for making all sorts of erosion, which is probably at the core of everything in Substance Designer. It works like sediment sliding down from a mountain top.
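
If it helps, here is a minimal numpy sketch of that loop of shifts, under my reading of the description above (the names `slope_blur`, `samples` and `intensity` are mine, mirroring the SD parameters; this is an illustration, not SD’s actual implementation):

```python
import numpy as np

def slope_blur(img, height, samples=8, intensity=4.0):
    """Smear a 2D float image down the slopes of a grayscale heightmap.

    samples   -- number of small shifts (SD's "samples")
    intensity -- total shift distance in pixels (SD's "intensity")
    """
    h, w = img.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    # The down-slope vector is the negative gradient of the heightmap --
    # the same finite differences a height-to-normal conversion uses.
    gy, gx = np.gradient(height)
    step = intensity / samples
    acc = img.astype(float)
    for _ in range(samples):
        # Inverse warp: sample up-slope so the content slides downhill.
        xs = np.clip(xs + gx * step, 0, w - 1)
        ys = np.clip(ys + gy * step, 0, h - 1)
        acc += img[ys.astype(int), xs.astype(int)]  # nearest-neighbour lookup
    return acc / (samples + 1)  # average so each shift fades into the result
```

Feeding it a cone-shaped heightmap and a noise image should give the “sediment sliding downhill” look, just far slower than a GPU implementation.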


Seems like a useful thing. So it moves the lighter parts of the image in the direction of the gradient in steps and then blends them, right? Doing this by manipulating texture coordinates with noises in shader nodes could be very inefficient, and repeating the image that way is not so practical in Blender though… I think it could be possible to make stepped noises… That might not be necessary, since the steps are not strictly needed, but it might help with thinking about the problem.

It seems it’s extremely inefficient…

EEVEE doesn’t like the idea at all.

It’s better to say that it’s like a procedural smear tool in Photoshop, and it uses another black-and-white texture to determine how much it smears the edges.

For example, this is a slope blur applied to a 2×2 grid of squares with an anisotropic noise filter driving the, er, sloping.

…and I have no idea how this effect could be replicated in Blender. I imagine it would be possible, but it’d take a nice collection of math nodes to achieve.

edit: I just found this:


This might be one of the reasons people prefer Substance for this. This does not seem practical in Blender even if it turns out to be possible.


Now that I have a good idea of what it’s doing, it may not be all that difficult. I could probably make a custom node that does something similar.

Since it’s basically a smeared warp, you could start with this, then build it up.

“Slope blur” is a misleading name actually.

Here is what it actually does: it shifts / 2D-displaces pixels down the input slope (the cone/sphere here). What they call “samples” is actually the number of such shifts, and “intensity” is the distance of those shifts.

My guess is that the 2D displacement of pixels is not an issue, even done a number of times.

At its core it’s just a warp node; not a directional one where the whole picture shifts along a single vector, but one where the vectors are calculated locally for each pixel by some sort of kernel math or something (not perfectly sure).

It would work just fine if we input a normal map, my guess is one made from this cone, but then we need a height-to-normal node. Shouldn’t such a node be perfectly possible? I just don’t know how.

PS: Looks like Blender has such a “Value to Normal” node in its Texture node editor, but I could never understand what that editor is for or how it works. There are basically zero tutorials on the web about it.
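
For what it’s worth, a height-to-normal node really is straightforward math: finite differences of the heightmap, where the XY part (before normalization) is exactly the down-slope vector the slope blur needs. A minimal numpy sketch, assuming a grayscale 2D float array as input (my own illustration, not Blender code):

```python
import numpy as np

def height_to_normal(height, strength=1.0):
    """Convert a grayscale heightmap to a tangent-space normal map."""
    gy, gx = np.gradient(height)              # slope along y and x
    nx = -gx * strength                       # XY = down-slope direction
    ny = -gy * strength
    nz = np.ones_like(height)
    length = np.sqrt(nx**2 + ny**2 + nz**2)
    normals = np.stack([nx, ny, nz], axis=-1) / length[..., None]
    return normals * 0.5 + 0.5                # remap [-1, 1] to [0, 1] RGB
```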

…proceeds to show two videos of someone doing it the way Martin said :sweat_smile:

The fact is, to replicate Substance Designer’s functions in Blender you need to look at the compositor, as SD is a pixel-based program. Its power comes from this fact, and Material Maker is a decently powerful open-source alternative.
I do hope Blender one day gives us a streamlined way to use shaders and compositing together for this purpose.

This isn’t what the shader editor does. This is like asking a mesh’s edit mode to play audio.
Compositing is what you’re after.


No, it’s like asking for compression to be put on the snare microphone, instead of the whole audio track.

We don’t even need a metaphor for this. In compositing, I cannot blur a bitmap image or a procedural texture that is being used for an alpha channel in a material.

There might be a point there. :smiley: Procedural textures in Blender are not bitmap images, however. It would be very easy to make an add-on using only Python that could do that with bitmaps, but I don’t think it would be much good if it worked for bitmap images only… I think the devs would have to bring texture nodes to life somehow for this to work. There are technical reasons why these very loud and plentiful requests have been ignored for years: because of the way Blender and shader-node procedural textures work, this is not straightforward to implement. It’s a lot of work, or it would have been done already. I mean… I suspect this is so. I could be wrong.


@Michael_Knubben is right.

@thorn I think you are missing the point that UV space is not XYZ space. An image texture can be written and read in UV space, but in XYZ space it can only be read (during rendering); if you want to save information to it from XYZ space, you are basically doing baking. The 2D programs in which blur is so common operate in “UV space” only.

During a read operation in XYZ space, the pixel you see is the product of sampling a specific point, but neither this point nor the sampling algorithm has any knowledge of the surrounding points or data. And you need that knowledge to perform a blur, since a blur uses a kernel that samples neighboring pixels; the bigger the blur radius, the more pixels you need to sample. That is why it can be expensive both in 2D and in the poor man’s substitute made with a noise texture.
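
To put a number on that: a naive box blur touches (2r+1)² pixels for every output pixel, so the sample count grows quadratically with the radius. A rough numpy sketch of that kernel sampling (my own illustration):

```python
import numpy as np

def box_blur(img, radius):
    """Naive box blur: average a (2r+1) x (2r+1) neighbourhood per pixel."""
    h, w = img.shape
    padded = np.pad(img, radius, mode="edge")
    out = np.zeros(img.shape, dtype=float)
    for dy in range(-radius, radius + 1):      # (2r+1)^2 neighbour reads
        for dx in range(-radius, radius + 1):
            out += padded[radius + dy : radius + dy + h,
                          radius + dx : radius + dx + w]
    return out / (2 * radius + 1) ** 2
```

Doubling the radius roughly quadruples the work, which is exactly the cost problem described above.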

I’d also like to see some form of data transfer from XYZ → UV, preferably in real time.
I think the closest thing that might work is some form of real-time baking made with Geometry Nodes, where you could use an unwrapped mesh and write attribute data not to mesh points but to texels directly. That would be very cool.

Back to the blur node: I think the easiest thing that could materialize is a blur node that works only on image textures and is performed in UV space only. But then you will hit problems with blurring across UV island boundaries, or perhaps other distortions related to mapping the texture. And procedural textures would not work with this blur, as they don’t exist in UV space.


Really?

You might want to consider Armor Paint. It’s based on Blender and is compatible with Blender, along with Unreal, Unity and The Armory Engine.

FYI…Material Maker has a Slope Blur included.
https://rodzill4.github.io/material-maker/doc/node_filter_blur_slope.html


Yes. Compared to it working for procedural stuff as well, the same way it works in Substance. The thing you show is also trivial to do with regular procedural textures (actually, just texture coordinates), and a regular blur works for it too, without the need for the discussed slope blur, so I’m not sure what the point is; you can still use that coordinate-displacing trick for it. It sucks, but it sort of works for this.

Also: isn’t the “problematic” part of this that for any blurring you need some info about the “procedural pattern” at the neighbouring pixels… and this is not always accessible…

I think the term is (color info) closure?? In Material Maker the slope is computed from the difference in the “heightfield”, and this is used to transform the generated image… while the actual texture coordinate is given by the shader environment…
…but in Blender you have to somehow pass the coordinates to the “next” node… also, there is only info about the current pixel, and you can’t use anything like colorOfGeneratedImageAt(X, Y)…

So I think the different shader systems just have their specific domain languages, and so one shader (system) is not equal to another…

Especially the code part from MM:

vec2 slope = vec2($heightmap(uv+vec2(dx, 0.0))-v, $heightmap(uv+vec2(0.0, dx))-v);

There is no “connection” to any coordinates… and in Blender you need them… so how do you access an arbitrary image position in the normalized range from 0.0 to 1.0, in X,Y or U,V or whatever you want to call them…
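
That is the crux: in Material Maker, the $heightmap above is a function the generated shader can call at any UV it likes, so sampling a neighbour is trivial. A hedged Python analogy of that idea (the function names and the example pattern are mine):

```python
import math

# A procedural pattern as a callable: it can be evaluated at *any* (u, v).
def heightmap(u, v):
    return 0.5 + 0.5 * math.sin(20.0 * u) * math.sin(20.0 * v)

def slope(u, v, eps=0.001):
    """Finite-difference slope. This only works because we can re-evaluate
    the pattern at offset coordinates; a Blender shader node only ever sees
    the value at the current shading point."""
    value = heightmap(u, v)
    return (heightmap(u + eps, v) - value,
            heightmap(u, v + eps) - value)
```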

This is also my main “problem” with shaders… basically I understand them, but I have difficulty distinguishing the system-inherent differences… (mostly because I haven’t studied them enough :sweat_smile:)

Edit:
I just remembered… in OpenGL something like fragCoord.xy is mostly used… see for example this ShaderToy: blur example

And in OSL there is (at least for image textures):

type texture (string filename, float s, float t, ...params...)

…with s, t being the coordinates…

IDK how you can do this in the Blender shader editor… (accessing neighbour pixels…)

There is a blur node in Geometry Nodes, but then you need to build your material’s base setup in Geometry Nodes and send it to your shader via attributes.

As already mentioned…

…but… it seems it’s not so simple to get the slope of a texture by “reading” the difference from some neighbouring pixel values…??

You can bake out a slope map using the Mix node’s Factor.

" The Mix Node mixes values, colors, and vector inputs using a factor to control the amount of interpolation. In this case the Normals are the Factor.

It is a straightforward node setup that can also be used for a curvature map if tweaked. It can then be used to drive the Blur in the material.

Blur Nodes…

And just the base noise…


I see " value to normal" node in texture node editor. I never used texture node editor in Blender and not sure how it supposed to work . But looks like it’s exactly what we would need to get slope direction.

I wonder why there is no such thing in shader nodes.