Modular texturing question: Masking out parts of a stone wall (or similar texture)

(Warning: long post and potentially moronic idea. Proceed at your own risk.)

So I was thinking about the following situation. Imagine we have a material consisting of two layers: some kind of “base” and a layer with discrete details, like stones, lizard scales or whatever.


Layer B uses a mask to reveal layer A

But here is the catch: individual stones must be hidden or revealed based on where their center point positions fall on the mask, otherwise it will look ridiculous.

The setup must work with different masks and most likely with different tiling values of A and B, so painting individual bricks on the mask is out of the question.

I got the idea of creating a “centerpoint map” for the bricks.


It has their centerpoint UV positions marked as a 2D vector (RG). So using this map as the UV input for the mask texture should solve the problem, as long as I can handle issues with tiling. But obviously, since the texture lookup has to be performed at specific UV points, it will only work with an image texture as a mask. So the question is: can this method be adapted for a vertex color mask instead? It would make it more flexible.
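To make the idea concrete, here is a minimal Python/NumPy sketch of the lookup chain (the node version just chains the same two texture lookups); `centerpoint_map`, `mask` and the sampling helper are placeholder names I made up, assuming both maps are HxWx3 float arrays, not anything Blender provides:

```python
import numpy as np

def sample_nearest(image, uv):
    # Nearest-neighbour lookup of an HxWxC image at (u, v), with repeat wrapping.
    h, w = image.shape[:2]
    u, v = np.mod(uv, 1.0)
    x = min(int(u * w), w - 1)
    y = min(int(v * h), h - 1)
    return image[y, x]

def stone_mask_value(shading_uv, centerpoint_map, mask):
    # Instead of sampling the mask at the shading point's own UV, first read
    # the centerpoint map (RG = centerpoint UV of the stone this texel belongs
    # to) and sample the mask there, so every texel of a stone gets the same
    # mask value and the stone is hidden or revealed as a whole.
    center_uv = sample_nearest(centerpoint_map, shading_uv)[:2]
    return sample_nearest(mask, center_uv)[0]
```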

You can paint image 2 and bake it to an image texture. You don’t need any UV coordinates (although you need to be aware that Blender treats vertex color RGB as sRGB data, so you need to work around that by using an sRGB->linear conversion in nodes, or by using only alpha channels as your coordinates).
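For reference, the standard sRGB->linear transfer function (what you’d rebuild with math nodes if you go the conversion route) looks roughly like this; it’s a sketch of the textbook formula, not Blender’s exact internal code:

```python
def srgb_to_linear(c):
    # Standard per-channel sRGB -> linear conversion.
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4
```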

Hmm, “bake” as in using a baker beforehand to convert vertex colors into a texture? Or do you mean Blender supports something like render targets (https://docs.unity3d.com/ScriptReference/RenderTexture.html) and can actually bake intermediate maps at runtime?

Blender can’t bake at runtime, no. You can write vertex color to texture pretty easily by creating a vcol->emission material and baking emission.
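If it helps, here is a rough bpy sketch of that vcol->emission bake, assuming Blender 2.8x+ node identifiers, Cycles, and a vertex color layer named "Col"; treat the names and sizes as assumptions to adapt, not a ready-made script:

```python
import bpy

obj = bpy.context.active_object
mat = obj.active_material
nodes, links = mat.node_tree.nodes, mat.node_tree.links

# Vertex color -> emission, so baking emission just copies the vcol data.
vcol = nodes.new('ShaderNodeVertexColor')   # assumes the 2.81+ node name
vcol.layer_name = 'Col'                     # your vertex color layer
emit = nodes.new('ShaderNodeEmission')
out = nodes.new('ShaderNodeOutputMaterial')
links.new(vcol.outputs['Color'], emit.inputs['Color'])
links.new(emit.outputs['Emission'], out.inputs['Surface'])

# Bake target: the active Image Texture node in the material.
img = bpy.data.images.new('vcol_bake', 1024, 1024)
tex = nodes.new('ShaderNodeTexImage')
tex.image = img
nodes.active = tex

bpy.context.scene.render.engine = 'CYCLES'
bpy.ops.object.bake(type='EMIT')

img.filepath_raw = '//vcol_bake.png'        # relative to the saved .blend
img.file_format = 'PNG'
img.save()
```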

Ah, OK. Kinda defeats the purpose, but might be useful in some situations.

I’ve set up an actual example in Blender, and it kinda works.

Though a weakness of this method also became evident: you cannot “wrap” the colors that mark centerpoint UVs around a tiled texture’s edges. So far I don’t know how to overcome this problem without making the texture tiling somewhat obvious.

You should be able to. But you need to use nearest sampling to avoid mipmapping artifacts, and the devil is in the details of your implementation.
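To spell out why nearest sampling matters here, a tiny illustration (pure Python, values made up): the centerpoint map stores discrete per-stone values, so any filtering or mipmapping that blends two neighbouring stones’ values invents a centerpoint belonging to neither of them, while the wrap across the tile edge itself is handled by the repeat/mod of the lookup coordinate:

```python
import numpy as np

# Neighbouring texels in the centerpoint map, belonging to different stones:
# one stone centred near the right tile edge, one near the left edge.
a, b = 0.98, 0.02

# Linear filtering between them fabricates a centerpoint in the middle of
# the tile, so the mask gets sampled in the wrong place along the seam:
print((a + b) / 2)          # 0.5 -> belongs to neither stone

# Nearest sampling returns a stored value unchanged, and the repeat wrap of
# the mask lookup takes care of coordinates that spill past the tile edge:
print(np.mod(1.02, 1.0))    # 0.02 -> lands back on the correct spot
```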

In any case, there’s really no reason to prefer vertex color here anyway. Eventually, you’re going to bake it to all vcol or all image texture, and you can go from either one to either one.

By the way, is it technically possible to recreate the functionality of Substance’s Tile Generator using Blender nodes? Mainly, moving individual tile samples relative to each other, like pressing them closer together? Or will it require something which doesn’t exist as nodes, like for loops?

I’m not familiar with SD. It can probably do something that Blender’s nodes can’t: it can evaluate an entire image at once, whereas anything Blender does, it has to do for every sample.

For example, let’s say you want to pinch an image, using an image to represent the pinch. SD can say, “What is the brightest pixel on our pinch map? We’ll scale toward that point, by a factor that scales with the brightness of the pinch map.” Blender can’t do that. It can’t afford to evaluate every pixel of the image, for every single sample.
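Just to make the contrast concrete, here is a hedged NumPy sketch of that pinch as a whole-image operation (the kind of thing you could do offline or in the compositor after baking, but not per-sample in shader nodes); the 0-1 grayscale pinch map and the function are my own invention for illustration:

```python
import numpy as np

def pinch(image, pinch_map, strength=0.5):
    # Whole-image step: find the brightest pixel of the pinch map. This is
    # the global query a per-sample shader can't afford to make.
    cy, cx = np.unravel_index(np.argmax(pinch_map), pinch_map.shape)

    # Per-pixel step: pull each pixel's sampling coordinate toward that
    # point, by an amount scaled by the local pinch-map brightness.
    h, w = image.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    amount = strength * pinch_map
    src_y = (ys + (cy - ys) * amount).astype(int).clip(0, h - 1)
    src_x = (xs + (cx - xs) * amount).astype(int).clip(0, w - 1)
    return image[src_y, src_x]
```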

There are some useful techniques that can get around this, to some extent, in Blender. As you’ve discovered, you can make images that don’t represent color, and/or aren’t referenced by traditional coordinates. You can also shape your coordinate space using math. Finally, of course, you can bake textures, which allows you to use other tools, like compositing nodes, to evaluate the entire image at once, but yes, that’s cumbersome. For me, it takes some thinking to recast problems I want to treat as serial problems, doing a series of steps on an image, into parallel problems that approach things from the perspective of a blind, deaf sample, which can at best sniff out a gradient in an image.

I see. Well, evaluating the entire image is a little too advanced for me, I haven’t thought about it yet. So far I was interested exactly in this Tile Generator utility, since something like it would solve the masking-by-centerpoints problem even better. I was able to put an instance of a texture into a set position in UV space using Blender nodes, but the main inconvenience is duplicating it. To add a new “brush print”, I have to copy my node group each time and add/mix the instance cumulatively with the previous result. That won’t be an option if there are hundreds or thousands of instances needed. That’s why I mentioned for loops. Likely, Substance does it by script and re-evaluates the nodes in the background every time.
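For what it’s worth, the “for loop” can live in a Python script that builds the node tree once, rather than in the nodes themselves. Here is a rough bpy sketch under assumed names (a node group called "BrushPrint" with Offset X / Offset Y inputs and a Color output, accumulated with MAXIMUM math nodes); all of these identifiers are mine, not a standard setup:

```python
import bpy

mat = bpy.context.active_object.active_material
nodes, links = mat.node_tree.nodes, mat.node_tree.links

group_tree = bpy.data.node_groups['BrushPrint']   # hypothetical node group

prev = None
for i in range(100):                               # the "loop" happens here,
    inst = nodes.new('ShaderNodeGroup')            # at tree-building time
    inst.node_tree = group_tree
    inst.inputs['Offset X'].default_value = (i % 10) / 10.0
    inst.inputs['Offset Y'].default_value = (i // 10) / 10.0

    if prev is None:
        prev = inst.outputs['Color']
    else:
        # Accumulate the instances, e.g. by taking the maximum of the masks.
        mix = nodes.new('ShaderNodeMath')
        mix.operation = 'MAXIMUM'
        links.new(prev, mix.inputs[0])
        links.new(inst.outputs['Color'], mix.inputs[1])
        prev = mix.outputs['Value']

# 'prev' now carries the combined result; connect it wherever you need it.
```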