I currently want to texture a scene and want to achieve the following workflow/result:
I want to use a shader (basically diffuse, specular and normal map) as the base for the object and add additional details like scratches/rust with additional shaders, applying them with alpha masks that I paint in the 3D View.
The point is: the additional shaders already have an alpha channel that I want to use, so that even if the rust shader is applied everywhere, there are still some holes where the base shader shows through. That’s because I can’t figure out how to use brushes effectively in Blender and want some natural-looking rust holes etc.
The problem is: when I paint my mask to define where the additional shader should be visible, the holes from the alpha channel are still there, even where the base shader should show underneath. This might sound complicated, but the image should make it clear.
Do you guys have a good workflow or a different approach to texture a scene like this?
My next approach is to somehow combine the alpha channel of the texture with the mask that I’m painting.
Thank you for your help!
P.S. I couldn’t post topics with my old account, so I had to make a new one and can only post this one resized picture. I hope it’s still visible.
I’m not sure I understand. From what I can tell, you have one texture with an alpha channel, and you want to use that alpha channel to mix between materials, but it’s overriding your painted mask in ways you don’t want.
So? Don’t use the alpha channel. Blender hates alpha channels anyways or else we’d have 4-channel color and vertex color alpha. (Does a great job with alpha sorting though, have to give it that.) Create your own texture and use its white/black values to determine the mix instead of using alpha.
If I’ve understood you correctly, then all you need to do is paint a black and white mask, add that to the alpha from the diffuse texture, and plug the result into the factor of the Mix Shader:
I made a crappy brick texture that has its own alpha channel for this example. The texture’s own alpha leaves out the floor and the roof. Inside a new blank image, I painted white in the spots where I wanted the texture to appear, then I added that image to the alpha from the brick texture (I clamped it to keep values from exceeding 1):
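For anyone who wants to check the math, the clamped Add boils down to the arithmetic below, sketched in plain Python (the function name and sample values are just illustrative; in Blender this is a Math node set to Add with Clamp enabled):

```python
def mix_factor(painted_mask, texture_alpha):
    """Clamped Add: the Mix Shader factor goes to 1 (second material)
    wherever EITHER the painted mask or the texture's alpha is white."""
    return min(painted_mask + texture_alpha, 1.0)

# Illustrative per-texel values:
print(mix_factor(0.0, 0.0))   # 0.0  -> base material only
print(mix_factor(1.0, 0.5))   # 1.0  -> clamped, fully the brick material
print(mix_factor(0.5, 0.25))  # 0.75 -> partial blend
```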
I don’t know what you mean by that. The alpha is separate because of some technical limitation with shaders (I think alpha has to be a separate shader or something). As for vertex alpha, that was added in a google summer of code project, but they haven’t exposed it in the node system yet.
Wait, now that I think about it: for what you want, you’ll need to invert the alpha channel from the diffuse texture and subtract that from the mask you paint for mixing the two materials. You just need to make sure that the areas that are transparent in the rust texture stay transparent in the mask.
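In node terms that is an Invert on the texture’s alpha followed by a clamped Subtract. The same arithmetic sketched in plain Python (names and sample values are illustrative):

```python
def mix_factor(painted_mask, texture_alpha):
    # Invert the texture's alpha, subtract it from the painted mask,
    # and clamp at 0 so fully transparent rust spots always reveal the base.
    return max(painted_mask - (1.0 - texture_alpha), 0.0)

print(mix_factor(1.0, 0.0))   # 0.0  -> a hole in the rust stays a hole
print(mix_factor(1.0, 1.0))   # 1.0  -> painted and opaque: rust shows
print(mix_factor(0.75, 0.5))  # 0.25 -> partial blend
```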
Well, it’s not really relevant, just talking-- OP can solve the problem using any color channel desired, including alpha.
But I do think Blender has poor support for alpha, both historically and currently.
I am looking forward to support for an alpha channel in vertex color; I don’t think it really matters whether color output is 3+1 or 4 channels (although if there is some technical limitation on combining shaders — not that 4-channel color requires that — I imagine the 2.8 Principled shader must have represented a real breakthrough). 4-channel color would, however, fix what sounds like the major barrier to vertex color alpha support, although I have no idea why that is an issue; the Attribute node is already the debugging/whatever node, full of arbitrary codes: just send it “Col.a” or “Col[3]” or “VCAlpha[“Col”]”. Since that is apparently an issue, it wouldn’t at all surprise me to find out that 3-channel color is also what held back alpha bakes.
The main reason to avoid the alpha channel right now in Blender is the difficulty of creating an alpha channel for a baked texture, which requires an alpha bake separate from an RGB bake, plus some swizzling in GIMP or the compositor to boot. That means you might as well use grayscale “alpha” maps instead, because at least they cut out one of those steps.
When I see issues like that persist across versions while we get things like a square root node (wut), it makes me suspect that the devs must not use those kinds of features in their personal use of Blender, or else somebody would have found the time for them.
Thank you both for your time and effort. I think zanzio’s way is the right way to do it, but in my particular setup I found that a Multiply node combining the alpha channel of the texture with my mask works best for me (it may depend on whether you paint the mask white-on-black or the other way round). So if anyone else has the same issue, here is the (simplified) setup that helped me achieve the result on the right.
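For reference, the Multiply combination comes down to the arithmetic below (a plain-Python sketch; names and values are illustrative): the rust only appears where the painted mask AND the texture’s alpha are both white, so a hole in either channel wins.

```python
def mix_factor(painted_mask, texture_alpha):
    # Multiply: rust shows only where the mask is painted white AND
    # the texture's alpha is opaque; black in either channel hides it.
    return painted_mask * texture_alpha

print(mix_factor(1.0, 0.0))  # 0.0 -> alpha hole stays a hole
print(mix_factor(0.0, 1.0))  # 0.0 -> unpainted area keeps the base shader
print(mix_factor(1.0, 1.0))  # 1.0 -> full rust
```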