Hey, since Eevee is going to be a PBR-based system, I suggest that the paint tools also get updated to allow material creation in a proper PBR way.
Right now, the paint tools only let you paint one PBR channel at a time; for example, you can't assign a normal map, roughness, and albedo to a layer and paint on your model with all maps updating simultaneously. This makes creating PBR materials within Blender practically impossible right now.
Having a PBR ready paint tool in Blender would be a massive step forward.
If possible, it would also be handy to be able to set a height image that affects the surface in Sculpt Mode at the same time (though this is not overly important, as in some cases you could just create a heightmap layer in the paint tool and apply it as a displacement).
I just want to gauge whether other users would be interested in this too. And could any dev point me to the parts of Blender's code base that deal with Texture Paint mode and the layers system? Maybe I can get multiple-layer painting done myself, but I have no idea where the code is situated.
It would be amazing to be able to paint with materials in real time, without dealing with lots of configuration every time you want to change materials. Just choose a material and paint, with the possibility of overlapping materials with different blend modes, and using paint layers.
But I really don't understand how other 3D painting programs work. They aren't based on vertex painting, right? Do they use UV maps?
@m9105826, are you saying some of this is already possible in Blender? Where can I find something about painting alpha masks?
I think it would be good to revisit texture painting in Blender as part of the 2.8 series, since the viewport/OpenGL code required to make this work smoothly, in the way users expect, will then be in place.
While they’re at it, they could also add visibility features such as a symmetry line and an on-the-mesh outline of the brush (as I recall, such feature additions were hindered by the ancient viewport code in 2.7x and before).
Add all of your “substances” (pre-built PBR materials with albedo, roughness, metalness, etc.) to a single Blender material node. Cascade them with a series of mix nodes from your base material (skin, rock, bare concrete, etc.) to your upper layers (dirt, dust, surface paint, etc.), plug paintable alpha maps into the factors of each mix node, and paint grayscale on those alpha maps to define your material areas.
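The mix-node cascade described above is, per pixel, just linear interpolation driven by the painted grayscale mask. A minimal sketch in plain Python (the layer names and channel values are hypothetical, and albedo is reduced to one channel for brevity) of how a base "substance" and an upper layer blend through a painted alpha:

```python
def lerp(a, b, t):
    """Linear interpolation; the math a Mix node performs per channel."""
    return a + (b - a) * t

def blend_layers(base, layer, mask_value):
    """Blend two PBR layers (dicts of channel values) through a
    painted grayscale mask value in [0, 1]."""
    return {ch: lerp(base[ch], layer[ch], mask_value) for ch in base}

# Hypothetical base rock layer and dirt layer on top of it.
rock = {"albedo": 0.55, "roughness": 0.80, "metallic": 0.0}
dirt = {"albedo": 0.30, "roughness": 0.95, "metallic": 0.0}

# mask 0.0 -> pure rock, 1.0 -> pure dirt, 0.25 -> mostly rock.
result = blend_layers(rock, dirt, 0.25)
```

Cascading more layers just repeats this, feeding each blend result in as the next `base`, which is exactly what a chain of Mix nodes with painted factor maps does.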
The “special sauce” behind the scenes in software like Quixel and Substance is their clever automatic handling of this masking process, their included material libraries, and obviously the massive amount of R&D they put into getting painting running as smoothly as possible.
The only big wrenches in this process in Blender are that:
- you can’t currently bake a roughness map in Cycles (forcing you to keep your complicated painting setup around at render time),
- you have to do some additional setup if you want to use the actual displacement socket on the material output during painting, and
- until Eevee is ready for primetime, PBR preview visualization during painting is lackluster.
And if you want to output the results, you’re left baking textures. Clearly this isn’t as friendly as Substance Painter!
You could have one image where the red channel drives roughness, green drives metallic, and blue feeds a color mix between two colors. Then you just set the red, green, and blue values in your color picker to whatever you want for each property and paint into your mask.
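The channel-packing trick above amounts to decoding each painted RGB pixel into separate material parameters. A rough sketch of that decode step, assuming (as suggested) that blue is the factor of a mix between two preset colors; the function name and preset colors are made up for illustration:

```python
def decode_packed_pixel(r, g, b, color_a, color_b):
    """Unpack one painted RGB value into PBR parameters:
    R -> roughness, G -> metallic, B -> mix factor between two colors."""
    albedo = tuple(ca + (cb - ca) * b for ca, cb in zip(color_a, color_b))
    return {"roughness": r, "metallic": g, "albedo": albedo}

# Painting pure red (1, 0, 0) gives a fully rough, non-metallic
# surface using the first preset color.
params = decode_packed_pixel(
    1.0, 0.0, 0.0,
    color_a=(0.8, 0.1, 0.1),  # hypothetical preset color A
    color_b=(0.1, 0.1, 0.8),  # hypothetical preset color B
)
```

In the material, the equivalent node setup would be a Separate RGB node feeding the roughness and metallic sockets plus the factor of a Mix RGB node.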
If the paint system is addressed, one major thing is to get it to blend in linear color space instead of sRGB. You can see the problem when painting directly in Mix mode with one color into another. I work around this by using other texture image slots set to Color or Overlay, etc.
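The sRGB problem comes from blending gamma-encoded pixel values directly instead of converting to linear first. A quick illustration using the standard sRGB transfer functions (the black/white mix is just an example):

```python
def srgb_to_linear(c):
    # Standard piecewise sRGB decoding (IEC 61966-2-1).
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    # Inverse of the above: linear light back to sRGB encoding.
    return c * 12.92 if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

def mix(a, b, t=0.5):
    return a + (b - a) * t

black, white = 0.0, 1.0

# Naive 50/50 mix performed on the encoded values: 0.5,
# which displays noticeably darker than a true mid-gray.
naive = mix(black, white)

# Correct: decode to linear, mix there, re-encode (about 0.735).
correct = linear_to_srgb(mix(srgb_to_linear(black), srgb_to_linear(white)))
```

That gap between 0.5 and roughly 0.735 is exactly the darkening you see when painting one color into another in Mix mode.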
But it does let you set a normal map, diffuse, roughness, etc. and paint directly on an object in paint mode, with all images being painted simultaneously. As an addon it must be a Python hack, but if this can be hacked to work even as an addon, we should be able to do it properly in the source code. I can't spare even the $15 to test it and look at the Python involved, but if someone could, and let me know which files it's hacking, I can at least track down where the code is situated in the Blender code base.
I’m currently writing an addon that addresses this problem, and it’s now capable of painting into multiple textures at the same time. But the interface still has a long way to go, since defining brushes that deal with multiple layers requires a new approach in terms of UI and code structure…
Won’t it take a serious performance hit for large brush areas once it’s dealing with pixels from multiple targets? I know that Armory handles textures differently than Blender, but in stock Blender I still get performance hits when I scale up my brush and paint a procedural mask over, say, a 1024-pixel or greater area. Just curious how the brush will feel when painting across the different images.
I had imagined the solution was still to paint into one image, separate the channels out to adjustment nodes, and then feed them into the inputs of the Principled BSDF. That way you mostly get the multi-channel effect for free without too much of a hit — but you have to use Cycles material draw mode to get the GLSL feedback, or hope that the paint system will be easier to use with Eevee.
Yes, I’m planning to release it for free, which means I still need to work on other projects to earn money, which leads to less time to program it. So ASAP is not an option (unless you want to support it). And yes, I thought about adding it to the main code base, but there are plenty of inconveniences at this stage (long compile times, difficulty debugging, and all the other Blender code getting in the way).
For now, a brush covering something like 1024² pixels is no problem, as everything is done on the GPU and takes only as long as the drawing itself. Also, I’m aiming to introduce vector painting, which will make it possible to change the final resolution later without having to repaint everything.