Texturing, Baking, Brush System improvements

I did not want to discourage you from making a layer system. I was just warning you that the connection between a layering system and the material node editor probably cannot be absolute.
That does not mean a partial/relative one cannot exist.

Currently, the Slots panel has two modes (see the sketch below):

  • Material Mode, listing all images used by the current material
  • Single Image Mode, which allows painting an image that is not used by the material but that influences a modifier or tweaks particles.
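
For reference, here is a minimal sketch of how those two modes can be inspected from Python; the property names are how I remember the 2.8x API (ImagePaint.mode, Material.texture_paint_images), so treat the details as assumptions:

```python
import bpy

# Image-paint settings live on the scene's tool settings.
paint = bpy.context.scene.tool_settings.image_paint

# 'MATERIAL' lists the images used by the active material,
# 'IMAGE' paints one freely chosen image (the "canvas").
print("current slot mode:", paint.mode)

if paint.mode == 'MATERIAL':
    mat = bpy.context.object.active_material
    for img in mat.texture_paint_images:  # the images exposed as paint slots
        print("slot image:", img.name)
else:
    # Single Image Mode: point the canvas at any image datablock.
    paint.canvas = bpy.data.images.get("MyPaintTarget")  # hypothetical image name
```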

We could easily replace the current material mode with several material modes.
An absolute material mode, working like the current one, that would list all images in the material.
And a layered material mode that would only support node trees made of a Principled shader + MixRGB nodes, with the admitted limitation of not being able to handle all node trees.
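
As an illustration only, such a layered mode could decide whether a material qualifies by whitelisting node types; the function and the whitelist below are hypothetical, not anything that exists in Blender today:

```python
import bpy

# Node types a hypothetical "layered material mode" could accept.
ALLOWED = {
    'BSDF_PRINCIPLED',   # Principled BSDF
    'MIX_RGB',           # MixRGB used as a layer blend
    'TEX_IMAGE',         # the paintable layers themselves
    'OUTPUT_MATERIAL',
}

def is_layerable(material):
    """Return True if the node tree only uses the whitelisted node types."""
    if not material or not material.use_nodes:
        return False
    return all(node.type in ALLOWED for node in material.node_tree.nodes)

mat = bpy.context.object.active_material
print(mat.name, "can use layered mode:", is_layerable(mat))
```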

So the workflow would be: when the artist starts to paint to prototype the asset, he would roughly use this layering system.
When he wants to become more technical, to insert procedural effects, to decouple things into different shader nodes, to emphasize some things with math/curve nodes, he stops painting and modifies the nodes.
And then, to finally adjust the image textures, he jumps back to the absolute material mode.

Your last post reminds me that there was also an old idea of supporting painting on a layered image format like .ora. All textures of a material could be stored as layers in one .ora file.
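
OpenRaster (.ora) is essentially a ZIP archive containing a stack.xml index plus the layers as PNG files, so a rough loader sketch could look like this (the file name is a placeholder, layer groups ignored):

```python
import zipfile
import xml.etree.ElementTree as ET

def list_ora_layers(path):
    """Yield (layer_name, png_path_inside_archive) for an OpenRaster file."""
    with zipfile.ZipFile(path) as ora:
        stack = ET.fromstring(ora.read("stack.xml"))
        # Every <layer> element references a PNG inside the archive via 'src'.
        for layer in stack.iter("layer"):
            yield layer.get("name"), layer.get("src")

for name, src in list_ora_layers("textures/character.ora"):  # hypothetical file
    print(name, "->", src)
```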

Guys, just check the Multitile UDIM addon by Antonio Mendosa.


It is what we need, with additional features and nicely planned, so there is no need to have the node editor open. Everything should work under the hood.

I've had a little dig/whinge about how normal map baking in 2.80 generally takes me almost double the time to get the same result I could get in 2.79.

Thinking of just sticking with 2.79 until something is done about all the additional steps needed to bake textures like normal maps using Cycles in 2.80.

FORUM POST LINK

I never said there was a lack of features in baking. It's about a lack of usability in baking.

So yeah, you can pretty much bake anything, but it takes way too long to do so. You can't bake multiple textures at once, it is a big challenge to replicate Substance Painter's name matching to prevent undesired projections from objects close to each other, you can't bake multiple objects into the same image texture, and the process of adding a new image texture, selecting the correct resolution, inserting it into the correct material and keeping it active at all times during baking is just tedious.
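
For what it's worth, most of that setup can be scripted today. A rough sketch of those manual steps with the 2.8x Python API (assuming Cycles is the render engine; names and settings are placeholders):

```python
import bpy

def bake_to_new_image(obj, name, size=2048, bake_type='NORMAL'):
    """Create an image, hook it into the active material and bake into it."""
    img = bpy.data.images.new(name, width=size, height=size)

    mat = obj.active_material
    mat.use_nodes = True
    node = mat.node_tree.nodes.new('ShaderNodeTexImage')
    node.image = img
    # Cycles bakes into the *active* Image Texture node of the material.
    mat.node_tree.nodes.active = node

    bpy.context.view_layer.objects.active = obj
    obj.select_set(True)
    bpy.ops.object.bake(type=bake_type)  # assumes the render engine is Cycles
    return img

baked = bake_to_new_image(bpy.context.object, "Suzanne_normal", 2048, 'NORMAL')
baked.filepath_raw = "//Suzanne_normal.png"  # hypothetical output path
baked.file_format = 'PNG'
baked.save()
```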

And this is time you can't afford to spend in any kind of professional production, unfortunately.

Okay, that kind of looks like a step in the right direction and fits one of my ideas nicely. But I still don't like the idea of this being an add-on. We want to move people from Substance Painter to Blender, and I don't see this happening with some “weird” add-on.

I’ll try to gather some thoughts, ideas and solutions soon and list the pointed-out downsides of those solutions. (Just give me a bit of time, I’m currently writing a thesis. ^^;)

This is an odd hangup to have, considering they add good add-ons to the Blender distribution all the time.

I don’t personally think add-ons are a bad thing. In fact I love that there is a Blender add-on for pretty much every problem you might encounter while using this software.

But I don’t think add-ons are that great for “advertising” a piece of software as a valid alternative :c

At least that’s my impression based on opinions I’ve heard from non-Blender-using artists when it comes to using Blender: if texture painting is supposed to be a strong feature of Blender, it has to be properly done in “vanilla Blender”.
Because as it is now, everyone thinks you don’t want to use Blender for texture painting, and I don’t see this opinion changing with the help of an add-on, unfortunately.

The biggest issue with Blender texturing is the missing UDIM support. If you are coming from another application as a texture artist, you will feel like you just downgraded to 1999 all over again because of this particular matter.

Other missing texturing features can be compensated for one way or another, I can even paint my textures in MS Paint, but the lack of UDIM can’t be compensated for easily, especially if you are contracting for a studio that expects UDIM textures from you.

I just did a project where I had to use many 8K textures in Blender because the lack of UDIM forced me to go that way, and it is not fun production-wise. And a single 8K texture (per layer) was not even enough resolution-wise, but we had to deal with it.

There needs to be a way to paint to individual RGBA channels. Using RGBA as mask channels is the norm in 3D game development. This is a major omission in Blender. You should be able to select/lock channels instead of having to create 3 or 4 separate textures and merge them externally with Photoshop.

There is a way using nodes, where you create three textures and use an RGB Curves node to disable the other channels. A layer manager as discussed here (and similar to BATS or Bpainter) could easily do that behind the scenes, without any work in the Blender core.

It’s important to note that nothing else works the way you describe, either. Substance Painter just combines different textures into the channels during the export process.
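
For what it's worth, that export-time packing is easy to reproduce with a small script. A sketch (image names are placeholders, and all inputs are assumed to share the same size) that copies three grayscale masks into the R, G and B channels of one new image:

```python
import bpy
import numpy as np

def pack_channels(r_name, g_name, b_name, out_name="PackedMask"):
    """Pack three grayscale images into the RGB channels of one new image."""
    sources = [bpy.data.images[n] for n in (r_name, g_name, b_name)]
    width, height = sources[0].size  # all inputs assumed to share this size

    out = np.ones((width * height, 4), dtype=np.float32)  # RGBA, alpha stays 1.0
    for channel, img in enumerate(sources):
        pixels = np.array(img.pixels[:], dtype=np.float32).reshape(-1, 4)
        out[:, channel] = pixels[:, 0]  # grayscale input: take its red channel

    packed = bpy.data.images.new(out_name, width=width, height=height, alpha=True)
    packed.pixels = out.ravel().tolist()
    return packed

pack_channels("metallic_mask", "roughness_mask", "ao_mask")  # hypothetical names
```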

Yeah … I think the main concerns regarding this topic can be solved by improving the UI. In fact, for most of my own complaints I know a bodgy (and sometimes a bit time-consuming) solution you can pull off yourself with minor scripts or clicking.
A big plus of Blender has always been the big amount of control given to users. (I’ve even proposed a bodged UDIM support in a private message.)

Big changes to the core will likely only be necessary when it comes to baking, the brush system and stabilizers.

I would welcome the use of cloning tools for texturing. Many times, I just want to clone a texture to cover a seam. Pick a ‘layer’ and clone pixels from another nearby part of the mesh, or have a scalable overlay of an image that can be pasted in place. These are the elements that I currently do ‘elsewhere’, along with what has been said on texturing in general.

You can already do that.
Blender has a clone brush, and there is the Stencil mapping mode for the texture brush.
There are tiling options (in the Image Editor, in paint mode, when the Repeat Image option is on) that have to be restored from 2.79b. They will probably come back for the 3D View too, like they are present for sculpting.
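
If it helps, the stencil setup can also be done from Python; a small sketch (brush and image names are placeholders, and I am going from memory on the exact enum value):

```python
import bpy

paint = bpy.context.scene.tool_settings.image_paint
brush = paint.brush  # the active texture-paint brush

# Attach an image as a stencil: it floats over the viewport and can be
# moved/scaled there before being stamped onto the mesh.
tex = bpy.data.textures.new("StampTexture", type='IMAGE')
tex.image = bpy.data.images.get("decal.png")  # hypothetical image datablock
brush.texture = tex
brush.texture_slot.map_mode = 'STENCIL'
```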

I am confounded by my ignorance of this. I had thought that I had Googled the topic, but apparently my inquiry didn’t bring any answers. I have now watched several videos, thanks to you. Back to work.

  • Maybe a seamless noise generator? (regular, Musgrave, etc.)
    Actually you can do it with torus geometry (https://www.youtube.com/watch?v=CJUzDoK_PrM), but it is time-consuming and not so accurate.

  • An option to bake multiple maps at once, e.g. diffuse, normal, height, AO (a scripted approximation is sketched below).
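
Until such an option exists natively, a loop over Cycles bake types can approximate it. A sketch, reusing the kind of setup from the earlier bake snippet (it assumes an active Image Texture node already exists in the material):

```python
import bpy

# Bake several Cycles passes of the active object in one go.
PASSES = ['DIFFUSE', 'NORMAL', 'AO']

obj = bpy.context.object
mat = obj.active_material
target = mat.node_tree.nodes.active  # assumes an active Image Texture node

for bake_type in PASSES:
    img = bpy.data.images.new(f"{obj.name}_{bake_type.lower()}", 2048, 2048)
    target.image = img          # the bake result lands in the active node's image
    bpy.ops.object.bake(type=bake_type)
    img.filepath_raw = f"//{img.name}.png"
    img.file_format = 'PNG'
    img.save()
```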

Thank you so much for the torus idea … this is a somewhat usable solution for a problem I wasn’t able to solve for a long time!

Hi betalars.
It is interesting to see you writing about some ideas.

You wrote …

Maybe adding a new datatype of a layered image that utilizes the material/compositing nodes backend for blending and stuff might be a better solution.

A layer which uses compositing would also enable OSL pattern generation. Very useful is a shader which renders objects to a height map.
I had a similar idea, and I tried to write about it. I’m using the compositor together with OSL and a height-map-generating shader for texturing.

With this approach, textures need to be rendered before they can be used. But it gives flexibility when playing with OSL parameters or nodes. A node tree takes up much visual space, shrinking the paint area, and it distracts from painting. I would love to exchange some ideas about a way which does not require the node editor to be open during painting. (But it could still be used, if desired.)
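
On the OSL side, at least enabling it and adding a Script node can be automated; a sketch (the .osl file path is a placeholder, and OSL requires Cycles on the CPU):

```python
import bpy

scene = bpy.context.scene
scene.render.engine = 'CYCLES'
scene.cycles.device = 'CPU'          # OSL only runs on the CPU in Cycles
scene.cycles.shading_system = True   # enable Open Shading Language

mat = bpy.context.object.active_material
mat.use_nodes = True
script = mat.node_tree.nodes.new('ShaderNodeScript')
script.mode = 'EXTERNAL'
script.filepath = "//heightmap_pattern.osl"  # hypothetical shader file
# Compile via the Script node's update button (or bpy.ops.node.shader_script_update
# from a node-editor context) so its input/output sockets appear.
```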


some ideas …

Actually, I am thinking about a kind of node tree which works like the compositor but is specialized for creating brushes.

Input Image could be an image file.

Input Image could be generated from a scene (local, or from an external Blender library?) which would be rendered on demand. Via scene rendering, we get access to OSL pattern generation or height map generation.

With an image file input, or generating the image from a scene alone, the node system would be limited to static image compositing. I would like to discourage introducing ‘standard procedural’ pattern generation nodes.
Is it possible to have languages like OSL/SeExpr enabled for compositing brush nodes? These could be used to create procedural patterns without the need to render a scene.

Inside the node tree, parameters could be defined. The parameters work like an abstraction layer; an artist can use these to control the brush. Parameters help liberate the artist from the need to keep the node editor open.


It’s just a collection of ideas. But what do you think about it?

Happy blending.

First of all, I’m sorry I’m not going to be able to answer you with sufficient depth, since progress on my bachelor thesis is currently alarming and I can’t distract myself too much right now. In two weeks it’ll be better and I’ll be able to come back to this.

In general I think a lot of your ideas are great and I’d love to see them working in Blender. I also recommend you have a look at Black Ink’s brush engine. It also utilizes nodes to create brushes.

I am sorry, I did not know the moment was not good. Please take the time you need.

Black Ink (Windows-only) does not work on my machine, I’m on Linux here. But there are some YouTube reviews online. Generative brushes, that’s it.


There are some experiments I did last year using OSL and Appleseed.

The link is for the Blender community. Some images show procedurally generated structures which could be used for brushes.
If you have time, you are welcome to visit “Fun perturbing texture nodes”.

That way, I have already used OSL to generate brush textures.
If someone would like to try …

We do need a scene for a ‘static’ generative brush.

In this brush generation scene:
Compositor: besides the Composite output node, there is a “File Output <Filename.png>” node.

The other scene you (the community) use for painting:
Texture editor (brush texture): an “Image” input node reading <Filename.png>.

Refreshing the rendered brush requires a scene switch.
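
That scene switch could probably be scripted away. A sketch (scene, image and brush names are placeholders) that re-renders the brush generation scene and reloads the result into the brush texture without leaving the painting scene:

```python
import bpy

def refresh_generated_brush(brush_scene="BrushGen", image_name="Filename.png",
                            brush_name="TexDraw"):
    """Re-render the brush generation scene and reload its output image."""
    # Renders the generator scene; its compositor File Output node (or the
    # saved render result) rewrites Filename.png on disk.
    bpy.ops.render.render(write_still=True, scene=brush_scene)

    img = bpy.data.images.get(image_name)
    if img is not None:
        img.reload()  # pick up the fresh file without switching scenes

    # Make sure the painting brush actually uses that image as its texture.
    brush = bpy.data.brushes.get(brush_name)
    if brush and brush.texture and brush.texture.type == 'IMAGE':
        brush.texture.image = img

refresh_generated_brush()
```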


Good luck with your bachelor thesis.