Texturing, Baking, Brush System improvements

Please note: this topic was already posted on the Blender forum, but I was advised to also share it here…

Hello Blender Community!

I am currently looking into ways to improve Blender's texturing capabilities. I am a Blender artist from Germany who also knows a thing or two about coding, and I have managed to bring another fellow coder on board to support me during development.
I'm making this thread because of the whole Substance situation, but also because of the Blender Foundation stretch goals regarding texturing improvements, and I'm looking forward to your ideas and feedback on this topic.

I think the main weak spots of Blender regarding texture painting are currently:

  • a lack of utility in baking
  • the brush system
  • missing artist utilities such as guide lines in texture painting, stabilizers, and layers (currently only possible with the help of the node editor)
  • a lacking interface between texture painting and the node editor (and therefore lacking support for filters or procedural masks)
  • a lack of presets (I know this is kind of a difficult topic with vanilla Blender and will probably need an add-on to be addressed)
  • a lack of global color palettes
  • no support for normal brushes
  • improper support for particle brushes

I'll post suggestions to fix some of those issues, but first I want to hear a bit about your thoughts on these topics, and maybe about other problems I have missed.


Baking is handled best in add-on form right now; different artists have put up free and paid add-ons for it.

The brush system is being overhauled, so maybe check the design docs William Reynish posted before tackling that.

Layers get brought up a lot; this is something the BPainter and BATS add-ons have tried to facilitate through an interface representation of the node editor. I prefer the node editor, personally.

Yes, I would love a special output node in the Compositor that sends its output to an input node pluggable wherever an image node goes in the shader tree, so that I could use the filters, color controls, masking, etc. live, with feedback in the 3D View. Tickle me Elmo on that one.

Presets - I think the asset manager has to be completed to make sure effort isn’t wasted.

Color palettes? You mean like the ones you create in Texture paint that you can name and save/import? Or does this go further?

Normal brushes? Do you have a video link to how this is going to work? I have seen something once, but not sure how you mean to do it here. That would be very welcome.

Particle systems are beneficial, but something tells me that there is a lot under the hood to make them work correctly, as it is almost like live baking/dynamic paint. I’ve used dynamic paint mixed with the compositor to get different effects, not easy.

We need the UDIM support completed, that will help a lot toward getting some better resolution management.


Thank you for your input, but before addressing it, first a bit of context (because I think I might have done a bad job explaining this): this is not meant to be some kind of "Blender is bad" rant. It is mainly feedback I gathered from speaking with artists and from using Substance Painter. I just wanted to throw these thoughts in, without thinking too hard about implementation yet, to get a brainstorming/discussion started and gather further ideas, so we can come up with the solutions we actually need.

Now some of my comments regarding your thoughts:

You're right, but I think that does not help in selling Blender as a good Substance substitute.

Right, and I think a better node editor/texturing interface is the right way to go. BUT: nodes take more time to set up and are also scary for some less techy artists. A better connection between the 3D View and the Image Editor might be a good idea, too.

That'd be so great. But I fear it might be challenging to implement :c

Global color palettes that let you not only plan, import, and export color schemes, but also change colors project-wide… There's actually an add-on that kind of does that, but it seems not to work outside the material node editor, so it's no use for texture painting.
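For what it's worth, the core mechanic of such a global palette is tiny: materials store a reference into a shared, named palette instead of a raw color, so one edit retints the whole project. A pure-Python sketch of the idea (all names here are made up for illustration, not Blender API):

```python
# Shared, named palette: the single source of truth for the project's colors.
palette = {"wall": (0.8, 0.7, 0.6), "trim": (0.2, 0.2, 0.25)}

# Materials reference a palette entry by name instead of storing an RGB value.
materials = [
    {"name": "house_wall", "color_ref": "wall"},
    {"name": "door_frame", "color_ref": "trim"},
    {"name": "garage_wall", "color_ref": "wall"},
]

def resolve(material):
    """Look up the material's actual RGB through the shared palette."""
    return palette[material["color_ref"]]

# Change the color scheme project-wide with a single edit:
palette["wall"] = (0.4, 0.5, 0.8)
print([resolve(m) for m in materials])
```

The point is that the indirection, not the UI, is what makes "change colors project-wide" possible; an actual implementation would need to hook this into Blender's datablocks.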

It's a thing in Substance Painter, where you can use a normal map as a brush: it automatically rotates the normals around when you paint onto a normal map, so you can paint from any angle and it still works.
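In case a sketch helps: the core of such a normal brush is decoding each map pixel's RGB into a tangent-space vector, rotating its in-plane part by the stroke's angle, and re-encoding it. A minimal pure-Python illustration (my own helpers, not Substance's or Blender's actual code):

```python
import math

def decode(rgb):
    """Map an 8-bit normal-map pixel (0..255 per channel) to a tangent-space vector (-1..1)."""
    return tuple(c / 255.0 * 2.0 - 1.0 for c in rgb)

def encode(vec):
    """Inverse of decode: pack a (-1..1) vector back into 0..255 channels."""
    return tuple(round((c + 1.0) / 2.0 * 255.0) for c in vec)

def rotate_normal_pixel(rgb, angle_rad):
    """Rotate the x/y part of a tangent-space normal by the brush angle.
    z (how much the detail points 'up' out of the surface) is unaffected
    by spinning the brush, so only x and y get the 2D rotation."""
    x, y, z = decode(rgb)
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    return encode((x * c - y * s, x * s + y * c, z))

# A pixel leaning purely along +x, with the brush rotated 90 degrees:
flat = (255, 128, 128)
print(rotate_normal_pixel(flat, math.pi / 2))
```

A real brush would do this per pixel of the stamp, using the stroke direction on the surface as the angle.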

That's the way you'd have to tackle it right now as a user. I think if you can actually use Blender's backend, there will be better ways to implement this. But don't worry … particle brushes are quite laggy in Substance as it is, so Blender can't do much worse. ^^;

I'd also love to hear which features you're missing yourself in Blender that would make it more usable for texture painting.

Yeah, most people don't realize it, but the reason people love Substance Painter is that it's NOT node based; things are much easier and faster to do. Nodes are not the solution for everything.


Nodes are indeed not exposed, but I'd argue that they still describe how the interface actually works under the surface. To each their own, but there are a lot of UI things to fix if Blender is to mimic that workflow without using nodes.

I have made several posts on rightclickselect.com, but here are a few:

  1. Let the Mask object in the 2D editor be usable in screen space in the 3D View, or at least as a way to directly control the Fill brush in the 2D editor.
  2. Warp a stencil while it is still in use in the 3D View (texture brush stencil mapping mode).
  3. Visual feedback for the Cavity mask.
  4. Face Select masking in the 2D UV/Image Editor (wish they hadn't split them up now).
  5. Handle brush curves and color ramps/gradients as object data so that they can be stored and named, similar to the way the Curve Stroke object can be named and recalled.
  6. The old Texture Nodes allowed me to mix multiple texture types into a single output to use as a texture brush; with the Everything Nodes project, I'd like to see this possible again, as well as using combined node textures as brush masks with the different mappings.
  7. Better in-viewport redraw of texture brush changes: when using an image sequence as a brush, I still see the first image despite having changed it via the timeline. It would be good to see it as it is, to make better choices while painting.
  8. Normal-direction painting: there is some work on the sculpting side adding a brush that changes the display to show which direction the normal faces and sculpts based on that. I'd love to see this as an option for Texture Paint so we can get better 3D mapping for procedurals.

I gotta think more, but these are a few.


@betalars I agree with all your points. I also think this post:


is really valuable for addressing one of the main weak spots in Blender's Texture Paint mode, which makes me redo strokes way too often…

Anyway, thanks a lot for taking an interest in improving texture painting mode. As a fellow Substance package owner who got really disappointed by the Adobe acquisition, I couldn't be happier. I just wish I could get a refund and donate it to projects such as yours.


A Blender masking generator + UI should be made really simple to use, yet powerful enough to cover any sort of input. This one can serve as inspiration: https://polycount.com/discussion/183936/enodmi-mask-generator-v2

Thank you. I will have a look into that!

I'm also thinking about blending Grease Pencil and texture painting, but so far only for the sake of having the systems work together … I'm not too sure about the use cases yet…

I think Substance Painter is popular because its layer system is so familiar to people who are used to Photoshop. People have been very negative about the whole Allegorithmic/Adobe thing, but it's important to remember that Photoshop is the industry standard for a reason. If you forget all the new-fangled features and bloat, the underlying system works really well. A layer system that works like those in Substance Painter (and Photoshop) would be the ideal - it would be even better if the user could also play with the node system - almost like a Substance Designer Lite.

Painter shines when you start to add layer upon layer of subtle detail. Being able to build up multiple roughness layers for example instead of having to cram all the roughness info into one image in one go is fantastic.

The other main draw for me personally was Substance Painter’s great generators, noises and filters.

I have perpetual licences for SP and SD and I also have a full Adobe account. But that doesn’t stop me wanting to see improvements and new features in Blender. The more the merrier.


That's why I think it is so important to have a working link between texture painting and node editing, because all the PS layer stuff can be built with nodes. And it'd be so great to just have a script that automatically creates the node tree while giving the user the impression of using layers…


Speaking of blending Grease Pencil and texture painting: we really need Grease Pencil's curve control of brush settings in texture painting, at least for opacity and size, but hopefully for other settings as well. Having to deal with 1-pixel-wide strokes whenever I use the incredibly restrictive "enable pressure sensitivity for size" gives me headaches.

Also, antialiased brushes would be a godsend.
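For reference, the Grease Pencil-style curve control mentioned above boils down to remapping raw pen pressure through a user-editable falloff curve before it scales the brush size. A pure-Python sketch (the piecewise-linear curve is a simplification; Blender's actual CurveMapping supports other interpolation modes):

```python
def eval_curve(points, t):
    """Piecewise-linear evaluation of a falloff curve given as (x, y) points."""
    pts = sorted(points)
    if t <= pts[0][0]:
        return pts[0][1]
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if t <= x1:
            return y0 + (y1 - y0) * (t - x0) / (x1 - x0)
    return pts[-1][1]

def brush_size(base_size, pressure, curve_points):
    """Brush radius after remapping raw pen pressure (0..1) through the user curve."""
    return base_size * eval_curve(curve_points, pressure)

# A curve that keeps light-to-medium pressure thin instead of ramping up linearly,
# so you get fine control at the start of a stroke without dropping to 1 px:
soft_start = [(0.0, 0.0), (0.5, 0.1), (1.0, 1.0)]
print(brush_size(20.0, 0.5, soft_start))
```

With a curve like this, the artist shapes the pressure response per brush instead of being stuck with the fixed linear mapping.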


That is the purpose of the Slots panel.
If you have ideas to improve it, go ahead.
It is far from an easy task to manage a node material globally.
We have the ability to plug curves, color ramps, procedural patterns, or images into any factor and to make any kind of shader blend.
If you manage to have something that works efficiently for basic Principled shaders, that would already be a great success.

The best place for things like that would probably be an editor that is not as specific as the node editor: an Image Editor view or a Properties tab.

We can say that there is none.
The particles UI is made to produce particle animations.
Dynamic Paint may use particles, but it was initially created to produce wetmaps.
The goal of Dynamic Paint is also to produce image sequences for animation.
If you want to create a particle brush, there is no reason to try to make a connection between Texture Paint mode and the existing particles UI.

There is nothing by default to queue baking tasks. Without add-ons, texture baking means one image per object.
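A sketch of how such a queue could look as a user-level script (the task-list helper and per-object setup are my own assumptions, not an existing add-on; `bpy.ops.object.bake` is the real Cycles bake operator, and the script would need images and UVs set up per object beforehand):

```python
# Sketch: queue one bake task per object and per bake pass, so baking many
# objects becomes one script run instead of repeated manual clicks.

try:
    import bpy  # only available inside Blender
except ImportError:
    bpy = None

def build_bake_queue(object_names, map_types=("DIFFUSE", "NORMAL")):
    """Pair every object with every requested bake pass (pure bookkeeping)."""
    return [(name, map_type) for name in object_names for map_type in map_types]

def run_queue(queue):
    """Run the queued bakes. Assumes each object already has an active
    image texture node to receive the bake, which Cycles requires."""
    if bpy is None:
        raise RuntimeError("run_queue must be executed inside Blender")
    for name, map_type in queue:
        obj = bpy.data.objects[name]
        bpy.context.view_layer.objects.active = obj
        obj.select_set(True)
        # 'type' matches Cycles' bake pass names ("DIFFUSE", "NORMAL", ...)
        bpy.ops.object.bake(type=map_type)
        obj.select_set(False)

queue = build_bake_queue(["Body", "Head"])
print(len(queue))  # 2 objects x 2 passes -> 4 tasks
```

The add-ons mentioned earlier in the thread do essentially this, plus the tedious parts (creating target images, cages, margin settings) that this sketch leaves out.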

Okay, but there's really no easy one-click solution for adding a new layer, adding masks, switching blend modes, adjusting blending factors, or simply merging two layers. And that's exactly the stuff artists rely on in their work, and it is not supported in Blender.
(I think a script that simply moves image textures around within a shader group node might already be a good solution for this issue.)
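To illustrate why a script like that could work: a Photoshop-style layer stack is just a bottom-to-top fold of blend operations, and each step maps onto one MixRGB node (blend type plus the Fac input). A pure-Python sketch of that equivalence, simplified to a single channel with made-up layer values:

```python
# Blend formulas, one channel, values in 0..1. Each corresponds to a
# MixRGB blend type of the same name.
def blend_mix(base, top):      return top
def blend_multiply(base, top): return base * top
def blend_add(base, top):      return min(base + top, 1.0)

BLEND_MODES = {"MIX": blend_mix, "MULTIPLY": blend_multiply, "ADD": blend_add}

def composite(layers):
    """layers: list of (value, mode, opacity), bottom layer first.
    Each loop iteration maps 1:1 onto one MixRGB node in a chain:
    mode = the node's blend type, opacity = the node's Fac input."""
    result = layers[0][0]
    for value, mode, opacity in layers[1:]:
        blended = BLEND_MODES[mode](result, value)
        result = result + (blended - result) * opacity
    return result

stack = [
    (0.8, "MIX", 1.0),       # base color layer
    (0.5, "MULTIPLY", 1.0),  # dirt layer
    (0.2, "ADD", 0.5),       # half-strength highlight layer
]
print(composite(stack))
```

Because the mapping is this direct, a layer UI could own the stack and regenerate the MixRGB chain in the material whenever the user reorders layers or changes a mode.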

I'd even go so far as to say this should be implemented within the "color box." -> (a first idea by me on the Blender forum)

Yeah … "budget do-it-yourself support" would have been the better term :sweat_smile:

Also, the whole name-matching utility of Substance Painter is unfortunately missing. Especially with the great new layer system in 2.8, I really want proper support for this!

There is none because a material is not just made of textures and a single way to blend them.
In 2.79, the Slots panel had to be usable by two render engines (BI and Cycles) with two different material systems.
In 2.8, EEVEE and Cycles should use the same nodes.
But you are not always mixing just two image textures.

You can mix a BW painted image with a Pointiness attribute by using a Math node. You can add another Math node after that to mix this result with a Gradient texture. Then you can plug this result into a Color Ramp and finally use it as the factor of a MixRGB node. The purpose of this node could be to mix another painted image, with colors weighted by an RGB Curves node, and vector attributes.
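Written out as plain math, that chain looks like this (a pure-Python sketch with made-up sample values and a simplified linear color ramp, just to show how the factor flows through the nodes):

```python
def math_multiply(a, b):  # Math node, Multiply operation
    return a * b

def math_add(a, b):  # Math node, Add operation
    return a + b

def color_ramp(fac, stops):
    """Linear ColorRamp over sorted (position, value) stops, clamped at the ends."""
    stops = sorted(stops)
    if fac <= stops[0][0]:
        return stops[0][1]
    for (p0, v0), (p1, v1) in zip(stops, stops[1:]):
        if fac <= p1:
            return v0 + (v1 - v0) * (fac - p0) / (p1 - p0)
    return stops[-1][1]

def mix_rgb(fac, a, b):  # MixRGB node, Mix mode, single channel
    return a + (b - a) * fac

# One sample point pushed through the chain described in the post:
painted_bw = 0.6   # hand-painted BW mask value at this point
pointiness = 0.3   # Geometry > Pointiness attribute
gradient   = 0.5   # Gradient Texture value

fac = math_multiply(painted_bw, pointiness)     # first Math node
fac = math_add(fac, gradient)                   # second Math node
fac = color_ramp(fac, [(0.2, 0.0), (0.8, 1.0)]) # Color Ramp
result = mix_rgb(fac, 0.1, 0.9)                 # MixRGB between two inputs
print(result)
```

The point of the original argument stands: none of these intermediate steps is "a layer," which is exactly why a layers panel cannot faithfully represent an arbitrary node material.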

In a node material, the same Image node can be connected to several different downstream nodes in the node tree.

So there is no bounded blending.
You have to ask yourself: how should the panel transmit info about a blend that does not correspond to a Mix node? How should it transmit info about a layer or a mask that corresponds to an input that cannot be painted?

Currently, the solution adopted is to skip that aspect entirely and just focus on listing the images that exist in the material. A simple solution that works in all cases, does not crash, and does not reduce material creation to a layering of images.

Of course, this situation should not prevent developing solutions to build materials by painting in layers.
But more technical, mathematical, procedural materials should also be able to handle an image that can be painted in Texture Paint mode.

I like zafio's last proposal. I think it just lacks a datablock name field at the top or bottom of the Color Picker panel. But as is, that is still a per-color solution, and not really what you showed with the Kaleidoscope add-on.

I am very aware of the huge benefits a node system has over layers. But it's just not practical for layering images the way you'd do it in most 2D applications.
I also agree that my proposed solution of just using the material editor for texture painting is not a great idea. Maybe adding a new datatype, a layered image that uses the material/compositing node backend for blending and such, might be a better solution.

Also, it'd really be beneficial to discuss this in person somehow.

(And trust me, I'm also not happy with my own color palette system at the moment. My posts are really not meant to provide perfect solutions, but to discuss possibilities to make Blender more usable for the needs of 2D artists, who often deal with color painting…)


Regarding nodes vs. layers: honestly, I think we simply need to look at what has made both Designer and Painter a success. Designer uses nodes to design substances, a.k.a. smart materials, masks, etc. Painter takes those smart materials designed with nodes and lets you color in or paint on assets with layers.

Layers are the way to go when actually painting and texturing an asset. As in Modo, layers could also exist the same way an object tree or outliner exists, if that makes it easier to implement. 3D-Coat and Substance Painter use the Photoshop approach, which would probably require more work but also becomes instantly familiar, and thus more functional, to the end user.

Nodes -> designing the things you paint or apply to a mesh, but the layers are where those things exist in order to be easily managed.

ArmoryPaint for Blender probably has the right general idea in this regard, taking the best of both and combining them (which a lot of Substance users wanted originally anyhow): node & paint. The workflow could probably be a lot better and deeper, but the combination seems to work well.


Blender has powerful baking capabilities, imho: the possibility to bake absolutely any type of texture map, and also to pack them into RGB channels, plus good options to fix a baked map in Texture Paint mode. But with high mesh density and a lot of objects, baking becomes very slow; before baking even starts, Blender has a long delay, and this is a big problem. Blender Internal baking was very fast, used all CPU cores, and didn't have a long delay like Cycles. But Internal doesn't have a cage; if Internal baking could use a cage, that would be a big improvement.


“lacking interface between texture painting and node editor (therefore lacking support for filters or procedural masks)” + layers system

Yeah, that's all we need.