Bake Wrangler - Node based baking tool set

Actually, I’m not really sure how the High to Low node works or what it’s for. :sweat_smile:

I just do this:

It’s not a big deal, but I’d love it if there was maybe some intermediate node you could put halfway between the objects and the passes to get rid of that spider web in the middle. One that connects each of the inputs to each of the outputs, but retains the information that they have to be baked in turn.

That or another solution to the web. It’s just not very readable.

So the current version could replace some number of your Input Mesh nodes with a single High to Low node. But you would still need a different one for when you needed a different ‘Scene’ input. I also hadn’t really figured out how to go about matching a single target to a group of high poly sources. But we can come up with some means of doing that…

Basically I would want your above tree to have a single Input Mesh connected to a High2Low node that would then connect to the passes and be able to figure out how to group the objects.

My current implementation only works for 1:1 mappings and also goes in place of the Input Mesh node (which I think I don’t like). So it could map all your LPx to HPx objects automatically, but not the groups of HP objects. Supporting the groups could be done via matching Collection names and/or some set of naming conventions.
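To make the naming-convention idea concrete, here is a minimal sketch of how such matching could work. The `LP_`/`HP_` prefixes and the handling of Blender-style `.001` duplicate suffixes are purely hypothetical assumptions for illustration, not anything Bake Wrangler actually does:

```python
import re
from collections import defaultdict

def match_high_to_low(names, low_prefix="LP_", high_prefix="HP_"):
    """Return {low_poly_name: [high_poly_names]} under a hypothetical
    convention: 'LP_<id>' is a bake target, and any 'HP_<id>' (optionally
    carrying a Blender-style '.001' duplicate suffix) sharing that <id>
    is one of its high poly sources."""
    lows = {}
    highs = defaultdict(list)
    for name in names:
        if name.startswith(low_prefix):
            lows[name[len(low_prefix):]] = name
        elif name.startswith(high_prefix):
            # Strip a trailing '.001'-style suffix so duplicates group together
            key = re.sub(r"\.\d+$", "", name[len(high_prefix):])
            highs[key].append(name)
    return {low: sorted(highs.get(key, [])) for key, low in lows.items()}
```

With this, `match_high_to_low(["LP_head", "HP_head", "HP_head.001", "LP_arm", "HP_arm"])` would group both `HP_head` objects under `LP_head`, giving the n:1 grouping discussed above.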

That could probably extend to the Scene input as well…

Now that we are talking about graph management, could we maybe get support for node groups?

It’s not really just management; it potentially saves a lot of time if you have many objects.

I believe node groups are technically possible…? But there isn’t good documentation about how you’re supposed to actually use them… Due to that it’s not a high priority item, more something I could fiddle around with when I had nothing better to do.

What do you need node groups for? My intention with a lot of the changes is to make graphs simple enough that you shouldn’t need node groups! :stuck_out_tongue:

What would you want to encapsulate in a node group though? There’s not a lot of things you’d want to reuse in baking.

If it was simple to do I would add it anyway… But it’s not a simple task…

Even IF the process of creating custom node groups had documentation to follow, I would still need to change a lot of stuff to deal with the possibility of encountering groups.

Post process stuff

But if it’s not an easy task then it’s okay without 👍

I spoke to the person who makes that maths expression node and they were happy for me to use it. So I will probably look at how I can use that in the post processing which should help you a bit with reducing nodes.


Edit: Nvm this, I forgot to set up my collection so it probably baked the wrong stuff - I thought I was baking one mesh :sweat_smile:

Is this by intent?

The latest developments are really great… I got it to work with UDIMs by manually assigning image names, but it would be great to have it automated. E.g. if an object is using UDIM tiles, it automatically gets the proper UV map and bakes into the correct image tile.

I have a concern regarding the new output image path node. I love that it lets you bake multiple images without having to launch a new Blender instance like the batch bake node, but currently if you want to pack some sRGB channels along with some linear stuff you kind of have to create a separate pass node for it.

Would it be possible to add a new post process node that applies/translates from linear to sRGB, so that one could do the sRGB conversion manually per texture that gets created by the pass node (or maybe even per channel? Though I realize most 3D suites wouldn’t let you unpack sRGB on a per-channel basis, I would still like to see support for this)? Otherwise I believe this would already be possible with the current post process nodes that you’ve added, but it would be a little tedious to set up, especially since we don’t have node groups (like if you needed to do the same packing operations at multiple places in the setup) :thinking:

Also another feature request:

I would like to see the ability to bake each material on an object into a tileable texture as an alternative to baking all materials to one texture.

This would be useful when making larger scenes where you have a lot of materials with “messy” details, like some noise plugged into the normal as a heightmap or some random grunge (for instance).

Slightly off topic: I tried baking my corridor scene yesterday and was kind of struggling to get good results, partially because I had used materials where I had cranked up some PBR values above 1. The bakes didn’t resemble my materials properly, because it seems you can’t bake PBR channels that go above one; it will just clamp to the 0-1 range (please let me know if it is possible to get around this somehow). I gave baking diffuse instead of albedo a try, in an attempt to counteract the loss of info from my PBR channels being above 1 in some places, but I didn’t manage to get any results that weren’t super grainy, even with 40 samples (how high should I go when baking diffuse? I had to go with like an 8K texture, so it took a little while to bake even with these settings, but it was so grainy that 70% of my image was still black where it shouldn’t be lol). So right now I’ve kind of just surrendered my idea of baking this scene at all, and will just learn from this to never use PBR values above 1 again (although I would find that slightly limiting, as sometimes I end up in a situation where I have, let’s say, already set my specular to 1 but kind of want it higher - if I couldn’t increase it there, my only solution would be to make the albedo brighter, which isn’t quite the same).

But back to the tiling feature request: the other thing I had issues with was getting decent close-up resolution, which is why I had to bake 8K textures (and in my case that was just barely enough, maybe not quite even). So I’m thinking it would be nice if it were possible to bake each material into a tileable texture (I’m having some noise drive subtle variations in the normal/specular/roughness/albedo) for larger environments like my scene, where there’s no true need for a unique bake on every surface since I don’t have any unique detail anywhere.

Now of course this would probably result in quite visible seams for most materials, but maybe one could use some post process effect that “mirrors” the ends of the texture (like when U or V gets close to 1 or 0, it starts blending with a mirrored version of the same texture) to make the seams less obvious.
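That mirror-blend idea can be sketched in one dimension. This is just a hypothetical illustration of the cross-fade, not an existing Bake Wrangler feature; the `margin` parameter (how many pixels from the edge the blend starts) is an assumption:

```python
def mirror_blend_edges(row, margin):
    """Blend the ends of a 1-D pixel row with its mirrored copy so the
    left and right edges end up with matching values when tiled.
    Within `margin` pixels of an edge, the value cross-fades toward the
    mirrored row, reaching a 50/50 mix exactly at the edge. A 2-D texture
    would apply the same weighting along both U and V."""
    n = len(row)
    mirrored = row[::-1]
    out = []
    for i, v in enumerate(row):
        dist = min(i, n - 1 - i)             # distance to nearest edge
        if dist >= margin:
            out.append(v)                    # interior pixels untouched
        else:
            w = 0.5 * (1.0 - dist / margin)  # 0.5 at the edge, 0 at margin
            out.append((1.0 - w) * v + w * mirrored[i])
    return out
```

Because both edge pixels become the average of the row’s two original ends, the texture wraps without a hard step, at the cost of some ghosting in the blend region.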

This is the scene in particular (download link if anyone wants to give baking it a go: https://u.teknik.io/V42cl.zip - it’s large because it includes several high-res images, my failed baking attempts):

Sorry about the very unstructured post 😅

Also, I do realize I could probably get my texture size down to something more acceptable just by unwrapping each object before duplicating/arraying/mirroring it, to make sure they all overlap and share UV space.

But in a scenario where one had an even larger scene with a lot of non-repeating geometry, like a giant factory or a giant beach, something like tileable textures would be even more worthwhile.

Could you clarify what you mean here? If your materials are tileable, don’t you already have them? Or do you mean procedural materials? In which case, if they’re tileable, all you need is to throw them on some planes and bake onto themselves?

I do mean procedural shaders. Lazy as I am, the process you described is what I would like to see automated <3

Ah, I get it, that makes sense.

I’m assuming you mean you want the inputs of the final Disney shader baked.

If that’s going in, I’d sort of like to see this be more general though. It would be nice if we had a special node in the material graph that could serve as an output node for baking. You could then plug something into it at any stage of the shader, not just at the PBR inputs.

The use case for this would be for optimizing shaders that you still want to keep mostly procedural. Like when you want to use the AO nodes for some parts of the graph but don’t mind baking down a bunch of noise textures so they’re faster.

It would be even nicer if this node had an output that automatically output the resulting baked image if it exists.

Something like this:

How feasible this is, I have no idea though. Just loose thoughts.

I think that’s a nice idea, though I would also like to see this directly in the Bake Wrangler node graph, so that you don’t have to modify your shaders one by one when wanting to do a final Disney shader (heh) for several materials in one go.

I’m glad you like the idea :slight_smile:

Or otherwise, if it’s possible to have it be a node in the shader graph like Piotr is suggesting, maybe add a button to automatically insert that bake node into every shader for an object, and have a button somewhere to let you toggle between viewing the baked results and the procedural version (globally for all shaders).

I think as long as inserting those bake nodes into each shader graph doesn’t become a labour of its own, that might be the most ideal solution.

@amdj1 I hear you. The problem is that currently Blender doesn’t have any data on an object that specifies whether it uses UDIMs or not (unless there is something I’m missing? But as far as I can tell, only the image itself has any data about being a UDIM, and no link between objects using it exists). So at this point it can’t be fully automatic. You would at least need to specify somehow that an object was using UDIM, and if you wanted to use names other than the defaults (1001, 1002, etc.) you would also have to provide an image that had the tile names you wanted…
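For reference, the default tile numbers mentioned above follow the standard UDIM layout: tiles count up along U and step by ten per V row. A small sketch of that arithmetic (assuming the usual convention of ten tiles per row, with `u` expected in `[0, 10)`):

```python
import math

def udim_tile(u, v):
    """Return the UDIM tile number containing UV coordinate (u, v).
    Tiles run 1001..1010 along the U axis, then step by 10 for each
    integer increase in V, per the standard UDIM convention."""
    return 1001 + math.floor(u) + 10 * math.floor(v)
```

So `(0.3, 0.7)` lands in tile 1001, `(1.2, 0.0)` in 1002, and `(0.5, 2.5)` in 1021, which is why a bake pass could derive the correct tile for a face purely from its UV coordinates once it knows the object is using UDIMs at all.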

@Willy_Wonka The color space thing sounds reasonable… I could just give you a ‘gamma’ node? That would let you do the conversion but could also be used for other stuff… I’m not sure about individual channels, it sounds like it should be possible but I would have to test it.
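One thing worth noting about a plain ‘gamma’ node: the exact linear ↔ sRGB conversion isn’t a pure power curve but a piecewise function with a small linear toe. A per-channel sketch of the standard transfer functions a post-process conversion would implement:

```python
def linear_to_srgb(x):
    """Standard sRGB encoding for a single channel value in [0, 1]:
    linear light -> display-encoded value. Values below the threshold
    use a linear segment; the rest use the 1/2.4 power curve."""
    if x <= 0.0031308:
        return 12.92 * x
    return 1.055 * x ** (1.0 / 2.4) - 0.055

def srgb_to_linear(s):
    """Inverse transform: sRGB-encoded value -> linear light."""
    if s <= 0.04045:
        return s / 12.92
    return ((s + 0.055) / 1.055) ** 2.4
```

A simple `x ** (1/2.2)` gamma node approximates this closely enough for many packing workflows, but the piecewise form is what image editors and game engines expect when a texture is tagged sRGB.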

Regarding the other stuff…

Putting a material on a plane and baking it isn’t particularly difficult. How I would actually set up the interface to let you specify what to bake, though… I suppose it would have to be a bake pass type, and the texture names would have to be appended to the file name.

The problem of course is that this won’t result in a seamless texture. There are procedural ways to make it seamless, but I don’t know any that create a particularly amazing result. I’m also pretty sure it would take a horribly long time in an uncompiled Python script to apply them to any decently sized texture…

I’ve thought about adding shader nodes for various things before. There are some advantages, but it also has the issue of then having to mess around with your shaders, which is one of the things that is really annoying about baking in vanilla Blender…

I can see how it would be useful for prototyping your shaders into textures.