Bake Wrangler - Node based baking tool set

It’s kind of ridiculous that Blender is so bad at scaling images in the first place. Best place to fix this would be in Blender itself. But knowing the devs, that would take a year even if you sent them the patch.

I can actually change the Blender interpolation to ‘cubic’ (no idea what that actually means)… Personally I can hardly see any difference, but it does produce a different result, so I will switch over to that!

I’ve also added a ‘Fast AA’ option with 5 levels that doesn’t require you to use different sample sizes.

Right now I don’t feel like I get enough out of adding some image manipulation library to be worth the hassle. Though running a simple command line tool over the result wouldn’t be that difficult.

I’d say try out the new version; if people think the quality is a big problem we can look at how to fix it… You may or may not find the fast AA on top improves it enough :stuck_out_tongue:

Some years ago, when I made Meltdown, another baking addon for Blender, I did the downsampling in the compositor, carefully controlling each sample location and for normal maps, renormalizing. It was kinda overkill to solve such a simple problem, but it worked.
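The renormalization step mentioned above is easy to show outside the compositor. A minimal sketch of box-downsampling a normal map and renormalizing the averaged vectors, in plain Python (my own illustration of the idea, not Meltdown’s or Bake Wrangler’s actual code):

```python
import math

def downsample_normals(pixels, width, factor):
    """Box-filter a normal map (row-major list of (x, y, z) vectors
    in [-1, 1] at the given width) by an integer factor, then
    renormalize each averaged vector back to unit length."""
    height = len(pixels) // width
    out = []
    for by in range(height // factor):
        for bx in range(width // factor):
            sx = sy = sz = 0.0
            for dy in range(factor):
                for dx in range(factor):
                    x, y, z = pixels[(by * factor + dy) * width + bx * factor + dx]
                    sx += x; sy += y; sz += z
            # Averaging unit vectors shortens them, which is what causes
            # the washed-out/blotchy look -- so renormalize.
            length = math.sqrt(sx * sx + sy * sy + sz * sz) or 1.0
            out.append((sx / length, sy / length, sz / length))
    return out
```

Skipping the renormalize line is exactly what makes averaged normals come out too short and the shading look blotchy.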

To be honest, when I look at the results, the downsampling just isn’t really working in the current system.

I’m going to have to find a different way of doing it. Though I can actually get much better results without downsampling… Which makes me wonder if I should remove the ability and replace it…

Downsample 4k to 1k:

Interpolate 1k to 1k:

Fast AA at level 1 (maybe need to tone it down a bit):

I can probably fix the downsample, but is it worth it when the other two methods give pretty good results and are way faster? I guess I need to fix it and add the other two methods to the next version and see if we need to keep all of them…

I’m not sure how Marmoset is doing it, but it’s not downsampling. I think they have an actual AA algorithm, because a 1k bake with 16x AA takes like 2-3 seconds to bake. Or it could be a mixture of both: some sort of downsample plus custom filtering.

Currently I’m having an issue where downsampling normals makes them really blotchy.

So I’m going to be putting 3 changes to this in the next version:

  1. Improving down sampling by performing pre-scaling of inputs. In my tests this is a big improvement.
  2. Adding the option to use cubic interpolation of bake pixel to output pixel regardless of scaling. This gives a light anti-aliasing effect with good color accuracy.
  3. Adding a ‘Fast AA’ option to the final output with a few levels to give a more heavy handed anti-aliasing effect, but with some loss of color accuracy.
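To illustrate point 2: cubic interpolation blends four neighbouring samples instead of snapping to the nearest one, which is where the light anti-aliasing effect comes from. A rough 1D sketch using Catmull-Rom weights (one common cubic filter; I’m not claiming it’s the exact kernel Blender uses):

```python
def catmull_rom(p0, p1, p2, p3, t):
    """Cubic Catmull-Rom interpolation between p1 and p2, using p0 and
    p3 as outer support points; t in [0, 1] is the sample position.
    Because four samples contribute, hard edges get slightly softened."""
    return 0.5 * (
        2.0 * p1
        + (-p0 + p2) * t
        + (2.0 * p0 - 5.0 * p1 + 4.0 * p2 - p3) * t * t
        + (-p0 + 3.0 * p1 - 3.0 * p2 + p3) * t * t * t
    )
```

At t=0 and t=1 it passes exactly through p1 and p2, which is why colour accuracy stays good: existing pixel values are preserved and only the in-between samples are smoothed.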

Both 2 and 3 will add negligible time to the bake and may make down sampling less important…

Sounds good can’t wait to test it out!

I will upload it now. It also has the changes to the cage buttons and distance input as discussed.

It seems to me that enabling the interpolate option also helps a lot when down sampling.

I’ll try it out now to rebake my textures.

I’m currently feeling annoyed by the size sliders… I’m pretty sure no one uses a slider to set their bake size…?

What if it was just a single text field where you write the size with some automagic? Like if I write 4k it will set it to 4096x4096. Or enter 2k x 1k and get that? Basically it would accept numbers, ‘x’ and ‘k’ to create the size… If there is no ‘x’ then it’s assumed to be square…
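Parsing that kind of shorthand is pretty trivial. A sketch of the automagic described above (my own interpretation, not actual UI code), accepting things like ‘4k’, ‘2k x 1k’ or ‘512x256’:

```python
import re

def parse_size(text):
    """Parse a bake-size string into a (width, height) tuple.
    A 'k' suffix multiplies by 1024; a bare value with no 'x'
    is assumed to be square."""
    def one(part):
        m = re.fullmatch(r"\s*(\d+)\s*(k?)\s*", part)
        if not m:
            raise ValueError("bad size: %r" % part)
        return int(m.group(1)) * (1024 if m.group(2) else 1)
    parts = text.lower().split("x")
    if len(parts) == 1:
        side = one(parts[0])
        return side, side
    if len(parts) == 2:
        return one(parts[0]), one(parts[1])
    raise ValueError("bad size: %r" % text)
```

So `parse_size("4k")` gives (4096, 4096) and `parse_size("2k x 1k")` gives (2048, 1024).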

Is there a drop-down that takes text inputs?

I think you can put a text input inside a drop down, but not as the drop down button itself… Why?

Sounds good to me, except switch out the x for *. I know it’s not technically multiplication, but it’s probably better to be consistent with the rest of Blender?

On second thought, I don’t think Blender ever presents the X and Y dimensions of an image in a single string, it’s always two fields.

How about this: Have a text field for x and y each, and two buttons next to each - Double and Half (or just plus/minus for brevity). The idea is that if you have a default of 1024, any common texture size is just a few clicks away without moving your hand to your keyboard.

… We’re really being quite nitpicky users here, aren’t we? :stuck_out_tongue:

This is not a bad idea, since doubling and halving always keeps you on a power of 2.

The new settings are working really well; my normal bakes are much smoother now.

Hmm, I like not having to use the keyboard at all! If only I could change the sliders to increment by powers of 2 or something… But having a button to double and halve is probably the closest we can get…

@Lamia what parts of it are you using? I was wondering if I should always enable the interpolate option when downsampling and hide the option or not…

I would keep it an option; having more settings is always good. So far I have been getting really good results with interpolate on and Fast AA 1, going from 4k to 1k. For hero assets I’ll even bake 2 maps: one with no interpolate and Fast AA 1, and one with interpolate. I’ll take them into Photoshop and put a mask on the sharp one so I can control which bits of detail stay sharp. When I get to my PC I’ll show some pics.

You can’t do that directly, but you can have an integer slider for the PoT exponent. It’s what Substance Designer does. But I think buttons are more elegant.
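Both ideas boil down to a couple of lines of logic. A sketch of the double/half buttons and the Substance-style exponent slider (my guess at how these would be wired up, not anyone’s actual implementation):

```python
def double(size, cap=16384):
    """The 'double' button: next power of two up, capped."""
    return min(size * 2, cap)

def halve(size, floor=1):
    """The 'halve' button: next power of two down, floored."""
    return max(size // 2, floor)

def exponent_to_size(exp):
    """Substance-style alternative: the slider stores the exponent
    and the displayed size is 2 ** exp (e.g. 10 -> 1024)."""
    return 2 ** exp
```

Starting from a 1024 default, any common texture size really is just a few clicks of double/halve away.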

Is there any render thread limit in Bake Wrangler? I noticed it is using 2-4 threads when I am baking. When I am doing manual baking all the threads are used, and this makes a huge difference when you have more cores. Right now with Bake Wrangler there is no difference between a 64-core Threadripper and an 8-core laptop. This happens only on the latest beta.


Hi georgiM, I believe the problem (if it only happens in the latest beta and not previous versions?) is the render tile size… I’m pretty sure the possible number of threads is related to the number of tiles.

For baking, the common wisdom suggests that a single tile the size of the image gives the best performance. I don’t know how true that is, but it’s what I’ve been told and read in various places.
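If the thread count really is capped by the tile count, the relationship is easy to reason about. A rough model (my assumption about how a tiled renderer schedules work, not measured Blender behaviour):

```python
import math

def usable_threads(width, height, tile, threads):
    """Each render tile is worked on by one thread at a time, so the
    number of threads that can do useful work is capped by how many
    tiles the image splits into."""
    tiles = math.ceil(width / tile) * math.ceil(height / tile)
    return min(threads, tiles)
```

With a single full-image tile, `usable_threads(1024, 1024, 1024, 64)` is 1, which would explain a 64-core Threadripper baking no faster than a laptop; a 256-pixel tile on the same image allows 16 threads.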

In the last beta I secretly changed this to be the default. But this actually causes you to lose performance?

Performance is something that is on my radar, and at least one user wants parallel baking. While there is the ability to take render settings from a scene instead of using my defaults, it’s possible that I need to expose some of the more useful settings in a node. Probably the tile size and threads settings…