Bake Wrangler - Node based baking tool set

The main function is really tiny:

for is_alpha in alpha_it:
    if not is_alpha:
        snap = px_offsets + alpha_it.multi_index  # snapshot indexes
        hit_count = np.count_nonzero(exp_alp[snap[..., 0], snap[..., 1]])  # count non-alpha pixels in the snapshot
        if hit_count:
            pxd = exp_px[snap[..., 0], snap[..., 1]]  # pixels in the snapshot
            pxc = pxd[..., -1] > 0.9  # only take pixels with alpha above this value
            px_sum = np.sum(pxd[pxc], axis=0)  # renamed to avoid shadowing the built-in sum()
            val = px_sum / hit_count
            pixel_out[alpha_it.multi_index] = val  # write pixel

I will paste you the rest of the setup code in a message.
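In the meantime, here is a minimal self-contained sketch of what the setup around that loop might look like. The names (px_offsets, exp_alp, exp_px, alpha_it, pixel_out) are taken from the loop above, but the construction of the padded arrays and offset grid is my own guess at the setup, not the actual code:

```python
import numpy as np

# Hypothetical reconstruction of the setup around the main loop.
margin = 2  # search radius in pixels (the real benchmark uses 16)

# Tiny 6x6 RGBA test image: left three columns opaque red, rest transparent
h, w = 6, 6
image = np.zeros((h, w, 4), dtype=np.float32)
image[:, :3] = [1.0, 0.0, 0.0, 1.0]

alpha = image[..., -1] > 0.5  # True where a pixel is already filled
pixel_out = image.copy()

# Pad ("expand") the image so snapshot windows never run off the edge
exp_px = np.pad(image, ((margin, margin), (margin, margin), (0, 0)))
exp_alp = np.pad(alpha, margin)

# Offsets of every pixel in a (2*margin+1)^2 window; adding the iterator's
# multi_index shifts the window to the right place in the padded arrays
ys, xs = np.mgrid[0:2 * margin + 1, 0:2 * margin + 1]
px_offsets = np.stack([ys, xs], axis=-1)

alpha_it = np.nditer(alpha, flags=['multi_index'])
for is_alpha in alpha_it:
    if not is_alpha:
        snap = px_offsets + alpha_it.multi_index
        hit_count = np.count_nonzero(exp_alp[snap[..., 0], snap[..., 1]])
        if hit_count:
            pxd = exp_px[snap[..., 0], snap[..., 1]]
            pxc = pxd[..., -1] > 0.9
            px_sum = np.sum(pxd[pxc], axis=0)
            pixel_out[alpha_it.multi_index] = px_sum / hit_count
```

With this setup, transparent pixels within the margin radius of the red block pick up its colour, while pixels further away stay empty.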

I’ve messed around a fair bit with different methods to get to this point, but I could easily have gone completely off track along the way… I think the main slow points are currently the hit_count calculation and the sum…


I managed to get it down to 10 seconds on my system with a single thread. But that’s with summing all the pixels in range, which doesn’t really give accurate results. Summing only the x closest still adds a lot of extra time.

Need to think about how I can do it faster still…

(My benchmark is 2k normal map with 16px margin)


Well, I’ve got it about as fast as I think I’m going to… which is plenty fast enough for smaller margin sizes, but if you start wanting really thick margins it’s not great…

16px: 6s
32px: 17s
64px: 90s

It’s quite fast at rejecting pixels that won’t be inside the margin radius, but it is quite slow when it comes to calculating the colour of pixels that are very far from an edge. So the larger the margin size, the more pixels it has to calculate, and the more of them are far away…

I can probably add a short circuit to the inner loop for pixels that only have a number of possible matches close to the target sample size. But that will still only speed up the ones at the very edge of the margin.

A solution is of course to put the output back through the process again. That should mean that if you go in 16px steps it would take 24s to do 64px. But then each iteration is using the previous margin’s colours instead of the original image… Is that actually a problem though?
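The iterative idea can be sketched like this. The grow_one_ring function below is a hypothetical stand-in for the real margin pass (a simple nearest-neighbour dilation that expands the fill by one pixel per call), just to show the structure of feeding each pass's output back in as the next pass's input:

```python
import numpy as np

def grow_one_ring(px, filled):
    """Fill each empty pixel that touches a filled pixel with the mean of
    its filled 8-neighbours. One call expands the margin by one pixel."""
    h, w, _ = px.shape
    out, new_filled = px.copy(), filled.copy()
    for i in range(h):
        for j in range(w):
            if not filled[i, j]:
                ys = slice(max(i - 1, 0), min(i + 2, h))
                xs = slice(max(j - 1, 0), min(j + 2, w))
                mask = filled[ys, xs]
                if mask.any():
                    out[i, j] = px[ys, xs][mask].mean(axis=0)
                    new_filled[i, j] = True
    return out, new_filled

def grow_margin(px, filled, margin):
    # Each iteration samples the colours the previous iteration produced,
    # not the original image -- which is the trade-off discussed above.
    for _ in range(margin):
        px, filled = grow_one_ring(px, filled)
    return px, filled

# Demo: a single green pixel grown by a 2px margin fills the whole 5x5 image
img = np.zeros((5, 5, 3))
img[2, 2] = [0.0, 1.0, 0.0]
filled = np.zeros((5, 5), dtype=bool)
filled[2, 2] = True
out, out_filled = grow_margin(img, filled, 2)
```

Because each ring only reads the previous ring's result, error can accumulate with distance from the edge, which is exactly the "previous margin's colours" question raised above.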

Here is where I am at with creating margins:


This can be created quite fast, but is done in 3 passes. You can see where the first pass ends quite easily in this image. The border between the 1st and 2nd pass isn’t so obvious, but the pixels have all become a bit blurry. I’ve also picked the most obvious place in the image to show; in other places there is no noticeable difference between a single pass and multiple iterations.

Using this method I can completely fill the gaps in a map pretty fast. On my system it takes about 20 seconds for your average map to be filled. If I fiddle around to find a sweet spot for how many pixels to add each iteration, this can probably be made a bit faster.

I can also create margins of various sizes. The setup means that there is a fixed base time cost, so anything under 16px takes about the same amount of time (5 seconds).

So my question is: Are these margins acceptable? (I guess I kinda need to actually add them to a test version)

Umm. If I’m reading that right, then that margin is not very good. The pixels beyond the edge are just nowhere near the same color as the greenish-pinkish bake. Ideally you wouldn’t be able to tell where the bake ends and the margin begins.

There are a couple of pixels of blue around the edge, so the first band is correct. This is the area before:



Here is a better image. The middle is my script, the end is Blender, the start is the source. As you can see the colours are similar, but somehow mine angle away from the source…

If the left image is the source, I can’t understand where these pink and green pixels come from, because the border pixels on the left picture are blue.

I found a reason why UDIM does not bake in my case: I bake in 32-bit and save the image in 16-bit. Also, randomly, if I take a lot of meshes, some meshes end up on the wrong UDIM tile and bake over a tile that isn’t their own.

Why does going from 32-bit to 16-bit cause problems?

I would think it’s most likely, if the mesh ends up on the wrong tile, that the wrong UV map is selected, or maybe there is something wrong with the UVs. For example, a mesh could have a single island that covers more than one tile, or it could have UVs that aren’t in the correct range for UDIM, like negative UV values or ones outside the tile range…

@unv Those are caused by the pixel having no adjacent colour. When it looks further away it is able to include both the edge and at least one pixel behind the edge in the samples which causes the colour bleed. That said, there is something not quite right with the way the colours move off diagonally. I think the problem is with diagonal pixels getting included where they shouldn’t…

The thing is, I can’t really sample between two pixels. So I have to approximate radial sampling by deciding which pixels are included at each level. I suppose the other option would be to give them a weight, but that’s basically the same thing.

Looking over Blender’s source, it uses Manhattan distance to weight pixels… which should also be wrong, as it would greatly lower the weight of diagonals… >_<

Yep, so if I use Manhattan distances as weights I get results very close to what Blender produces… I think I need to come up with a more accurate weighting. But right now I can produce margins very similar to Blender’s for any bake, at a cost of around 5–20 seconds for anything from a small margin to a complete infill.
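To make the diagonal issue concrete (my own illustration, not code from the add-on or from Blender): with an inverse-distance weight, Manhattan distance rates a diagonal neighbour as twice as far as a direct neighbour, while Euclidean distance puts it at only √2 away:

```python
import math

def manhattan(dy, dx):
    # |dy| + |dx|: a diagonal step counts as distance 2
    return abs(dy) + abs(dx)

def euclidean(dy, dx):
    # sqrt(dy^2 + dx^2): a diagonal step counts as distance sqrt(2)
    return math.hypot(dy, dx)

# Inverse-distance weight of the diagonal neighbour (1, 1)
w_manhattan = 1.0 / manhattan(1, 1)  # 0.5   -> diagonals strongly downweighted
w_euclidean = 1.0 / euclidean(1, 1)  # ~0.707
```

So a Manhattan weighting pulls the colour contribution of diagonal pixels down to about 71% of what a true radial weighting would give them, which would explain margins skewing along the axes.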


I wrote this little thing to visualise some possible pixel weights: https://replit.com/@netherby/SamplePattern#main.py

Blender uses the ‘Manhattan’ set (which looks pretty wrong to me). I think the Euclidean rounded or ceiling weights look like the best fit…
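For instance, a "Euclidean rounded" or "ceiling" assignment over a small window might look like this sketch (my own take on the idea, not the replit code): each offset gets a ring index by rounding or ceiling its Euclidean distance, and pixels are then sampled level by level:

```python
import math

def rings(radius, level=round):
    """Group window offsets into rings by (rounded) Euclidean distance."""
    out = {}
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            r = int(level(math.hypot(dy, dx)))  # ring index for this offset
            out.setdefault(r, []).append((dy, dx))
    return out

rounded = rings(2)                   # round(): diagonal (1, 1) joins ring 1
ceiling = rings(2, level=math.ceil)  # ceil():  diagonal (1, 1) joins ring 2
```

With round(), the first ring includes all eight immediate neighbours; with ceil(), the diagonals are pushed out to the second ring, so the two sets trade off how aggressively diagonals are treated as "further away".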