Bake Wrangler - Node based baking tool set

Hello @netherby, I tried using the Input Material node for baking tileable textures but the output texture is always black.

Blender 3.4 / BakeWrangler 1.3.6

I did a small test by just using the default bake recipe (then using the Input Material node) and tried baking the albedo of a random PBR material. I’m not sure if I’m using the nodes correctly.

Note that the plane with the material is just for preview.

I already tried:
- Switching render devices (CUDA, OptiX)
- Different color spaces and resolutions
- Different PBR passes

My desired use case is to bake/flatten a tileable texture set from a UV’d object that uses tileable textures with procedural overlays.

It’s possible that node got broken by Blender 3.4, I will look into it!

Okay… So I don’t know when or how this happened, but an empty material slot somehow got onto the object that is used to bake materials, which displaces the material you actually want to bake…

You can fix this in the current version by opening addon\BakeWrangler\resources\BakeWrangler_Scene.blend, clicking on the BW_MatPlane object in the outliner and deleting the empty material slot from that object. Then save the file.

Tried the fix and it now works!

Thank you!

This might very well not be Bake Wrangler related, but I figured if anyone knows whether I’m doing something wrong, it’d be here.

Trying to bake a displacement, but it seems to have some issues with the smooth face interpolation. Any ideas how to fix this without subdividing the hell out of my target mesh?

If you turn the fast AA on and up to like 9, it will start to remove those face grids…

I’m probably using it wrong. If I turn it on and put it up to 9 (max), then it just seems to downres the texture and upres it again, which gets rid of my detail but doesn’t solve my poly artifacts.
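For what it’s worth, here’s a tiny Python sketch (my own toy example, nothing from the addon) of why a down-then-up resample smooths away fine texture detail: a box-filter downsample averages a sharp spike into its neighbours, and upsampling can’t bring it back.

```python
def downsample(values, factor):
    """Box filter: average each group of `factor` samples (length assumed divisible)."""
    return [sum(values[i:i + factor]) / factor
            for i in range(0, len(values), factor)]

def upsample(values, factor):
    """Nearest-neighbour: repeat each sample `factor` times."""
    return [v for v in values for _ in range(factor)]

# A 1D "texture" row with one sharp detail spike.
texture = [0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0]

smoothed = upsample(downsample(texture, 4), 4)
print(smoothed)  # the spike is averaged away: [0.25, 0.25, 0.25, 0.25, 0.0, 0.0, 0.0, 0.0]
```

This only pushes pixels around, so any artifact baked into the geometry side just gets blurred, not removed.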




Is this a “can’t really be solved, just how Blender works, known limitation” kind of thing? Or would the dsp-s look alright if done in some other way? I tried some really convoluted way ages ago with emission shaders and whatnot, would not want to go that route again.

Hi,

there appears to be a bit of a bug in Blender 3.4:
https://developer.blender.org/T104301

When I search for nodes to add in the shader editor, the search ends up finding some of the BW nodes too, causing problems. I think this may be caused by the fact that Blender 3.4 has deprecated the Mix RGB node and replaced it with a generic Mix node which allows switching data types. The fact that the BW node was named the same may be the source of these troubles.

How are you using the generated texture? Have you tried setting the target object to use smooth shading? The normals of the faces on the target will affect the resulting map.

Hmm, but this seems like it’s a bug in Blender… Because nodes from other trees never used to show up in the search…

Yes… but I think it would be more appropriate if you reported it, not me?

It indeed appears I can create pretty much any BW node in the Shader Editor:

The target object does have smooth interpolation, and no custom split normals data.

I’m using the generated exr in the displacement modifier.



It seems the interpolation method of the generated map differs from the one in the subdivision surface modifier? No clue, just spitballing here.

Here’s a simple UV sphere test with a couple simple cubes, switching interpolation methods. It’s the same result as if I was to generate it to hard edges, so in that case there’d be no interpolation at all -


I’m just doing a quick test here to see if displacements generated in Blender are OK to use in production, but it’s a rough start.

It’s not a modifier issue either, same thing happens when I use the dsp in a shader and render it. -

edit - Ah, ok. After some digging this seems to be a known issue. There’s even a 4-year-old RCS thread. I guess this might be related to some missing architecture on the Blender side then.

Blender doesn’t internally support baking displacement for objects that don’t use the Multi-res modifier system, but we get around the limitations of the internal baking using a combination of specialised materials and OSL-based shaders. The height map uses an OSL shader to get the distance between surfaces.

So, it’s possible it can be improved in some way… The results I get look like this:

You’ve got to turn on displacement in the shader options -


It’d be great if it could be improved. Works well if the target mesh is subdivided to match the pixel resolution, else the displacement has these interpolation steps. I’m sure it’s just some method in OSL that needs to be switched, but it’s a bit out of my league. I’ll try to read up on it.

The problem might be in the range mapping… The distance has to be mapped to a value between the mid level and 0 or 1… Maybe the floating point precision is not high enough.
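To make the range mapping concrete, here’s a rough Python sketch of how a signed distance could be packed into the 0–1 range around a midlevel (just my guess at the idea, not BakeWrangler’s actual code). It also shows how a low midlevel like 0.05 leaves nearly the whole range for values above it:

```python
def encode_height(distance, midlevel=0.5, max_range=1.0):
    """Map a signed surface distance into the [0, 1] texture range.

    Positive distances land between `midlevel` and 1.0,
    negative ones between 0.0 and `midlevel`.
    """
    if distance >= 0.0:
        value = midlevel + (distance / max_range) * (1.0 - midlevel)
    else:
        value = midlevel + (distance / max_range) * midlevel
    return min(max(value, 0.0), 1.0)

# With midlevel 0.5, half the texture range encodes positive heights;
# with midlevel 0.05, almost all of it does.
print(encode_height(0.5, midlevel=0.5))   # 0.75
print(encode_height(0.5, midlevel=0.05))  # 0.525
```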

Probably not floating point, since the artifacts seem to be independent of scale (tried it with a couple of scale values to be safe). Since edge hardness makes a difference, I guess the way it calculates the actual distance vs the interpolation is what’s causing the issue here.

Would be interesting to know. The team at the company I’m working for was just looking into a baking solution using Houdini; that’s when I thought I’d try Bake Wrangler for dsp-s, since it’s pretty handy with multi object baking.

https://developer.blender.org/T101259 this report contains the fix, which is that the template order needs to have the Blender Node class last. So that will require refactoring all the node definitions.

Changing the midlevel should also show whether the value accuracy is the problem. If the midlevel is set to 0.05 then you would nearly double the space for values above the midlevel.

But you think it has more to do with the angle of the normal on the target object?

Yes, definitely. The maps look alright so there’s no stepping visible by visually inspecting them. I guess the normal interpolation is taken into account when calculating the distance, and the way it’s calculating that is different to the way the subdivision modifier is actually changing the vert positions.

Not sure what options there are on the OSL end, to calculate the distance, but also take into account interpolation.

Basically, how it works currently is the shader gets the shading point and the normal at the shading point. A ray trace is then performed from the shading point along its normal vector. The point’s normal vector comes from Blender, which is why changing the shading method changes the results. The distance is a floating point value of the distance to the hit.
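That trace-along-the-normal idea can be sketched in plain Python as a toy 2D stand-in (not the actual OSL shader): take a shading point on a flat “face” (a chord of a circle standing in for a low-poly sphere) and trace to the true surface along the flat face normal versus the smoothly interpolated vertex normal. The two hit distances differ, which is why the shading method changes the baked values.

```python
import math

def trace_to_circle(point, direction, radius=1.0):
    """Smallest positive distance t along `direction` from `point` to the circle |x| = radius."""
    px, py = point
    dx, dy = direction
    # Solve |P + t*D|^2 = r^2 as a quadratic in t.
    a = dx * dx + dy * dy
    b = 2.0 * (px * dx + py * dy)
    c = px * px + py * py - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return None  # ray misses the circle
    t1 = (-b - math.sqrt(disc)) / (2.0 * a)
    t2 = (-b + math.sqrt(disc)) / (2.0 * a)
    return min(t for t in (t1, t2) if t > 0.0)

# Low-poly "sphere": one face is a chord of the unit circle from -22.5 to +22.5 degrees.
a0, a1 = math.radians(-22.5), math.radians(22.5)
v0 = (math.cos(a0), math.sin(a0))
v1 = (math.cos(a1), math.sin(a1))

s = 0.25  # shading point a quarter of the way along the face
p = (v0[0] + s * (v1[0] - v0[0]), v0[1] + s * (v1[1] - v0[1]))

# Flat shading: the face normal, perpendicular to the chord.
flat = (1.0, 0.0)
# Smooth shading: vertex normals (radial on a sphere) linearly interpolated, then normalized.
norm = math.hypot(p[0], p[1])
smooth = (p[0] / norm, p[1] / norm)

d_flat = trace_to_circle(p, flat)
d_smooth = trace_to_circle(p, smooth)
print(d_flat, d_smooth)  # two different distances from the same shading point
```

So even with identical geometry, the baked height at that texel depends on which normal the ray is shot along.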

Assuming the problem is not with very similar values getting rounded to the same number, then the problem is likely with the vector used for the trace. But I’m not sure how those vectors differ…

The shading point would be the given point on the surface?

If I take the same sphere, subdivided a couple times (basically making a poly as big as a pixel on the map), the result looks correct. So yes, it’s probably the normal interpolation that’s causing the difference.

Wouldn’t flat shading used with simple subdivision (not Catmull-Clark) give us a correct result? For some reason that’s even more jaggy. I need to rebuild my simple test scene, as now I don’t see any difference between smooth and flat shading. Changed too many settings back and forth, I guess.