Bake Wrangler - Node based baking tool set

OK, did a scene from scratch, simplified some things -

The purpose of the test was to determine whether the normal interpolation makes a difference or not. I thought I’d try flat normals (to avoid OSL having to bother with interpolation altogether) in conjunction with simple subdivisions, thinking that the distances should be equal in that case. This does not seem to be the case.

My understanding (which could be wrong) is that with flat shading as you get near a face boundary you start getting measurements like this:

Which gives you the ridge, because the height at the edge is always the highest value. While with smoothed edges you get:

Yes, but if I’m using flat shading and linear subdivisions that’s exactly what I would want. Higher values at the ridges, because they’re farther from the source.

The only time I would not want that is if I’m subdividing using Catmull-Clark, changing the vert positions. In that case I would want the interpolation (which is giving me a ridge as well, unfortunately).

In any case, this seems to be a limitation for displacement maps, right? I mean they’re not ready to use unless the target object has a higher resolution than the map itself (higher depending on the UV layout).

It depends a lot on how you want to use it… Like if you want to do actual vertex displacement, then you will get a sawtooth pattern any time a straight line runs counter to your target’s topology. I think this is why Blender only bakes it for Multires modifier objects, because that is the best use case, where the topology exactly matches and only the vertex density is different…

Well, if I use the same meshes and bake in any other app, then the maps work correctly. That’s why I was wondering if it’s possible in Blender. I did see the many threads and forums saying it’s not, but figured why not give it a try.

So what are the other apps doing differently? And if you bring the maps created in other apps into Blender, does it look right?

Hahh, I partially take that back. Just tried, and Maya and Topogun are doing the same thing. There’s a tool using Houdini at work which seems to work, and Cyslice was working as well (with the maps actually looking good in Blender).

I’m starting to get the feeling that both of them are subdividing the mesh before baking, but I have no way of making sure. I’ll try to ask around.

If it’s just a matter of subdividing that’d get us the same result in Blender anyhow, using Bake Wrangler.

Only thing is, I’m not sure I’d always want to do that (subdividing pre-bake, that is), depending on the meshes I’m working with.

edit - I checked both the internal Houdini tool and Cyslice, which seem to work for unsubdivided meshes, and they have a separate setting for poly geo (as opposed to subdiv geo), so I suspect they’re compensating for the difference in deltas in some other way.

If it’s just subdiv, then I could add a setting to the height pass that subdivides the mesh x times before baking. Then it’s kept entirely separate from your actual object. I don’t think subdividing the object would change bake times that much… The main factor should be the image size, since that dictates the number of actual samples needed…

But if there’s some other method that isn’t too complicated…

Not sure how others are using the addon; I don’t want to inconvenience anybody unnecessarily. Now that I’ve figured out what’s going on I can easily work around it, and adding a subdiv modifier based on the model at hand isn’t a big issue. Figuring out what the problem was turned out to be the bigger part, especially since with higher-res objects it might not be obvious at first. Sometimes in production someone spots a small discrepancy and it becomes a huge thing, debugging across departments for weeks and things like that.

So long story short, from what I saw either putting the delta difference in the target if there’s no subdiv, or putting a subdiv onto the target is how other pipelines solved the issue. I guess as long as there’s a hint in the documentation there’s no pressing need to add extra functionality. However, it could be practical if there’s demand for it.
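For reference, the manual workaround is only a couple of lines of bpy. A minimal sketch, assuming a Simple (non-Catmull-Clark) subdivision on the bake target; the modifier name and level count are placeholders you’d pick per model:

```python
import bpy

# Add a temporary Simple subdivision to the bake target so the flat faces
# get enough vertex density for the height/displacement bake.
target = bpy.context.object  # the low-res bake target

mod = target.modifiers.new(name="PreBakeSubdiv", type='SUBSURF')
mod.subdivision_type = 'SIMPLE'  # keep vertex positions, only add density
mod.levels = 2                   # viewport level, pick per model
mod.render_levels = 2

# ...run the bake recipe, then drop the modifier again:
# target.modifiers.remove(mod)
```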

Going to try it with a production model next, to see how it goes. Fingers crossed. :slight_smile:

edit:

I did an interesting test with a production model.

  • A displacement map for 60 UDIMs with 2k resolution took 16 minutes to compute on a mesh with 215k faces.
  • Subdividing it twice to 3.4M in order to make the dsp look correct took about 1.5 hours.

The resulting maps looked identical to the ones produced by Cyslice, which is a good thing, as they’re used as a ground-truth comparison in some places.

Another thing which may or may not be practical is that the dsp intensity needs to be set to whatever search distance was used to generate them. This is something that one would need to keep track of. With the bake methods we use currently this multiplier is baked into the maps for convenience.

Hmm, I’m surprised the subdivisions add that much time, but it’s not too bad considering OSL means all the rendering has to be done on the CPU. Of course an actual Blender internal pass done on the GPU would probably take seconds to minutes, but no one seems to have written one…

How is the distance baked into your maps?

Currently using bake wrangler I need to set the strength in the displace modifier relative to the distance that was used when generating the maps. If I used a distance of 0.1, then my strength has to be 0.1, if the distance was 10, then the strength needs to be 10. With our pipeline tools or Cyslice that multiplier is applied to the maps, so the strength in the modifier will always be 1 (just one less thing to keep track of, in a pipeline).
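In bpy terms it’s just this bit of bookkeeping. A sketch, with 0.1 standing in for whatever distance the map was baked with and "height_map" as a placeholder texture name:

```python
import bpy

BAKE_DISTANCE = 0.1  # the search distance used when the height map was baked

obj = bpy.context.object
disp = obj.modifiers.new(name="Displace", type='DISPLACE')
disp.texture = bpy.data.textures.get("height_map")  # image texture holding the baked map
disp.texture_coords = 'UV'       # assumes the bake UVs are the active UV map
disp.mid_level = 0.5             # assuming mid-grey = no displacement
disp.strength = BAKE_DISTANCE    # has to match the distance used at bake time
```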

Regarding the speed - is it using multithreading when baking UDIMs? Is that something that could be added?

So it’s multiplying the value by the search distance or something… That is simple enough to do. Maybe? Is it normalized for an image set or can any independent image with any distance be used?

The current problem with UDIM is that while Blender supports baking all the tiles in parallel, there is no way of accessing the tile data from the Python API. I have been putting off trying to get a work-around to this in the hope that access would be added in the near future.

So currently UDIM performance is a lot worse than it could be, since each tile is baked in sequence instead of in parallel, as that’s the only way I can get access to the pixel data in the current API… There may be some way of working around it, but I just thought it would make a lot more sense for the devs to add access to the tiles…

The ones we’re using are just multiplied, not normalized. It’s kind of convenient because we don’t have to set any values; on the other hand it might need painting out spikes.

Makes sense about the UDIM performance. It might not be worth investing a lot of time in workarounds if there’s a likelihood of baking getting some focus anyhow (by looking at the targets for this year this might be the case).

Most of the time it should be alright, as in a lot of pipelines the ones baking the maps are not the ones using them (in VFX studios at least, model departments would bake the dsp maps and shaders/lookdev/textures departments would be the ones actually working with them), so there’s always a disconnect with the scheduling. It’s just the emergency cases, like “can you quickly re-bake UDIM 1003 and 1045 as there are some artifacts and we have to kick off renders by 18:00”, which can become an issue.

Thanks for making this amazing addon.
Is there any way to save bake recipes as presets which can be used in every blend file?

Is it possible to access the baked textures in the post bake script?
I wanted to automate the process of creating a new material using the baked textures.

If they are just multiplied, you could also add in a multiply post process node and set the value to whatever the distance is. So long as the file format supports values greater than 1, it should work fine… I could add the option to the bake pass, but you should already be able to do it with the post nodes, so probably no reason to.
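If you’d rather bake the multiplier straight into the saved image, the same thing can also be done directly on the pixels with a few lines of bpy. A sketch with placeholder names, assuming the baked map is a float image (e.g. EXR) already loaded in Blender:

```python
import bpy
import numpy as np

DISTANCE = 0.1  # the search distance the map was baked with

img = bpy.data.images["height_bake"]  # placeholder name for the baked image
pixels = np.empty(len(img.pixels), dtype=np.float32)
img.pixels.foreach_get(pixels)

rgba = pixels.reshape(-1, 4)
rgba[:, :3] *= DISTANCE               # scale RGB, leave alpha alone

img.pixels.foreach_set(pixels)
img.update()
# img.save() to write it back out; needs a float format like EXR to keep values > 1
```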

The UDIM situation does need to be resolved, especially because baking the tiles in sequence instead of parallel causes problems when you need the values in all tiles to be normalized with each other…

Hi @SAIINDIA, currently you can save bake recipes into your startup file if you want them every time. You can also use the append function in the file menu to import them from other files (they will be in the node_groups folder). The only thing to note is that objects set in the nodes will also be imported if you do it that way. I think the Blender asset library will support node groups soon; when/if it does I will add them to that. If that doesn’t happen I will probably add an internal recipe library system, but that’s a version 2 thing…
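The scripted equivalent of that append step looks roughly like this. A sketch, with the path and group name as placeholders (node groups live under the NodeTree section of a blend file when appending):

```python
import bpy

blend_path = "/path/to/recipes.blend"  # file containing the saved recipe
group_name = "My Bake Recipe"          # hypothetical recipe node group name

bpy.ops.wm.append(
    filepath=f"{blend_path}/NodeTree/{group_name}",
    directory=f"{blend_path}/NodeTree/",
    filename=group_name,
)
```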

Currently the post bake script gets the list of target objects and source objects. The saved files could also be added. What data would you want to have provided to the script? I haven’t added much to it simply because it’s not something I personally use and no one has requested anything :stuck_out_tongue_winking_eye:
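For the material automation, a post bake script could do something along these lines with plain bpy once it knows the saved file paths. A sketch, not Bake Wrangler’s actual script interface; the function name and image paths are placeholders:

```python
import bpy

def material_from_bakes(name, base_color_path, normal_path):
    """Create a Principled material and wire in two baked maps."""
    mat = bpy.data.materials.new(name)
    mat.use_nodes = True
    nodes = mat.node_tree.nodes
    links = mat.node_tree.links
    principled = nodes["Principled BSDF"]

    # Base color map
    tex = nodes.new("ShaderNodeTexImage")
    tex.image = bpy.data.images.load(base_color_path)
    links.new(tex.outputs["Color"], principled.inputs["Base Color"])

    # Normal map -> Normal Map node -> BSDF normal input
    ntex = nodes.new("ShaderNodeTexImage")
    ntex.image = bpy.data.images.load(normal_path)
    ntex.image.colorspace_settings.name = 'Non-Color'
    nmap = nodes.new("ShaderNodeNormalMap")
    links.new(ntex.outputs["Color"], nmap.inputs["Color"])
    links.new(nmap.outputs["Normal"], principled.inputs["Normal"])
    return mat
```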

Absolutely, that’d work. That solves that part then. :slight_smile:

Fingers crossed the UDIM part gets some attention this year.

Thanks for all your help!

No problem, I’m always open to making workflow improvements where possible. The UDIM thing is just something that the Blender team SHOULD fix instead of me spending a bunch of time trying to get a work-around which will be obsolete as soon as they fix it… The current UDIM handling is already a work-around and is honestly kind of a nightmare with how much complexity it adds. I would love to be able to get rid of it and use a properly implemented API instead!

Hey @netherby

I’m working on some skeletal mesh assets to export to Unreal Engine 5, using a high-poly to low-poly workflow in Blender:

I encountered issues with the Texture Normals map from the PBR bake (Bake Wrangler node view):

All the normal bakes I tested work fine (PBR geometric normals, Blender Cycles normals, and Wrangler bent normals), and UE 5 recognizes those texture maps as normal maps, except the PBR texture normals, which UE 5 recognizes as Default color on import:

The texture seems to be the wrong color (too light), doesn’t it?


PBR texture normals bake.

PBR geometric normals for comparison, correctly recognized by UE 5 as a normal map:


PBR geometric normal bake.

And in UE 5, when the PBR texture normal map is plugged into the material, artifacts appear, some darkening:

For comparison, when the PBR geometric normal map is plugged into the material, there are no artifacts:

Even when I try to combine the geometric normal map and the texture normal map in some way, I don’t get good results, whether I use subtraction on the PBR texture normal map or not…

Any ideas ?

Thanks.