Merge Roughness, Glossy, & Normal Images into One Map?

After finally figuring out how to create a baked distressed metal map, I realized I needed to use a glossy, roughness, and normal image to get my baked result to actually look the same as my distressed metal procedural texture.

The issue I’m having is that when I import it into Unity, having multiple images to create this metal look gets really complicated. I’d rather just have one normal map that looks like this to simplify things.

Right now I have my baked glossy image map connected to the glossy input on the BSDF, my roughness image map connected to the Roughness input on the BSDF, and my normal image map connected to a Normal Map node, which then goes to the Normal input of the Principled BSDF.

Is it possible within Blender to make one normal map from these three images, so that after it’s finished baking I’d only need to connect that one normal map to the Principled BSDF to get the same visual result for distressed metal? If I didn’t explain this correctly, I can upload my node setup when I have access to my computer later. Thanks in advance.

No. You cannot.


The closest you’d get is a multi-layered EXR, but that isn’t really what you’re asking. Simply put, the normal map occupies all three RGB channels, so there’s no “space” for any other info. However, what you COULD do is combine, say, roughness, metallic, and height into one image, assigning a single channel to each, as they are all just greyscale: red channel for metallic, green for roughness, and blue for height.
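
For example, a minimal packing sketch in Python using Pillow; the file names here are just placeholders for your own baked greyscale maps:

```python
# Minimal channel-packing sketch using Pillow (pip install Pillow).
# File names are placeholders -- swap in your own baked maps.
from PIL import Image

metallic  = Image.open("metallic.png").convert("L")   # greyscale -> red
roughness = Image.open("roughness.png").convert("L")  # greyscale -> green
height    = Image.open("height.png").convert("L")     # greyscale -> blue

# All three maps must share the same resolution for merge() to work.
packed = Image.merge("RGB", (metallic, roughness, height))
packed.save("packed_MRH.png")
```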


You might try baking your normal map to a second UV channel. Add a new UV channel and then use a UV Map node to assign which texture is baked to which channel. Normally in the game world we bake the normal map to the first channel and everything else to the second channel, and, as @colkai suggested, bake the rough, metal, and glossy into the RGB channels, using the alpha as well.
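
If you prefer to set that up from a script, here is a rough bpy sketch for adding a second UV channel and pointing a bake-target image at it; the object, material, and “UVMap_Bake” names are assumptions, not anything from this thread:

```python
# Rough sketch for Blender's Python API (bpy); run in the Text Editor or console.
# Assumes the active object has a node-based material -- adjust names to your scene.
import bpy

obj = bpy.context.active_object
mesh = obj.data

# Add a second UV channel if it doesn't exist yet.
if "UVMap_Bake" not in mesh.uv_layers:
    mesh.uv_layers.new(name="UVMap_Bake")

mat = obj.active_material
nodes = mat.node_tree.nodes
links = mat.node_tree.links

# Route an image texture through the new UV channel.
uv_node = nodes.new("ShaderNodeUVMap")
uv_node.uv_map = "UVMap_Bake"

img_node = nodes.new("ShaderNodeTexImage")
img_node.image = bpy.data.images.new("PackedBake", 2048, 2048)

links.new(uv_node.outputs["UV"], img_node.inputs["Vector"])

# Make the image node active so Blender bakes into it.
nodes.active = img_node
```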

You end up with something like this… this has the normal, metal, and AO baked into one texture.

I have little knowledge of how you can use this in Unity, as I switched to Unreal a long time ago…

What on earth is this? What do you mean by “channels”? Are you talking about combining two 8-bit images into a single 16-bit image (possible, but why)? Or lowering the bit depth in a single 8-bit image (sounds awful)? Or are the AO/roughness parts not used on the same part of the mesh as the normal map (sounds like a rare case)? Or is it a multi-channel image (like a Photoshop or multi-layer OpenEXR file)?

It’s better to keep the normal map on its own texture; it’s common practice.

Make a new texture with

  • R = roughness
  • G = glossy (or metallic)
  • B = emissive
  • A = AO, or anything else you would need
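
For previewing such a packed texture back inside Blender, a hedged bpy sketch along these lines should work; the file name is a placeholder, and older Blender versions would use a Separate RGB node instead of Separate Color:

```python
# Sketch: split a packed R/G/B/A texture back into Principled BSDF inputs.
# Assumes the material already has a Principled BSDF and that the packed
# image file exists next to the .blend -- adjust names as needed.
import bpy

mat = bpy.context.active_object.active_material
nodes = mat.node_tree.nodes
links = mat.node_tree.links

tex = nodes.new("ShaderNodeTexImage")
tex.image = bpy.data.images.load("//packed_RGBA.png")
tex.image.colorspace_settings.name = "Non-Color"  # packed data, not color

sep = nodes.new("ShaderNodeSeparateColor")  # "ShaderNodeSeparateRGB" pre-3.3
principled = next(n for n in nodes if n.type == "BSDF_PRINCIPLED")

links.new(tex.outputs["Color"], sep.inputs["Color"])
links.new(sep.outputs["Red"], principled.inputs["Roughness"])
links.new(sep.outputs["Green"], principled.inputs["Metallic"])
# Blue (emissive) and alpha (AO) would be wired up according to your shader.
```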

A third option: use a multichannel image such as an RGBA .dds file; that is what we use constantly for game files. It is limited as to what you can pack into an RGB. As @Ratchet also says, it IS better to put the normal map in its own UV map: normal will use UV 1 and the rest UV 2, and yes, they can be (and are) mapped to different areas of the mesh in some cases. We have since dropped the multi-channel RGBA and moved to separate Diffuse > Specular > AO > Metalness > Normal maps, as the game engine is now pumping out 2K PBR in real time, as can be seen in Battlefront II (2017)…


and the asset in game…

Thanks so much for all the help! I have more than enough info to try one of these options.

You end up with something like this… this has the normal, metal, and AO baked into one texture.

Took me a second to figure out how you packed it. RG is normal (recomputing blue, I take it?) and the other two are on B and A. I’m curious: I’ve heard the alpha channel eats an extra draw call by loading the texture again in UE4. Is this outdated or misleading info? I see it repeated, but haven’t seen it debunked or verified.

Edit: aaannnd, that’s not how you packed that? Ok, now I’m determined to figure it out :stuck_out_tongue:
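
For reference, recomputing the blue (Z) channel of a tangent-space normal from its red/green channels is just a unit-length constraint; a quick Python sketch, assuming the usual 0–1 encoding:

```python
# Sketch: rebuild the blue (Z) channel of a tangent-space normal
# from its red/green (X/Y) channels, assuming the usual 0..1 encoding.
import math

def reconstruct_blue(r, g):
    """r, g are stored 0..1 channel values; returns the 0..1 blue value."""
    x = r * 2.0 - 1.0            # remap to -1..1
    y = g * 2.0 - 1.0
    z = math.sqrt(max(0.0, 1.0 - x * x - y * y))  # normal has unit length
    return z * 0.5 + 0.5         # back to 0..1 for storage

# e.g. a flat-facing pixel (0.5, 0.5) gives blue = 1.0
print(reconstruct_blue(0.5, 0.5))
```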

Yes…
And I do believe it is correct, but I’m not positive about the newer versions (I still use an older one), as it has to recalculate the shaders each time; I would have thought they would have fixed it by now.

As far as figuring it out… it would be far simpler to bake all your maps out of Blender, then pack them in an external program like GIMP or Krita, or lastly Photoshop (last because it ain’t free!).

For a fully PBR render process, you need
albedo + roughness + metalness + normal + emit + transparent
= 3 + 1 + 1 + 3 (or 2) + 1 + 1 = 9 or 10 channels; you can get down to 8 channels or fewer if you don’t need all of them.
A single RGBA image has 4 channels, and each channel stores 8 bits of data. So you can use 2 images, or a 16-bit image, to store 8 channels, or you could use 4-bit color if your game’s style is a bit pixelated.
Or, simply, use a GIF (each frame has a 256-color limit) and store the textures as an image sequence.
Maybe we need to make a custom image format for the PBR workflow in the future.
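
As a concrete (and entirely hypothetical) sketch of that channel budget, here is one way the maps could be split across two RGBA textures with NumPy; the layout is just an assumption:

```python
# Hypothetical layout: split ~9 PBR channels across two RGBA textures.
# The maps here are random placeholders standing in for baked images.
import numpy as np

h, w = 1024, 1024

def grey():
    """One greyscale placeholder channel."""
    return np.random.rand(h, w).astype(np.float32)

albedo    = np.stack([grey(), grey(), grey()], axis=-1)  # 3 channels
normal_xy = np.stack([grey(), grey()], axis=-1)          # 2 channels (Z rebuilt in shader)
roughness, metalness, emit = grey(), grey(), grey()

# Texture 1: albedo.RGB + roughness.A
tex1 = np.dstack([albedo, roughness])
# Texture 2: normal.RG + metalness.B + emit.A (transparency dropped to fit 8 channels)
tex2 = np.dstack([normal_xy, metalness, emit])

assert tex1.shape[-1] == 4 and tex2.shape[-1] == 4  # two RGBA images = 8 channels
```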