How to render the normal pass with Eevee?

The normal pass is selected

image

But the result does not look the way a normal map normally does.

I’ve tried a workaround that I found online:

With 0.5, 0.5, 0.5 gray values for the Multiply and Add nodes. But as you can see, the result is just a pale green now.
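
For reference, the workaround is just two MixRGB nodes (set to Multiply and Add) wired after the Normal output in the compositor, which remaps the -1…1 normals into the 0…1 display range. A rough Python equivalent of that setup (I built it by hand, so the script itself is an untested sketch; node identifiers are from the bpy API):

```python
import bpy

scene = bpy.context.scene
bpy.context.view_layer.use_pass_normal = True   # enable the Normal pass
scene.use_nodes = True
tree = scene.node_tree
nodes, links = tree.nodes, tree.links
nodes.clear()                                    # start from an empty compositor tree

rl = nodes.new("CompositorNodeRLayers")

# Multiply the normal pass by 0.5 ...
mult = nodes.new("CompositorNodeMixRGB")
mult.blend_type = 'MULTIPLY'
mult.inputs[2].default_value = (0.5, 0.5, 0.5, 1.0)

# ... then add 0.5, mapping -1..1 into 0..1
add = nodes.new("CompositorNodeMixRGB")
add.blend_type = 'ADD'
add.inputs[2].default_value = (0.5, 0.5, 0.5, 1.0)

comp = nodes.new("CompositorNodeComposite")

links.new(rl.outputs["Normal"], mult.inputs[1])
links.new(mult.outputs["Image"], add.inputs[1])
links.new(add.outputs["Image"], comp.inputs["Image"])
```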

The MatCap normal shader worked well, but I would need to switch to Workbench every time.
Do you guys know a better solution?

It looks like your normal map is an RGBA .dds file? That is a game-type format that packs the normal map into (if I remember right) the green channel… if you split the channels in GIMP or Photoshop you can then just use it as a normal map with a non-color texture image…
I went through all my files (mostly) and they are all changed already, so I can’t show you.

Have you tried the Vector Transform node with normals as input? Of course you need to know what space you need to convert to: tangent, object, world space…
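
If you want to set it up from Python rather than by hand, it is only a couple of nodes; a minimal sketch (whether 'CAMERA' is the right target space depends on what you are actually after):

```python
import bpy

mat = bpy.context.object.active_material
nodes, links = mat.node_tree.nodes, mat.node_tree.links

geo = nodes.new("ShaderNodeNewGeometry")         # true surface normal, world space
xform = nodes.new("ShaderNodeVectorTransform")
xform.vector_type = 'NORMAL'                     # treat the input as a normal
xform.convert_from = 'WORLD'
xform.convert_to = 'CAMERA'                      # or 'OBJECT', depending on what you need

links.new(geo.outputs["Normal"], xform.inputs["Vector"])
```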

Yes, RGBA, but as a PNG. So you mean switch some of the color channels and fill one of them with gray in PS? I can imagine that looking close to a good normal map, but there is still missing data. Some areas are still pitch black on the character.
I should be able to do the same thing in the Blender compositor instead of PS, right?

@Xortag
I haven’t tried that. I’m also new to Blender, so I will just mess around with that node until something good happens, or nothing at all. :) Thanks for the hint!

Actually it is a bit more complicated than that… I have not tried to duplicate it in Blender yet. But the process involves splitting the RGBA into channels, then taking, say, the green channel and adding it to a new image of the same size filled with pure blue, using a multiply blend… and if I remember right, that will produce your normal map. It will sort of work with just the grayscale image, but as you say it is incomplete and will sometimes end up partially transparent; if you run it through a Bump node and then into the Normal input, it might do…

I really haven’t used the compositor enough to say whether it has the ability… but with nodes in the compositor I believe it should work…
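
Something along these lines might do it (untested; the SepRGBA/CombRGBA node names may differ in newer Blender versions, the image path is just a placeholder, and which channel goes where depends entirely on how the game packed the map):

```python
import bpy

scene = bpy.context.scene
scene.use_nodes = True
nodes, links = scene.node_tree.nodes, scene.node_tree.links

img = nodes.new("CompositorNodeImage")
img.image = bpy.data.images.load("//packed_normal.png")  # hypothetical file path

sep = nodes.new("CompositorNodeSepRGBA")    # split the packed map into R, G, B, A
comb = nodes.new("CompositorNodeCombRGBA")  # rebuild only from the channels you trust

links.new(img.outputs["Image"], sep.inputs["Image"])
links.new(sep.outputs["G"], comb.inputs["G"])   # keep the green channel
links.new(sep.outputs["A"], comb.inputs["R"])   # alpha often carries the other component (assumption)
comb.inputs["B"].default_value = 1.0            # fill blue with a constant as an approximation

viewer = nodes.new("CompositorNodeViewer")
links.new(comb.outputs["Image"], viewer.inputs["Image"])
```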

I use either GIMP or IrfanView to do the conversion (I don’t have Photoshop).

Is your .png the same colors as displayed on the model? Sort of yellowish with hints of blue here and there? I ask because all game engines do their magic in different ways…

If you have a good diffuse/albedo map, you could just run your mesh through xNormal or Materialize and produce all the maps you need… it might save some time and effort.


Not sure if this is what you want/need:

It’s a material shader that renders all objects as a normal map in view space, so it can be used in a 2D game.
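
The idea behind it is simple: take the true normal from the Geometry node, convert it from world to camera (view) space with a Vector Transform node, remap it from -1…1 into the 0…1 color range, and feed that into an Emission shader. A rough Python sketch of such a material (not the exact file from the link, just the same idea; the Separate/Combine XYZ pair at the end is only there so individual axes can be flipped if your orientation differs):

```python
import bpy

mat = bpy.data.materials.new("ViewSpaceNormal")
mat.use_nodes = True
nodes, links = mat.node_tree.nodes, mat.node_tree.links
nodes.clear()

geo = nodes.new("ShaderNodeNewGeometry")

xform = nodes.new("ShaderNodeVectorTransform")
xform.vector_type = 'NORMAL'
xform.convert_from = 'WORLD'
xform.convert_to = 'CAMERA'

# Remap the -1..1 normal into the 0..1 color range: n * 0.5 + 0.5
mult = nodes.new("ShaderNodeVectorMath")
mult.operation = 'MULTIPLY'
mult.inputs[1].default_value = (0.5, 0.5, 0.5)

add = nodes.new("ShaderNodeVectorMath")
add.operation = 'ADD'
add.inputs[1].default_value = (0.5, 0.5, 0.5)

# Separate/Combine XYZ so single axes can be inverted if needed
sep = nodes.new("ShaderNodeSeparateXYZ")
comb = nodes.new("ShaderNodeCombineXYZ")

emit = nodes.new("ShaderNodeEmission")
out = nodes.new("ShaderNodeOutputMaterial")

links.new(geo.outputs["Normal"], xform.inputs["Vector"])
links.new(xform.outputs["Vector"], mult.inputs[0])
links.new(mult.outputs["Vector"], add.inputs[0])
links.new(add.outputs["Vector"], sep.inputs["Vector"])
links.new(sep.outputs["X"], comb.inputs["X"])
links.new(sep.outputs["Y"], comb.inputs["Y"])
links.new(sep.outputs["Z"], comb.inputs["Z"])
links.new(comb.outputs["Vector"], emit.inputs["Color"])
links.new(emit.outputs["Emission"], out.inputs["Surface"])
```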


Thank you for your help @RSEhlers, but what @Norgrin suggested is better suited to my needs.

@Norgrin, I did everything you mentioned except the plane in the background. The result is different, though. Even if I flip the red channel it still looks wrong. Do you have any idea what is causing this? The normal shader is applied to her skirt.

Never mind! The solution was to change the Combine XYZ values, since my model’s orientation was different.

image

There is still one thing I want to know: how can I apply the normal shader to all objects of my character and switch back and forth with the original materials, so I can render both the normal and the beauty pass?

Copy the normal shader into each material and switch the output as needed. Or save two different files.
Happy blending.

Wouldn’t that require manually reconnecting the nodes for all the materials that are applied to the character? In my case that would be 20–30 materials.
Maybe the separate-file idea is a good solution. I just don’t know how to practically implement it, since I would need to import the animation from the other file and recalculate the cloth simulation and other things. And if I make some changes to the character, I can simply import it over to the other file, right?

Thanks… was just getting to this, good tip. 👍🙂

The thing with cloth sims is that a simulation (physics, particles, etc.) will not give the same results on a different machine or in a different file. So you have to save the bake and reload it in the other file. As for the animation itself, you have to append it into the other file and assign it to your armature via the Action Editor.
So perhaps adding the shader inside the other materials would be better. And with a little Python script it should be possible to make a one-click switch.
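
For instance, a rough, untested sketch of such a switch; it assumes each material already contains the normal shader as a node named "NormalShader" and a "Principled BSDF" as the beauty shader (adjust those names to whatever you actually use):

```python
import bpy

# Toggle every material on the selected objects between its original surface
# shader and the shared normal shader node.
USE_NORMAL_PASS = True

for obj in bpy.context.selected_objects:
    for slot in obj.material_slots:
        mat = slot.material
        if not mat or not mat.use_nodes:
            continue
        tree = mat.node_tree
        out = next((n for n in tree.nodes if n.type == 'OUTPUT_MATERIAL'), None)
        normal = tree.nodes.get("NormalShader")        # hypothetical node name
        original = tree.nodes.get("Principled BSDF")   # whatever the beauty shader is
        if not (out and normal and original):
            continue
        # Drop the current Surface link and connect the wanted shader instead
        for link in list(out.inputs["Surface"].links):
            tree.links.remove(link)
        src = normal if USE_NORMAL_PASS else original
        tree.links.new(src.outputs[0], out.inputs["Surface"])
```

Run it once with USE_NORMAL_PASS = True before the normal render and once with False before the beauty render.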
