Exporting Into Blender: XNormal Creating Faulty Normal Maps

Hey guys

I've been working on a model in ZBrush and have finally exported it to Blender.

However, the normal map wasn't working correctly for the longest time, and I tested it in multiple programs like
Marmoset and Maya.
After much testing I finally figured out that the normal map itself was the problem.

I ended up taking the screenshots in Maya (the normal map gives exactly the same result in Blender),
and I'd like to ask you guys: what exactly am I doing wrong in xNormal?

The normal map keeps coming out faulty every time.
Does anyone have any ideas about the workflow I'm using, or tips on things to try?

When I apply the normal map it looks like this…

I can see that bits and pieces of the normal map's image actually end up on different parts of the model…




I'm using xNormal: I put the high poly into the high-definition meshes slot and the low poly into the low-definition meshes slot.

For the smooth normals setting I've tried both "Use exported normals" and "Average normals".

- When I choose exported or average normals, I make sure to use the same setting on both the high- and low-definition meshes.

For the baking options I leave everything at the defaults.
The only thing I've changed is the resolution, the second time around.

I made it 4096 x 4096

You're clearly not using the same UV map for baking in xNormal as you are using for texture mapping in Maya. That's not a problem with the normal map itself; it's more likely how you set up your materials. Maybe this mesh (which I assume you have not modeled or unwrapped yourself) has multiple UV coordinate sets for different purposes?
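A quick way to check this in Blender is to list the UV maps on the imported low-poly mesh; if more than one shows up, make sure the one xNormal baked against is the one the material actually uses. A minimal sketch for the Blender Python console, assuming the mesh is selected (the names printed are whatever your own file contains):

```python
import bpy

# List every UV map on the active object and flag the one used for rendering.
obj = bpy.context.active_object
if obj and obj.type == 'MESH':
    for uv_layer in obj.data.uv_layers:
        tag = " (active for render)" if uv_layer.active_render else ""
        print(uv_layer.name + tag)
else:
    print("Select the low-poly mesh first.")
```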

Once you have fixed that issue, here’s some more information:

Getting tangent-space normal maps right requires that all applications use the same tangent basis. I’m not sure how to achieve this in Maya (you can probably import tangents that you export from Blender), but I believe Blender and xNormal use the same tangent basis calculation (mikktspace). There’s also a tool called Handplane that lets you convert object space normal maps to tangent space normal maps for various target applications. If you don’t plan on deforming this mesh, save yourself that trouble and just use an object-space normal map.
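If you do go the object-space route, the Normal Map node in Blender's shader editor also has to be switched from tangent space to object space. A hedged sketch, assuming a material named "MyMaterial" (the name is purely illustrative):

```python
import bpy

# Switch every Normal Map node in the material from tangent to object space.
mat = bpy.data.materials.get("MyMaterial")  # hypothetical material name
if mat and mat.use_nodes:
    for node in mat.node_tree.nodes:
        if node.type == 'NORMAL_MAP':
            node.space = 'OBJECT'   # use 'TANGENT' for a tangent-space bake
            node.uv_map = ""        # empty string = use the active UV map
```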

For more information: http://wiki.polycount.com/wiki/Normal_Map_Technical_Details

You need to use dithering/bleed (edge padding) when you bake the normals.
Otherwise, I think you didn't check the merge option when you used the mirror modifier.
Don't average your normals if you don't have to; leave them as exported.

And please, neither xNormal nor Blender is good for baking anymore; they both exaggerate distance and proximity when baking. I suggest getting something else.

Check that your UVs match up with the texture.

Before you exported the normal map out of ZBrush, did you flip it vertically? The UVs will be inverted when you export the model, so you need to do this so the UVs and the texture match up.
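If you'd rather flip the UVs on the Blender side instead of flipping the map, mirroring every V coordinate around the middle of UV space does the same thing. A small sketch (run in Object Mode with the mesh active; this permanently edits the UV layer, so keep a backup):

```python
import bpy

# Mirror the active UV layer vertically: v -> 1 - v for every loop.
obj = bpy.context.active_object
uv_layer = obj.data.uv_layers.active
for loop_uv in uv_layer.data:
    loop_uv.uv.y = 1.0 - loop_uv.uv.y
```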

Moved from “General Forums > Blender and CG Discussions” to “Support > Materials and Textures”

Hi,

"Before you exported the normal map out of ZBrush did you flip it vertically"4
I did the normal map in XNormal

“The UVs will be inverted when you export the model so you need to do this so the UVs and the texture match up”

How can I invert the UVs?

Hi,

I modeled the mesh myself, and I did a quick UV unwrap in ZBrush using "UV Master".

How can I fix the issue that you're talking about?

Also, regarding the material setup: I know for sure that normal maps work, because I did a test with a brick normal map and it was applied all over the model.

So I'm pretty sure the problem happens during the creation of the normal map in xNormal.

But how can I address the "You're clearly not using the same UV map for baking in xNormal" issue that you mention?

Here's what the UVs look like…