Baking normal map issue

So I’ve been trying to use the technique of baking a high-quality model’s detail into a texture to use as a normal map, and applying it to a low-poly model. However, when I go through the baking process, I get some odd results: I end up with a purplish texture that appears to be rather low-quality, not the high-quality texture I’m trying to get.

Also, from what I can tell, wherever the mesh is a bit concave (say, the eye sockets), it seems to be covered in random rainbow colours. When I tried applying this texture to see whether it was actually fine and I just wasn’t seeing it, no extra details from the high-quality model were added; it just looks like the low-poly mesh.

Please add pictures and screenshots of what is happening, as well as your bake settings. It sounds like a few different things; we can’t tell until we see something.

Made this really quick for you.

Alright well, my bake settings are:

Bake Type: Normal

Margin: 8 px

Selected to active is checked

both clear and cage are unchecked

Ray Distance is 0.800

The normal settings are

Space: Tangent

Swizzle: +X, +Y, +Z

I had to add the images as attachments, since “insert image” won’t seem to insert them, but anyway. When I tried putting the baked normal map on, Blender for some reason applied the rainbow colours I mentioned earlier all across the mesh, and I have no idea why it’s doing this now when it didn’t do it yesterday.

I have a picture of the mesh in rendered view and its nodes (I was doubtful it would help, but thought if there was the slightest chance…), and one of the high-quality mesh. I also have a picture of the baked normal map, but I’m having trouble getting it to attach, so it might be a little while before I can get one up here. I’m not sure if I need to add any more, so let me know if I do.


Never plug a purple (vector) socket into a yellow (colour) socket in the node editor.
Move the normal output down to the Normal input on the diffuse shader.

Let me know if you need an explanation of how to set it up and I’ll make a video response.

Thank you finalbarrage, I hadn’t noticed that I’d plugged the purple dot into the yellow dot at all; fixed that. Also, thank you for linking what you did earlier, I’m watching that now while I have the time.

Also, I have a picture of the normal map now as well as a better picture of the low-res mesh in rendered mode after applying the normal map, so here they are:

Uncheck “Selected to Active”. I don’t know what it does; I’ve never figured it out. If you know how it works, sure, use it.

Also, under the normal settings, you have “Tangent”; set that to “Object”.

Selected to Active does exactly what it says: if you have two models, LP and HP, it bakes a normalmap that takes into account the differences between the two objects. You’re definitely interested in the Tangent-space normalmap, not the Object-space one. Read about the differences here, at “Object space versus tangent space in the context of normal maps”.

Selected to Active needs two meshes: one which you have sculpted and detailed, and another, low-poly model on which you want to use the normalmap to imitate the small details. Small, not huge; there’s always a limit to what a normalmap can fake.
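For a rough feel of why the Tangent space is the one you want here: a tangent-space normal is stored relative to the surface itself, so it stays valid when the object rotates, while a stored object-space normal goes stale. A minimal pure-Python sketch (the single hand-built face and its frame are made up for illustration):

```python
import math

def rotate_z(v, deg):
    # Rotate a 3-D vector around the Z axis by `deg` degrees.
    a = math.radians(deg)
    x, y, z = v
    return (x * math.cos(a) - y * math.sin(a),
            x * math.sin(a) + y * math.cos(a),
            z)

def from_tangent(ts, t, b, n):
    # Transform a tangent-space vector into object space using the
    # per-face frame (tangent, bitangent, normal).
    return tuple(ts[0] * t[i] + ts[1] * b[i] + ts[2] * n[i] for i in range(3))

# A face whose surface normal points along +X, with its tangent frame:
tangent   = (0.0, 1.0, 0.0)
bitangent = (0.0, 0.0, 1.0)
normal    = (1.0, 0.0, 0.0)

# A baked detail normal in tangent space: (0, 0, 1) means "straight
# out of the surface", no matter how the face sits in the world.
detail_ts = (0.0, 0.0, 1.0)

# An object-space map would instead store the object-space vector:
detail_os = from_tangent(detail_ts, tangent, bitangent, normal)  # (1, 0, 0)

# Now rotate the whole mesh 90 degrees around Z.
tangent_r   = rotate_z(tangent, 90)
bitangent_r = rotate_z(bitangent, 90)
normal_r    = rotate_z(normal, 90)

# The tangent-space sample still decodes to the correct new normal,
# because the frame rotated with the surface...
decoded = from_tangent(detail_ts, tangent_r, bitangent_r, normal_r)
# ...while the stored object-space value (1, 0, 0) is now wrong:
# the surface actually faces (0, 1, 0).
```

That is also why tangent-space maps keep working on animated or deforming meshes, which is the usual reason they are the default.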

Normally, the image type for a normalmap is 8-bit sRGB; yours looks like 16-bit linear.

Your UVs are not optimal at all, and you can clearly see that the distortions fall on the smallest UV islands. If the bleed Margin value is 8 px, you are probably seeing that bleed as distortion, while the actual island takes up much less space.

The normalmap’s Image node should be set to Non-Color Data, not Color.

Ahh, that makes sense, I didn’t quite understand the bake settings then. But a normal map is not non-colour data, is it? :o

Honestly I’m still rather new to this and normal maps, as well as the entire idea of render baking. I’ve been using a tutorial series to help me understand a few things. I don’t particularly have an immediate need to use this technique, but I imagine I may have a use for it somewhere down the line. Anyway, your explanation, eppo, has really confused me here.

Baking: if rendering an image is more or less taking a photo from one side, baking ‘takes the image’ all around the object, like those whizzing green rings going up and down the hero in sci-fi movies. The image taken can then be wrapped onto a similar object; however, to do this you need a way to tell how and where the image should be aligned, and that means UV coordinates. Just as real cameras can use different filters to capture certain ranges of data, baking can gather different kinds of information too: data about surface irregularities (normal maps), colour (diffuse maps), scene lighting falling on the object (lightmaps), and so on.

Normalmaps encode surface irregularities into an image using the three light components Red, Green and Blue; their respective proportions characterize the surface normal’s vector at a given point, without telling the actual height of that point.
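The mapping behind those proportions is simple, and it also explains the overall purplish tint of a correct tangent-space bake. A tiny sketch of the usual encoding (colour = normal × 0.5 + 0.5):

```python
def encode_normal(n):
    # Map each component of a unit normal from [-1, 1] into the
    # [0, 1] colour range: colour = normal * 0.5 + 0.5.
    return tuple(c * 0.5 + 0.5 for c in n)

# A flat, undisturbed point has tangent-space normal (0, 0, 1),
# which encodes to the light purple/blue that dominates every
# tangent-space normal map:
flat = encode_normal((0.0, 0.0, 1.0))    # (0.5, 0.5, 1.0)

# A point tilted fully toward +X pushes the red channel instead:
tilted = encode_normal((1.0, 0.0, 0.0))  # (1.0, 0.5, 0.5)
```

So the purplish texture you were worried about in the first post is actually the expected look of a tangent-space normal map; it is the rainbow patches that were the problem.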

Since Blender internally does all kinds of RGB magic on images (namely, gamma corrections back and forth), it is important to tell it not to do this on normalmap data, which is done by setting Non-Color Data on the Image node.
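To see why that setting matters, here is a small sketch, assuming the standard sRGB transfer curve (which is what Blender’s colour management applies to Color images): linearizing the pixels before they are decoded skews the normals badly.

```python
def srgb_to_linear(c):
    # The standard sRGB electro-optical transfer function.
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def decode(p):
    # Undo the normal-map encoding: normal = colour * 2 - 1.
    return tuple(c * 2.0 - 1.0 for c in p)

# The neutral "flat surface" pixel of a tangent-space normal map:
pixel = (0.5, 0.5, 1.0)

# If the Image node is left on Color, colour management linearizes
# the pixels before the data reaches the normal decoding:
corrupted = tuple(srgb_to_linear(c) for c in pixel)

decode(pixel)      # (0.0, 0.0, 1.0): the correct, flat normal
decode(corrupted)  # roughly (-0.57, -0.57, 1.0): badly skewed
```

Every “flat” pixel ends up pointing off to one side, which is a plausible source of the rainbow shading artifacts described above.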

Aha! That did it, and I also seem to have gotten Blender to give me a normal map with all the detail on it.

I am still a bit confused as to how this all works, so I guess I’ll probably have to look for some tutorials on just this or read about it in the manual etc. But I can’t thank you enough, eppo and finalbarrage, you both helped immensely.