Can someone help me figure out what the problem may be? As you can see, the normals are OK, but the problem persists, and because of this green/yellow “rectangle” (I guess), colors are displayed differently when I apply the normal map to a plane (see last image).
Can you share your node setup?
Have you tried recalculating the normals on your original mesh?
So where did you plug in your normal map?
I did; the second picture shows my original mesh with normals pointing in the right direction. Still not working.
How do you position the plane you’re baking to? Is it above or below your high-res mesh?
Does tangent space rely on UVs having been unwrapped?
Not on the high res mesh, no.
@visiorender: I’m not sure what exactly your question is targeting. But the result of applying the normal map is correct. What normal mapping does is deliver normals, i.e. direction vectors, for every point. And since the yellow is far off from the plane’s direction, the light is darker there. That’s correct.
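That darkening can be sketched with plain Lambert diffuse shading. This is a generic illustration of the math, not Blender’s actual shading code:

```python
# Minimal Lambert diffuse sketch: intensity = max(0, N . L).
# A normal tilted far away from the light direction gives a darker
# result, which is why the area under the yellow quad shades differently.

def lambert(normal, light_dir):
    # Both vectors are assumed to be unit length.
    dot = sum(n * l for n, l in zip(normal, light_dir))
    return max(0.0, dot)

light = (0.0, 0.0, 1.0)        # light shining straight at the plane
flat = (0.0, 0.0, 1.0)         # unperturbed plane normal
tilted = (0.8, 0.0, 0.6)       # normal bent far off the plane direction

print(lambert(flat, light))    # 1.0 -> full brightness
print(lambert(tilted, light))  # 0.6 -> visibly darker
```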
The problem is the normal map itself. That yellow quad is certainly wrong, and the rest of the colors also don’t look like the typical OpenGL normal-map channel mapping. Even the front-facing blue looks like it’s a bit off.
For any texture lookup a valid UV mapping has to be in place, so the model utilizing the map certainly has to be unwrapped.
Another problem that might or might not be present is the angle. Only certain angles can be used for the normal map.
So my original mesh is made from a plane, to which I added a Solidify modifier and a Bevel to define the edges. I want to bake the normal, roughness and diffuse maps to another plane. The plane I want to bake everything to is placed in the “middle” of the high-res mesh. The original mesh is unwrapped. When I apply the diffuse and roughness maps to another plane, everything shows the way it should, but when I add the normal map (which bakes with this yellow rectangle in the lower part), the area covered by the rectangle on the normal map also shows a different color. I tried baking in tangent space and object space too; they both act the same, nothing changes. I tried baking both with “selected to active” and without it. Not working.
I thought Blender defaults to generated coordinates if no UV is present.
See how he has the plane image unattached on both? Maybe the problem is that the same texture is capturing data on both the low- and high-res meshes simultaneously. It’s kind of a weird stretch, but I’m otherwise drawing a blank.
I’m gonna whip up a scene, see if I can replicate his problem.
Edit: that’s not it.
I can see the angle looks good.
It doesn’t matter in principle. What you need is a mapping at the time you write the map and exactly the same mapping at the moment you read it. But maybe I understood your question wrong; sorry if that’s the case.
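That write/read symmetry can be illustrated with a toy “texture” keyed by UV coordinates. This is purely illustrative and has no relation to Blender’s internals:

```python
# Toy illustration: baking writes data at a UV coordinate, and shading
# later reads it back with the SAME UV coordinate. If the two mappings
# differ, the lookup returns data for the wrong point on the surface.

texture = {}

def bake(uv, normal):
    texture[uv] = normal       # write pass: store the normal at this UV

def sample(uv):
    return texture.get(uv)     # read pass: look the normal back up

bake((0.25, 0.75), (0.0, 0.0, 1.0))

print(sample((0.25, 0.75)))    # matching UVs: the correct normal
print(sample((0.75, 0.25)))    # mismatched UVs: no (or wrong) data
```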
It’s your Solidify modifier that’s the problem. There’s no reason for you to have it on there, and it’s likely creating coplanar faces that are confusing the bake, doubly so with your target plane located smack in the middle of the high-res mesh.
Set it up more like this: remove the Solidify modifier, and put your target plane above your high-res mesh.
My high-resolution mesh is also a plane; I added a Solidify modifier to add thickness and a Bevel to define the edges. I can’t see how it would work without the Solidify modifier: how would the normal map be “extracted” from the original mesh if there aren’t any height variations?
Bake it to a simple mesh by applying your modifiers, and double-check that the geometry is OK.