Why isn't this Normal Map showing up in render or on the mesh?

I’m trying to learn to bake normals from high poly meshes to use for detail on low poly meshes.

I’ve got a high poly mesh of a rock shape (purple material, 601K verts) and I made a low poly copy (grey material, 3.4K verts) using the voxel remesh. You can see that they are sitting directly on top of each other in the image with both grey and purple. I think my Normal image looks pretty good, representative of the high poly mesh. I’ve cranked Strength up to 3 in the Normal Map node to see if that makes a difference. I don’t see the Normal Map influencing the low poly mesh at all.

Any thoughts about what I can troubleshoot here? Thanks.

Blender 3.3
Windows 10

I think you need to select a UV Map on your Normal Map node. You have a blank field (the one with the dot) that needs more information for the node to work.


This was/is definitely the issue. Thank you so much. I see now I need to figure out how to get these two models to have matched UV maps after I’ve made the low poly copy. So I’m back on track trying to get where I’m going. Thanks again!


You can try the Data Transfer modifier, or you can try copying and pasting the UV maps.

Could you attach your .blend here?

If no UV is connected, Blender assumes the default UV map, so there is no need to connect it there.

Why would you want to have matching UVs?



I’m going to look into this, thank you.

Matching UVs… The two meshes have different structures (obviously?) and I think UV maps are linked to the positions of the vertices. So, for the Normal map from the high poly mesh to place those details in the right place on the low poly model, the low poly mesh’s UV map needs to be in the same place in the image space.

If I have a model with a face, and the high poly mesh has its face UVs at the top of the map image, but the low poly mesh’s UV map has the back vertices in that spot in the image space, then the face details in the Normal image (baked according to where the high poly had those verts mapped) would show up on the back of the low poly model. It’s basically laying the Normal image over the UV map and putting the details wherever the low poly mesh’s UV map says they go. Did I explain that at all well?
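To make that concrete, here’s a toy sketch of the idea in plain Python (the image data and names are made up, not Blender API): the baked image is always read back through the low poly mesh’s UVs, so whatever UVs point at a texel is where that texel’s detail appears.

```python
# Toy "normal map" as a 2x2 grid of labels instead of pixel values.
normal_image = [["face_detail", "blank"],
                ["blank", "back_detail"]]

def sample(uv, image):
    """Nearest-neighbour lookup: uv in [0,1)^2 -> texel."""
    h = len(image)
    w = len(image[0])
    x = min(int(uv[0] * w), w - 1)
    y = min(int(uv[1] * h), h - 1)
    return image[y][x]

# If the low poly UVs for a BACK-facing vertex land where the bake
# wrote the face detail, the face detail shows up on the back:
print(sample((0.1, 0.1), normal_image))  # prints face_detail
```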

No, the UVs don’t have to be synced.
The high poly doesn’t have to have UVs at all.

I was in the car just thinking about this… If you’re baking “Selected to Active” then it should be applying itself to the low poly mesh, in 3D space, so wherever its UVs are, then there… Okay, that makes sense. In this instance I’m not getting any results. Once I started trying to get UV maps involved, at least I started to get some results, even though they were wrong.

And if not, then there is no point in baking something onto your high poly ¯\_(ツ)_/¯.

I guess that you haven’t checked “Selected to Active” and tried to bake with the high poly selected.

Okay, now that I’ve got myself sorted out, I’m returning to self-report in case it is of use to some future searcher.

I had two main problems — a conceptual one, and then a very basic bit of operator error. Gorion, looking at your responses I think all of that is right? Anyway, Selected to Active hasn’t been the problem; I’ve been working with that (and extrusion and ray distance) from the beginning. My two main problems: the target LowPoly mesh needs to have its UVs mapped, and I think I was inconsistent about that in my early attempts; but more fundamentally, I simply wasn’t selecting correctly. To make an Active Selection, the low poly mesh needs to be selected second WITH CTRL + LMB. Truly basic stuff. Being newer, I kept trying to select with Shift + LMB, so this process was never going to work, even with everything else in its right place.

Okay, so the best answer involved starting with a good low poly mesh, and going from there. The workflow that is working for me involves Blender and Instant Meshes. Blender Bones on YT had a video tutorial and Blender Knowledgebase (katsbits.com) had a page write-up on baking normals, and these two resources were the basis for my working answer.


I took notes on pretty much every step and setting. I’m leaving those notes here for future searchers. Sorry if it has mistakes.

High Poly (HP) Mesh to Low Poly (LP) Mesh and HP Detail to LP Normal Baking

Blender 3.3
Instant Meshes (v?)



Open HighPoly model.
With HP mesh selected: Item > Transform > Location > Lock XYZ.

Export HP obj file.
File > Export > Wavefront obj file.


Import HP obj file into Instant Meshes.
Open mesh > find HP obj file.

Make LowPoly mesh.
Remesh as: Quads (4/4)
Config details: Extrinsic
Target Vertex count: Set to target size (1-4K).
Orientation field: Solve > Comb until good outcome. (Navigate and Comb both use LMB, toggle Tool: Comb button.)
Position field: Solve > Edge Brush if necessary until good outcome.
Export Mesh >
Mesh settings: Pure quad mesh ON.
Smoothing iterations: 1.
Extract Mesh! If mesh has issues, return to Orientation field. If mesh is good, continue.

Export LP obj file.
Save > save LP obj file.


Import LP obj file back into Blender.
File > Import > find LP obj file.
With LP model selected: Item > Transform > Location > Lock XYZ.
Make sure HP mesh and LP mesh are occupying same space.

If necessary, fix mesh.

Meshes seem to come in around 6-9K verts. If necessary, reduce mesh:
Make copy: Shift + D (keep/stay in locked position). Rename.
With LP COPY mesh selected: Object Data Properties tab > Remesh > Mode: Quad > QuadriFlow_Remesh: Use Mesh Symmetry; Mode: Faces; Number of Faces: Set target size (1-4K).



UnWrap LP Mesh UVs with Seams.
Layout Workspace > Edit Mode > Edge Select > Select seam edges (LMB select, Shift + LMB add/remove selection, Ctrl + LMB select shortest path) > RMB: Mark Seam.
When all seams are set: Select All.
UV (U) > Unwrap: Margin: 0.075. (ft.?)
Adjust UVs if necessary (UV Editing Workspace > UV Editor > Select Island > Split > Selection > move/rotate/scale).

Create Material nodes for LP mesh.
Shading Workspace. LP mesh Selected.
Material Properties tab > New Material (Principled BSDF). Rename.

Create Normal target image.
In Shader Editor.
Shift + A > Texture > Image Texture > New > Name file; WxH: 2048 px or 4096 px (should match image/diffuse image size if such exists); Alpha ON; Generated Type: Blank; 32-bit Float ON for Normal maps > OK. Image node > Color Space: Non-Color.
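The same target image can also be created from Blender’s Python API. A minimal sketch using bpy (only runs inside Blender; the image name is hypothetical, values mirror the notes above):

```python
# Sketch: create the bake target image via bpy (Blender 3.3).
import bpy

img = bpy.data.images.new(
    "RockNormal",           # hypothetical name
    width=2048, height=2048,
    alpha=True,
    float_buffer=True,      # 32-bit Float, for normal maps
)
img.generated_type = 'BLANK'
img.colorspace_settings.name = 'Non-Color'
```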

Set Up Bake.
Render Properties tab >
Scene > Render Engine: Cycles.
Sampling > Viewport > Max Samples: 8.
Sampling > Render: Max Samples: minimal effect on render time. 4096??
Lightpaths > Max Bounces > Total: 2.
Lightpaths > Caustics > Caustics: Reflective OFF, Refractive OFF.
Performance > Memory > Use Tiling ON > Tile Size: Same as Image/Normal image size.
Bake > Bake Type: Normal.
Bake > Selected to Active ON > Extrusion: 0.2 ft.; Max Ray Dist.: 0.2 ft. (Change these to fix Normals issues, .1-.4 ft.)
Bake > Output > Target: Image Textures; Clear Image ON.
Bake > Margin > 24 px.
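For anyone scripting this, the bake settings above can also be set through bpy. A configuration sketch (runs only inside Blender; values mirror these notes, adjust to taste):

```python
# Sketch of the bake setup above via Blender's Python API (Blender 3.3).
import bpy

scene = bpy.context.scene
scene.render.engine = 'CYCLES'
scene.cycles.bake_type = 'NORMAL'

bake = scene.render.bake
bake.use_selected_to_active = True
bake.cage_extrusion = 0.2        # Extrusion (scene units)
bake.max_ray_distance = 0.2      # Max Ray Dist.
bake.target = 'IMAGE_TEXTURES'
bake.use_clear = True            # Clear Image
bake.margin = 24                 # px

# With the HP mesh selected first and the LP mesh active (Ctrl + LMB):
# bpy.ops.object.bake(type='NORMAL')
```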

Bake Normals.
Shading Workspace.
Normals target image node Selected in Shader Editor.
HP mesh Visible. HP mesh Selected first. Add LP mesh (also visible) to selection as Active Selection (must be selected second, Ctrl + LMB).
Save image.

Connect Normal Map to LP mesh.
Make Normal Map node.
Shift + A > Vector > Normal Map node.
Connect Normal Image Color OUT to Normal Map Color IN, then Normal Map Normal OUT to BSDF Normal IN.
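The same wiring can be sketched with bpy (runs only inside Blender; node names assume the defaults created in the steps above):

```python
# Sketch: wire Image Texture -> Normal Map -> Principled BSDF (Blender 3.3).
# Assumes the LP object's material already holds the baked Image Texture node.
import bpy

nt = bpy.context.object.active_material.node_tree
img = nt.nodes['Image Texture']        # the baked normal image
bsdf = nt.nodes['Principled BSDF']

nmap = nt.nodes.new('ShaderNodeNormalMap')
nt.links.new(img.outputs['Color'], nmap.inputs['Color'])
nt.links.new(nmap.outputs['Normal'], bsdf.inputs['Normal'])
```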

To adjust Extrusion and Max Ray and re-bake Normal Map, disconnect Normal Map node from BSDF first. Obviously, reconnect to see Normals on mesh. Both HP and LP meshes must be visible for Bake to engage.