Normal Map from B/W Bump


There are lots of old topics on this, so I don’t want to duplicate them, of course. But I can’t seem to find a straight answer, particularly in relation to the latest release of Blender. I tried the steps below to no avail.

I need a normal map because I’m exporting to glTF, which has no slot for a bump map.

I tried baking the emit of the Normal Map node in the shader editor, but this just doesn’t seem right to me. Below is the result of that baked emit. The object has UV mapping, which I think the emit bake doesn’t take into account.


At the moment my workaround is taking the black and white image into Photoshop, but surely this must be a standard thing people need to do in Blender: create a black and white mask texture with nodes and then get a normal map from that.

I can’t share the blend file; the object is just an insole. Any simple object with UVs and a shader using a black and white texture will do.

The vector that the Bump node outputs is in world space. Normal maps, on the other hand, are in tangent space, shifted and scaled to fit into the RGB range.
To do that conversion, you need something like this:
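As a rough illustration of that shift-and-scale step, here is a minimal Python sketch of the encoding (just the math, not the actual node setup; the function name is my own):

```python
# Minimal sketch of the "shift and scale" a normal map stores:
# a unit tangent-space normal in [-1, 1] is remapped to RGB in [0, 1].
def encode_normal(nx, ny, nz):
    """Encode a tangent-space normal into normal-map RGB."""
    return (nx * 0.5 + 0.5, ny * 0.5 + 0.5, nz * 0.5 + 0.5)

# A flat surface (normal pointing straight out of the surface)
# encodes to the familiar light-blue normal-map color:
print(encode_normal(0.0, 0.0, 1.0))  # (0.5, 0.5, 1.0)
```

In node terms this is just a Multiply-Add on each component, which is why the conversion setup is mostly vector math nodes.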


Thank you for this. Could you possibly update the instructions for 2.83+? Have any improvements/simplifications been made since?

Looks like the link for the node setup is dead?

Was this a free addon?

happy bl

One of my addons has this functionality (normals from height)


Would it be possible to generate a normal map from a bump map without baking?

I mean, if Photoshop can do it, would it be possible directly in the Shader Editor without baking?

Yeah, it’s totally possible, it’s just math. You can do pretty much any mathematical transformation in the Shader Editor. I think it hasn’t been explored a lot since you can do it so easily in Photoshop or in one of the many online converters, but if you want to do it in the Shader Editor, here’s the math (in JavaScript form):

It’s nothing too crazy, you’ll need some Power nodes, Greater Than nodes, Less Than nodes, Addition, Division, and Multiplication, at least that’s what I got from skimming that :slight_smile:
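For reference, the core of that height-to-normal math can be sketched in plain Python. This is a hypothetical, simplified version of what such converters do (take finite differences of the height field, normalize, then encode into RGB); the function name and border clamping are my own assumptions:

```python
import math

def height_to_normal(height, x, y, strength=1.0):
    """Approximate the tangent-space normal at pixel (x, y) of a 2D
    height grid using central differences, encoded as normal-map RGB."""
    h, w = len(height), len(height[0])
    # Sample the four neighbors, clamping at the image borders
    left  = height[y][max(x - 1, 0)]
    right = height[y][min(x + 1, w - 1)]
    down  = height[max(y - 1, 0)][x]
    up    = height[min(y + 1, h - 1)][x]
    # Gradient of the height field, scaled by the bump strength
    dx = (left - right) * strength
    dy = (down - up) * strength
    # Normalize (dx, dy, 1) to get a unit surface normal
    length = math.sqrt(dx * dx + dy * dy + 1.0)
    nx, ny, nz = dx / length, dy / length, 1.0 / length
    # Shift and scale from [-1, 1] into the [0, 1] RGB range
    return (nx * 0.5 + 0.5, ny * 0.5 + 0.5, nz * 0.5 + 0.5)

# A completely flat height field yields the neutral normal-map color:
print(height_to_normal([[0.0, 0.0], [0.0, 0.0]], 0, 0))  # (0.5, 0.5, 1.0)
```

In the Shader Editor the neighbor sampling would be done by offsetting the texture coordinates slightly before feeding them into duplicate Image Texture nodes, and the rest is vector math nodes.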


Thx !

Just wish I knew how to read code :sweat_smile:



That does complicate things slightly :sweat_smile: I’ll translate that into shader nodes at some point. It might be a couple of weeks since I’ve got a lot on my plate, but I’ll get to it unless someone else comes along to this thread and does it first.


Ohh thx, that would be useful to us, humble Blender users :slight_smile:


Hi there,
I may misunderstand your plans, but if you plug your Bump node into the Principled normal socket and bake a normal map, you should get exactly that.

Since the BW texture contains less data, the resulting normal map will too, but technically it will still be a normal map.

To get a fancy generated normal map, you would need to bake from a high-poly mesh, where the real displaced geometry becomes the normal map, with or without anything going into the Principled normal socket.

PS: just make sure the image texture used for baking is set to Non-Color, and so is the end result when you use it. You may also want to use a 32-bit depth image.


But what’s the point? If Photoshop can do it and you need it as a normal map, then just use Photoshop? The normal map won’t get any better resolution than the bump map already has, so if you’re staying within Blender, isn’t just using the bump map directly more suitable?

A normal map can store an arbitrary angle over an area as a single color, whereas a bump map would need a gradient (possibly 16-bit) to do the same thing; but here you’re limited to the bump map anyway. Bump maps are slightly slower than normal maps because the normals have to be calculated from the height info, whereas a normal map only has to look up the normal information directly. Normal maps are also dependent on the UV layout and thus have a more limited use case.

I actually prefer bump maps if I can get away with it since I’m staying within Blender, despite slightly longer render times.


So what you are saying is that a bump map converted to a normal map wouldn’t look any better?

My idea was that people converted bump to normal to get more realistic shading.

I have never used Photoshop.

What do you mean, if the UV layout changes, the normal map won’t work anymore?

No? Normal maps, being image textures, can use generated coordinates, object coordinates, world coordinates, UV coordinates, camera coordinates… you can even plug your own custom vector into an image texture and use that as the coordinates



It seems a height map can be used to convert to a normal map.

Although I don’t know if we would get better shading than just using our bump map?

Or is it just for faster render times and convenience when exporting to 3D engines?

Ehm, not what I meant. Bump maps also need coordinates of some sort. But you can’t manipulate the UVs once a normal map has been drawn, or the angles it’s supposed to produce will change. A bump map doesn’t care, because the normal is calculated on the fly. See the post here for an example of a baked discontinuous normal map producing no seams; the same can happen in reverse.


Not just. As I mentioned, over large areas some normal modifications can be represented with a single color in a normal map, where a bump map would require a high-bit-depth gradient, e.g. an inaccurately laid brick. Normal 0.5, 0.5, 1 would mean perfectly flat; just going to 0.6, 0.5, 1 would tilt the whole area covered by that single color. In a bump map you’d need a gradient to represent that as heights (from which normals are computed at render time), which needs high resolution and/or high bit depth.
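A small Python sketch of that point (hypothetical, just to illustrate the encoding): a constant slope over an area collapses to one flat color in a normal map, while the equivalent bump map needs a smooth ramp of height values.

```python
import math

def normal_from_slope(slope):
    """Normal-map RGB for a height field rising by `slope` per pixel
    along X: normalize the surface normal (-slope, 0, 1), then encode
    into the [0, 1] RGB range."""
    length = math.sqrt(slope * slope + 1.0)
    nx, nz = -slope / length, 1.0 / length
    return (nx * 0.5 + 0.5, 0.5, nz * 0.5 + 0.5)

# A linear height ramp (0.0, 0.1, 0.2, ...) has the same slope at
# every pixel, so the whole tilted area is this one single color in
# the normal map, whereas the bump map stores the full gradient.
print(normal_from_slope(0.1))
```

So the tilt costs one color in the normal map, but a ramp of distinct height values (hence resolution and bit depth) in the bump map.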

They both definitely have their uses. Just don’t assume normal maps are somehow magically better in all cases and don’t have its own quirks to deal with.

Frankly they both kind of suck. Before path tracing - say back in the 90s :smiley: - we were able to distort the “shadow channel” using normal modifications. Now all shadow calculations remain perfectly flat and are highly unrealistic. I’d use microdisplacement if I can afford it.


I see !

Thx a lot for your input, it helped me understand the differences and nuances better :slight_smile:

…I play video games from time to time…

Yea, I suppose POM or even tessellation are better alternatives to normal maps.

Or as you said microdisplacement.

In any case, I wonder how much we will need those in the future with the advancement of tech like Nanite in Unreal, which can display millions of polys in real time!

On the topic of bump shadows, can it be done using nodes in Eevee, since Eevee is a rasteriser?

Most likely not. That trick was from an old recursive raytracer (Realsoft3D) that didn’t involve the “enclosures” used in modern path tracers. Meaning it wasn’t a rasteriser, but it had deeper access to the light calculations than Cycles does. It was only a hack, but it sure improved the look of shadows.