ShaderToRGB can't feed into Bump? (EEVEE)

I’m trying to get a bump to distort a normal map.

I was expecting that feeding the ShaderToRGB node into the Bump node would result in a bump in the resulting normal map.

I do seem to get something very small out, but it seems to be nonsense.

Is this something to do with render order or something?

Normal Error.blend (666 KB)

It appears that the Bump node does not like working with the output of ShaderToRGB nodes! Not sure why. Whatever you were planning to do, do it a different way.

Thanks. I can’t find this problem listed anywhere. Can I at least file a bug report?

Yeah, do it! I’ve posted nearly 100 bug reports and more than half of them have been acknowledged and fixed.

Can you even wire just a Bump node into the Material Output? This shader setup makes no sense to me.


I’m with @BigBlend
This doesn’t work. What are you trying to do here?

I set this up as a minimum viable example of my issue, and yes, the Material Output can accept color values without any BSDF.

I am trying to get this 2D lighting effect to work how I want.

Right now it is distorting the UV map using the position of the light instead of the actual illumination on the plane.

The Bump node is one of the few nodes that sample more than one pixel, so it may just be impossible to get it to work how I want.
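For context on why the multi-sample point matters: a bump-from-height pass estimates the slope of the height field from neighboring pixels (finite differences) and tilts the normal accordingly. Here is a rough NumPy sketch of that idea; it is my own illustration, not Blender's actual implementation:

```python
import numpy as np

def bump_normals(height, strength=1.0):
    """Approximate per-pixel normals from a height field using
    central finite differences -- conceptually what a bump node
    does, though not Blender's exact math."""
    dx = (np.roll(height, -1, axis=1) - np.roll(height, 1, axis=1)) * 0.5
    dy = (np.roll(height, -1, axis=0) - np.roll(height, 1, axis=0)) * 0.5
    # normal = normalize((-dH/dx, -dH/dy, 1))
    n = np.dstack([-strength * dx, -strength * dy, np.ones_like(height)])
    n /= np.linalg.norm(n, axis=2, keepdims=True)
    return n

# A plane whose height rises along x: the normals lean toward -x.
h = np.tile(np.linspace(0.0, 1.0, 8), (8, 1))
n = bump_normals(h)
print(n[4, 4])  # x-component negative, z-component dominant
```

The key point is the `np.roll` calls: computing a slope at one pixel requires reading its neighbors, which is exactly the cross-pixel sampling the Bump node needs.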

Could I get a link to where I should submit this bug?

I’m sure the Material Output can accept a color from a color node, but that is more of a fail-safe than a practical way to render.

You can make that effect using regular shaders. Filing a bug report would just waste the devs’ time.

Go to the bug tracker and follow the bug report instructions.

I’m kinda confused about the workflow here, however. Why use lighting to create a bump map in the first place? Bump is only visible when you have a point or directional light to bring out the details on a surface. But if all the lights in your scene are also influencing the bump… it just seems like a headache. There’s no way in the shader nodes to delimit which light source influences the shader going into the bump and then which light source reflects off of that resulting bump.

There has to be a better way of doing this. Why not map a circular gradient texture to an empty and use that? I’m still not sure why it has to be a light that generates the bump.

I think material output detects that it doesn’t have a shader connected to it, so it adds an emission shader internally.

I can see no problem here. I tested the ShaderToRGB Node as height input for the bump node and it works just fine.

But try to use that resulting bump as a normal modification on a glossy shader or something. I have a feeling it has to do with the internal signal routing. Similar to how not all signals are updated and accessible when using displacement in Cycles (although that one is more obvious).

Leaving aside that the example is quite strange, what exactly is the problem?

I can plug the ShaderToRGB into the Height input of the Bump node. I can feed the Bump node normals and height at the same time, and in principle it mixes them. I can also use the result as the normal input of a Glossy or Diffuse shader.

Weird. I’m getting nothing. Used on a sharp Glossy, reflecting the environment. An OS-specific bug?

Hmmmmm. Really nothing? Diffuse BSDF in this case.

Perhaps you are all seeing the same problem. Running the bump texture through the Principled BSDF, as in the example setup, means it is attenuated with distance, since the lamp influences what arrives at the Bump node. The normal map, however, is fed directly into the Bump node, so its input won’t fade with light distance.

But as I said the original setup is really not good.


Changing the light type to e.g. a Sun is a quick, hacky fix, but in the end the setup has inherent problems.

Maybe I was trying the wrong thing. My attempt was to make the surface shiny so reflections appear on it; therefore whatever was reflected (the HDRI world) would create bumps shown in the next shader. I didn’t have a normal map attached, just a flat shiny floor reflecting the HDRI (which is infinitely far away).

Hmm, yes, maybe. I tried to stick to the original setup, and there the ShaderToRGB can be put into the Bump node and works just fine; it just has the drawbacks mentioned.

Okay, let me explain my use case a little. I recognize it’s a little unusual; perhaps there is a trivial way I’m missing to get what I want.

The characters will be animated in Toon Boom Harmony and exported as an image sequence with an alpha channel; they will then be placed into the scene on a surface.

I would like the lights in the scene to light the characters in a pleasing way, i.e. not like they are just on a plane.

When I shine a light on the plane, I would like to create rim lighting around the characters, wrapping around the alpha channel of the image.

The way I normally do this in After Effects is by coloring the character black, compositing it with a solid color card for the light, offsetting it, and adding it to the original.

It produces a nice effect and is very easy.
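For reference, that After Effects trick can be sketched in a few lines of NumPy: shift the character’s matte toward the light, keep only the edge sliver the shifted matte no longer covers, tint that sliver with the light color, and add it back. The function name and the offset value are my own, purely illustrative:

```python
import numpy as np

def rim_light(image, alpha, light_rgb, offset=(0, 2)):
    """Fake rim lighting: offset the matte, keep the uncovered
    edge, tint it, and add it to the image. A sketch of the
    compositing trick, not a production implementation."""
    dy, dx = offset
    shifted = np.roll(np.roll(alpha, dy, axis=0), dx, axis=1)
    rim = np.clip(alpha - shifted, 0.0, 1.0)  # edge the shifted matte misses
    return np.clip(image + np.asarray(light_rgb) * rim[..., None], 0.0, 1.0)

# 8x8 test: a square "character", light coming from the left.
alpha = np.zeros((8, 8))
alpha[2:6, 2:6] = 1.0
img = np.zeros((8, 8, 3))
lit = rim_light(img, alpha, light_rgb=(1.0, 0.8, 0.5), offset=(0, 2))
```

Shifting the matte to the right leaves a two-pixel rim lit on the character's left edge, which is the same "offset and add" behavior described above.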

What I am trying to do in blender is control the “offset” of the UV map based on the amount of light falling on the surface.

So if I had a surface that was illuminated like this

I would expect feeding the ShaderToRGB into the bump node would produce something like this:

I could then subtract this from the UV map of the character to produce something that is distorted “away” from the brightest light.

That I could mix with the original image

to produce something like this

Notice the nice taper on the arm that is only partly in the light
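The intended pipeline above (illuminated surface → gradient → UV offset → distorted lookup) can be sketched in NumPy, using a plain brightness gradient as a stand-in for the Bump-node output. The names and the strength value here are hypothetical, just to illustrate the math:

```python
import numpy as np

def distort_uv(uv, brightness, strength=0.05):
    """Push UVs away from the brightest area by subtracting the
    brightness gradient -- a stand-in for the Bump-node output
    in the setup described above. Illustration only."""
    gy, gx = np.gradient(brightness)          # slope of the illumination
    offset = np.dstack([gx, gy]) * strength
    return uv - offset                        # shift sampling away from light

# Brightness ramping up toward +x: every u coordinate shifts toward -x.
b = np.tile(np.linspace(0.0, 1.0, 8), (8, 1))
uv = np.dstack(np.meshgrid(np.linspace(0, 1, 8), np.linspace(0, 1, 8)))
out = distort_uv(uv, b)
```

Sampling the character image with `out` instead of `uv` would then pull its pixels away from the light, which is the taper effect being described.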

However, when I actually do it,


I just get blue.

Well, ALMOST blue. If I mess around with it I can get some artifacts to show up, so it’s doing /something/, just not what I expect.

This is certainly undocumented behavior and I will file a bug report on it.

Just a case of a missing shader to plug the bump into is all =)
