A Few Beginner Texture/Node Qs

Hi All,

Have searched around and watched a number of tutorials but am a little confused about a few texture maps and whether different workflows are simply down to user preference or if there is a “right” and “wrong” way to link nodes.

I think I’ve more or less got my head around the concept of reflectivity/specular (amount of reflection) vs roughness/glossiness (sharpness of reflection), but I’m somewhat confused as to how/why so many workflows seem to ignore spec maps. Surely shiny tiles with dirty grout in between them would have different levels of reflectivity over the surface? Or are all non-metals literally that close in spec levels that it’s purely the roughness factor controlling the reflection appearance? I guess that makes life a lot easier, it just seems somewhat counterintuitive to say that all non-metals basically reflect the same amount of light.

Similar to the above, some tutorials seem to advocate leaving the Specular slider at 0.5, others seem to adjust it up or down as they see fit with no real rhyme or reason, and others will actually plug in a spec map. If it’s true that non-metals more or less reflect the same amount of light, then why shouldn’t this slider always be set to that value? And secondly, I thought that in a metalness workflow the metal map was telling the shader exactly this - i.e. anything black is non-metal and gets a fixed reflectance value. What is the use of the specular slider then? It seems like two things trying to control the one characteristic (amount of reflection). And as far as I can tell, it’s not as if using one “disengages” the other… So basically just a little confused what I’m supposed to do with this slider.

Bonus question: Does leaving the Metallic slider set to 0.0, with no image input, just mean that it’s computed as pure black i.e. computed as the fixed reflectance value of 4% or whatever it is. Or do I actually have to get the plain black metalness map and plug it in every time?

Thanks very much!

Specularity is practically a legacy item these days. Most PBR setups don’t even call for it, relying exclusively on roughness/smoothness textures for surface shininess. You can still use it if you want to, and you can maybe even pull off a few neat effects through mixing and matching your shaders, but for a straight-up regular PBR workflow, it’s ultimately redundant.

Now metallic textures are kinda weird, mostly differentiating the kind of reflectivity a surface will bounce back at you. From what I understand, which admittedly isn’t too much, the biggest basic difference between a dielectric reflection and a metallic one is that the former will always reflect white light back at you, while metals will reflect light back at you colored by the hue of their surface.

And yes, leaving the metallic slider at 0 means that it’s all colored black, and thus produces no metallic shine anywhere on your texture.
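To put the dielectric-vs-metal distinction above into numbers: the usual PBR convention (a sketch of the standard metalness model, not Blender's exact internals) is to blend the reflection color between a fixed ~4% white for non-metals and the base color itself for metals.

```python
# Sketch of how a metalness workflow picks the reflection color (F0).
# Standard PBR convention, not Blender's actual shader code.
def f0_color(base_color, metallic, dielectric_f0=0.04):
    """Blend between a fixed 4% white reflection (dielectric)
    and a reflection tinted by the base color (metal)."""
    return tuple(
        (1.0 - metallic) * dielectric_f0 + metallic * c
        for c in base_color
    )

copper = (0.95, 0.64, 0.54)  # made-up example base color
print(f0_color(copper, 0.0))  # non-metal: uniform 4% white reflection
print(f0_color(copper, 1.0))  # metal: reflection tinted by base color
```

So a metallic value of 0 really does collapse everything down to that one fixed white reflectance, which is why the base color stops mattering for the highlight.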

I can make sense of most of that, assuming there is an underlying assumption that all nonmetal materials have essentially the same fixed reflectivity value (4% or whatever it is). That assumption certainly makes things easier (and I think that’s what most people are working with even if initially it seems a little counter intuitive to me) so it’s pretty neat if true.

But if what you’ve said is true, then I don’t understand why the specularity slider exists at all, or at least why it can’t be “switched off”. Because if the metal slider is at 0, then the shader reads the material as non-metallic, and, as far as I understand, the shader then determines that the reflectivity of the material is the fixed 4% value. That’s cool with me. But then there is the specular slider sitting right there, and moving it around clearly affects the material, so it’s in some ways now overriding the default (arguably superior) workflow of just having the reflectivity at a fixed value?

Is it just that leaving the specular slider at 0.5 means it’s “disengaged” and not having an influence? And in that case only lowering or increasing the value from 0.5 will start to adjust the default reflectivity set by the metalness? Like I mentioned earlier, it seems like there are two sliders trying to do the same thing, and so I’m just not sure how the shader is differentiating them or what the best way to “ignore” the specularity input is.

I’m looking into it right now, but I might’ve been wrong about how Blender handles these textures. I’m looking at things from a Substance Painter/UE4 perspective, which treat spec and roughness maps as two distinct, separate things. You’re working with either/or, not both together. The principled shader looks like it might use the specs for shininess, and the roughness for gloss.

…if this is true, it’d go a long ways towards explaining why I can’t seem to get my imported textures looking quite right in Blender. I gotta figure out what the hell’s going on here.

Found this on Right-Click Select.

That link was a great find, thanks for sharing, it clears up a lot.

“To be precise, moving the value of the specular socket in the [0, 1] range will give you 0-8% specular reflection, which is the typical range for most non-metals. The default value of 0.5 will give you 4% specular reflection.”

In which case I’ll stick to base colour, roughness and normal, and just leave that Specular slider at 0.5 by default knowing that it means it’s applying the 4% spec reflection value to all non metal materials.
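The quote above boils down to a simple linear remap, which is worth writing out (a sketch of the documented [0, 1] → 0–8% mapping, not Blender source code):

```python
# The Principled specular socket remaps [0, 1] linearly onto
# 0-8% reflectance at normal incidence, per the quote above.
def specular_to_reflectance(specular):
    return 0.08 * specular

print(specular_to_reflectance(0.5))  # 0.04 -> the 4% non-metal default
print(specular_to_reflectance(1.0))  # 0.08 -> the top of the non-metal range
```

Which explains why leaving it at 0.5 is the safe choice: it lands exactly on the typical non-metal value.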

This brings me to the next series of my beginner texture/node questions: normal maps.

There are Blender vector nodes for “Bump”, “Normal”, and “Normal Map” and I’m not 100% sure why, but as far as I can tell it’s never enough to just plug an image texture directly into the Normal slot, it instead has to go via one of these vector nodes.

I’m not entirely sure which one I should be using or why again. The easiest method seems to just be plugging the purple normal texture directly into the “Normal Map” node and then just plugging this straight into the Normal slot of the shader. Seems simple enough. Not sure what it’s doing but Node Wrangler shows it’s obv changed something.

Which leads me to believe that the “Bump” node is more for the older greyscale bump maps? Which then get plugged into the “Height” slot of the node, and the Normal output goes to the shader as expected. A few things that puzzle me with this though:

  • I swear I see people plugging purple normal maps into the Bump node, what’s the rationale behind this?

  • The Bump node actually has an input for “Normal”… this is rather confusing. If you already had something outputting a normal texture, then why would it get plugged into the Bump node rather than directly into the shader Normal input?

And not sure when I should use the “Normal” node with the spheres?

Not confusing at all. I use it all the time to chain additional bump effects onto either normal maps or other means of normal modification. Could you add bump heights together before feeding them into the bump mapping node? Most likely, yes, and it might even be more efficient to render. But in some cases you only want part of the main bump to affect the coating, and additional bumps only on the coating.
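The chaining described above makes more sense once you see what the Bump node's Normal input is for: the node tilts whatever normal it receives by the slope of the height field. A rough plain-Python sketch of that idea (illustrative made-up math, not Blender's actual implementation):

```python
# Rough sketch of what a Bump node does with its Height and Normal
# inputs: perturb an incoming normal by the height-field gradient.
# Purely illustrative; sample values are made up.
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def bump_perturb(normal, dh_dx, dh_dy, strength=1.0):
    """Tilt `normal` against the surface slope given by the height
    derivatives. `normal` can be the base surface normal, or the
    output of a previous Normal Map node, which is what makes
    chaining possible."""
    nx, ny, nz = normal
    return normalize((nx - strength * dh_dx,
                      ny - strength * dh_dy,
                      nz))

flat = (0.0, 0.0, 1.0)
print(bump_perturb(flat, 0.0, 0.0))  # zero slope: normal unchanged
print(bump_perturb(flat, 0.3, 0.0))  # slope in x tilts the normal
```

That's the rationale for plugging a Normal Map node's output into a Bump node: the bump detail gets layered on top of the normal map's result instead of replacing it.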

As for spec maps, I mostly use them for shadow gaps (but procedurally). Think spaced wooden floor boards with dark cracks between them, but you do it on a plane rather than model up the whole thing. If you only increase roughness you’d still get a very rough specular shine from it.


Using spec maps or not also depends on you being able to audit proper roughness maps. Rarely do I find specific roughness maps in a pbr material, it’s usually a greyscale modified diffuse from some automatic process. If you have a spec map, plug it into a ramp and see if you can make the material better with it - experiment. Making it look good is more important than some simplified theoretical approach like pbr which doesn’t account for too much.

Traditional specular maps were used for specular color (incl. brightness) and I think you might be getting confused by the fact that the principled specular slider does not work the same way. (Especially considering how many different ways “specular map” is already misused, they really shouldn’t have used this term for the shader input.)

Principled specular input is much more akin to IOR-- it controls the Fresnel. So, yeah, increase the IOR and you get more specular, but that’s not the same thing as a brighter specular color.

In reality, my understanding is, yes, most non-metals have the same-ish IOR, or not different enough to notice, which is why principled’s default 0.5 specular is appropriate for most stuff (and why it was chosen as the 0.5 level.) One goal of PBR shaders like the principled was that you could just plug real-world values into a mat and it would look right, under any lighting conditions, from any camera angle.
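The IOR connection above can be made concrete with the standard Fresnel relation for reflectance at normal incidence (a sketch from the Fresnel equations, not Blender source): F0 = ((n − 1)/(n + 1))², and a typical non-metal IOR of about 1.5 lands right on the 4% default.

```python
# Relation between IOR and reflectance at normal incidence (F0),
# from the Fresnel equations. Illustrative sketch only.
def ior_to_f0(n):
    return ((n - 1.0) / (n + 1.0)) ** 2

f0 = ior_to_f0(1.5)  # typical glass/plastic-ish IOR
print(f0)            # ~0.04, i.e. the familiar 4% reflection
print(f0 / 0.08)     # ~0.5, the matching Principled specular value
```

So the 0.5 default isn't arbitrary: it's the slider position that corresponds to an IOR of roughly 1.5.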

But tweak to your liking. PBR might be physically based, but until we have the budget for many trillions of verts (ie, not in my lifetime), it is not remotely realistic, and sometimes it demands compensation. In particular, models with less detailed geometry or weak normal maps usually end up with bad specular at 0.5. Or, sometimes, you’re using it to make something that’s not supposed to be realistic anyway.

If they’re doing that, they’re doing it wrong. There is no rationale.

Beg to differ! :stuck_out_tongue: Thanks for your post though, def improving my knowledge.

So in this instance you’re using the Bump node to allow you to combine the effects of both a greyscale bump texture (plugged into height) and a purple normal map (plugged into normal).

The general rule of thumb is that greyscale bump maps get plugged into the height input of the Bump node, and purple coloured normal maps get plugged into the colour input of the Normal Map node? (And for my relative beginner understanding I can more or less ignore the Normal node with the two spheres?)

That makes a lot of sense, and I’m assuming when we plug this spec map in via a ramp what we’re trying to achieve, generally speaking, is 0 spec in the deep shadows and then get the rest of the material to read as the 0.5/4% spec default? Almost the same way a metalness map distinguishes metal/non-metal, except we’d be telling the shader to distinguish between 0% spec reflection and 4%.

def agree, just want to make sure i’m starting from a solid theoretical foundation because then i can understand why/when/how to break “the rules” and why it might look better one way vs another. thanks again :slight_smile:

and super cool wood material if that’s yours, inspiring stuff!

I think I probably was too, the link that Renzatic found cleared a lot up for me, as well as what’s been said in this thread, many thanks :slight_smile:

That’s super valuable information, makes perfect sense, probably would have taken me a little while to fully understand and appreciate it by myself without your help so thanks for accelerating my learning :slight_smile:

Yes. And the normal map blue output can be connected to the normal blue input of the bump node (which also has a height). Note that the normal map texture can’t be rotated by a mapping node, but the bump map texture can. I can’t stand texture repeats, so I typically don’t use normal maps where the normal actually matters (like bevels on a tile). But they might be okay if they only reflect some random surface normal (like patterns on wood). I will often generate “random lay inaccuracy” on tiles simply by changing some random color output into a random normal (effect decreases near bevel).

Correct. If the shadows aren’t “that deep”, you can also use the AO map (multiplied with 0.5) to drive the spec. The lowered specularity can really make a visual impact.
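That AO trick is just a scale, which is easy to sanity-check in numbers (a sketch of the multiply described above, nothing Blender-specific):

```python
# Drive the Principled specular input with an AO map scaled by 0.5:
# open areas keep the 0.5 (4%) default, occluded cracks fall toward
# zero reflection. Sketch of the trick described above.
def ao_to_specular(ao):
    """ao is a grayscale ambient-occlusion value in [0, 1]."""
    return 0.5 * ao

print(ao_to_specular(1.0))  # open surface -> 0.5 (the 4% default)
print(ao_to_specular(0.0))  # deep shadow gap -> 0.0 (no reflection)
```

In node terms that's just the AO texture through a Math (Multiply, 0.5) node into the Specular socket.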

No, not mine. Probably not rendered. I just googled floor board and this one showed the crack :smiley: Sorry, should have mentioned it. I do have floorboards that do this, but I couldn’t access and check anything at the time (was rendering out something).

Awesome, thanks for all the help! :slight_smile: This thread has been really useful already but if I can just touch on one more topic before considering it done and dusted; I’m assuming displacing the mesh should really only be done via the displacement node and I also assume that a greyscale height map or displacement map should be used as the input and plugged into “Height”. But I notice that there is an input for “normal” as well, and I can’t say I’ve seen too many of the purple coloured normal maps that have been named as a height/displacement map.

I tried plugging a few normal maps in there and see some results, but there’s so many different ways to plug things around that I’m just after some guidelines on what the default best practice is.

So firstly, assuming we have access to a greyscale displacement map, does it get set to non-color and plugged directly into the “Height” input of the displacement? Or is there benefit to plugging it into the Height input of a Bump node and then using the Normal output of the bump node and plugging that into the Normal input of the displacement? I can even go via the Normal map node…? So basically which of the following pathways is correct:


I suspect/hope that the top path is the correct one, the other ways seem unnecessarily complicated and potentially flat out wrong, is there any reason to go those routes?

Secondly, assuming we don’t have access to an actual displacement map, I assume we can just try to use a bump map, or even the colour map or an AO map (or the normal map?) or some other map as the displacement and see what happens. The results might not be accurate or desirable, but on the off chance that it does look alright, are there any additional nodes that need to be added to “convert” the maps into something more useable? Or is it just a case of plugging them straight into the “Height” input of the Displacement node?

And just as a bonus round, why would you even want the material Displacement settings on Cycles not set to “Displacement and Bump”??

It seems like it should always be on, and if you don’t want to use actual displacement then just don’t put anything into the Displacement Input of the material?? But by default it’s set to “Bump Only” and it’s kind of annoying having to plug in a displacement map, and then also have to go and turn the displacement setting on. If I wanted “Bump Only” then I just wouldn’t use a displacement map… Am I missing something here?

Many thanks again!

Probably not. I don’t remember, but I’m guessing all textures are set to color by default if you simply add the texture node - how is Blender to know what goes in there and how you’re going to use it?

Set to non color and plug in manually. As for the other “paths”, I’ve never used the normal input for regular displacement. Maybe there are cases where you’d want to modify the normal as well as displace the geometry - but I think I might use modified normals at shader level. I.e. you could use displacement for grout/tile height, but use normals to simulate inaccurate lay angle of the tile. But I haven’t tested.

Nobody is going to stop you :slight_smile: But I wouldn’t use normal maps. And it’s not going to look right. Don’t forget you can always make a displacement map procedurally as well. However, displacement is massively expensive - I’m more likely to ignore displacement maps as displacement rather than adding them.

I prefer displacement only. If I’m using bump or normals I do it on shader level. So far I haven’t touched vector displacement. And I’m always using adaptive subdiv with default dicing rate (experimental feature enabled). I’m likely to do deep offsets with displacement, lay inaccuracy (tile work) with normals, and tiny details as bump. Displacement will affect the full geometry, normal and bump can be limited to specific shaders. Be very picky on where you use displacement due to the heavy cost.

That said, I’m usually just playing with it. I can’t be bothered using it for production scenes which are big enough already, and I need output fast.

Displacement map->displacement node->displacement output is correct.

Because the displacement node displaces in the direction of the normal, and has the opportunity to output normals, it needs to know what your normals are. If you don’t plug anything in here, it will use your base, unmodified normals. If you want, you can layer multiple maps using the techniques you’re showing in your second and third pathways, although there are a number of reasons those exact pathways aren’t really right. It’s advanced stuff, and you don’t really have to worry about it, but you can chain multiple normal maps/bump maps by running their normals serially into each other.

There isn’t really a difference between a bump and a displacement map, other than intended use-- they both represent per-texel height data, measured from the surface of the face. You can of course use anything you want to make an el cheapo bump map-- something, anything, is usually better than nothing. Running it through an RGB curves node is often a good idea for tuning if you want to do that; blurring in an image editing application is often a good idea for that as well (nodes don’t do blurring well, and the typical randomize-UV technique for blurring image textures is not a good way to blur bump maps.)
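If you do want to tune an improvised bump map numerically rather than eyeballing an RGB Curves node, a simple gamma/lift remap is one way to think about it (an illustrative sketch; the curve shape and parameter names here are made up, and whatever looks good is right):

```python
# One way to tune an improvised (e.g. diffuse-derived) bump map:
# a simple gamma/lift remap of grayscale height values, standing in
# for an RGB Curves adjustment. Illustrative sketch only.
def tune_height(value, gamma=2.2, lift=0.0):
    """Remap a [0, 1] grayscale sample: gamma > 1 deepens midtones,
    lift raises the floor so black areas keep some height."""
    return lift + (1.0 - lift) * (value ** gamma)

print(tune_height(0.0))  # black stays at the floor
print(tune_height(1.0))  # white stays white
print(tune_height(0.5))  # midtones pushed down by the gamma
```

The endpoints stay pinned while the midtones move, which is roughly what a gentle S-curve or gamma tweak in the Curves node gives you.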

If you go with displacement + bump and don’t plug anything into displacement, you don’t get bump mapping for your normals-- you have to use bump nodes manually and run them to all of your shaders, to everything like Fresnel that uses normal. The use of the displacement output for bump only can save you some time. (There are times, as CarlG mentioned, when you want different normals for different nodes, though.)