Cycles Principled BSDF violating energy conservation?

Something was not behaving as expected, so I did a simple ‘white furnace test’.

Fully white diffuse sphere in a fully white environment. Setting the specular to anything above one results in an image where the sphere is brighter than the background!
That should never happen, right?
Eevee is the same… so I guess it is either a flaw in the PBR definition, or it's meant to be like that?

Mixing a diffuse and a glossy shader does not lead to that result (and specular at zero with metallic only in Principled is also fine…)

Here's the screenshot:

I did something similar a while back:

What does it look like if you set the diffuse color to full black (0,0,0) and leave the specular at 1?

This way you can see how much of the environment gets reflected.

Edit: maybe the reflection “layer” is simply added on top; that would explain the behavior. If the calculations were correct, the Fresnel term would split the light into a reflected and a transmitted part, and the transmitted light that hits the diffuse material would never reach full white (like a Fresnel glossy/diffuse setup). Unless, of course, a Fresnel value higher than 1 is used.

IIRC, (T + R) = 1 (transmission + reflection).
Edit: (T + R + A) = 1 is more correct, but in this case we don't need the absorbance.
Equivalently: T = 1 − R, or R = 1 − T.

IIRC the specular/Fresnel term in the Disney/Pixar shader (which is the basis for the Blender shader) is a numerical approximation that skips a computationally expensive square-root calculation. My guess is that this approximation is only energy-conserving as long as the specular term stays within the intended range of 0 % to 100 %.
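To illustrate what "skipping the square root" means here, this is a small sketch (not Blender's or Disney's actual code) comparing Schlick's approximation against the exact unpolarized dielectric Fresnel reflectance, which needs a square root via Snell's law:

```python
import math

def fresnel_exact(cos_i: float, n: float) -> float:
    """Exact unpolarized Fresnel reflectance, air -> dielectric of IOR n.
    Uses the square root from Snell's law that Schlick's fit avoids."""
    sin_t2 = (1.0 - cos_i * cos_i) / (n * n)
    if sin_t2 >= 1.0:  # total internal reflection (only possible if n < 1)
        return 1.0
    cos_t = math.sqrt(1.0 - sin_t2)
    r_s = ((cos_i - n * cos_t) / (cos_i + n * cos_t)) ** 2
    r_p = ((n * cos_i - cos_t) / (n * cos_i + cos_t)) ** 2
    return 0.5 * (r_s + r_p)

def fresnel_schlick(cos_i: float, f0: float) -> float:
    """Schlick's approximation: F = F0 + (1 - F0) * (1 - cos_i)^5."""
    return f0 + (1.0 - f0) * (1.0 - cos_i) ** 5

n = 1.5                                # typical glass
f0 = ((n - 1.0) / (n + 1.0)) ** 2      # 0.04 reflectance at normal incidence
for cos_i in (1.0, 0.7, 0.3, 0.05):
    print(cos_i, fresnel_exact(cos_i, n), fresnel_schlick(cos_i, f0))
```

The two curves agree closely over the whole 0–1 cosine range; the fit was only ever tuned for physical F0 values, which is consistent with the guess that feeding it out-of-range specular inputs breaks its guarantees.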

That seems plausible, since in the Pixar shaders the value is determined by a color, and you can't get more white than white, which represents maximum specular.
I just fired up Maya with the latest RenderMan 23 and did the test with the PxrSurface shader, and it simply isn't possible (or I don't know how) to overcrank the specular value.

Wow! I didn’t expect this! I gave it a try and the results are embarrassing: totally wrong! I thought it was a problem with the GGX distribution, but it’s not. Try setting specular to 0 (so a totally diffuse shader) and play with roughness: 0 gives a darker Fresnel rim, 1 goes “over-white” (violating energy conservation), and 0.5 is somewhat correct, in the middle of the two errors. I think the error lies in the roughness code.
What is going on?!

Edit: still playin’…
Setting roughness to 0.5 gives a correct result, but if you introduce specular the wrong result appears, so somehow the specular is added on top of the diffuse component. How can this be? Is it intended in the Disney Principled shader, or is it a trivial error in the Blender implementation?
Gosh! I think I’m going to resume my old custom node groups for shading! :confounded:

I have not tested your test scenario, but what happens if you switch to Multiscatter GGX? Multiscatter should not have these roughness energy-conservation problems.

I fired up Arnold in 3ds Max 2018: no problem.

No matter what you set on the Standard Surface shader, it will never go over white. Only a small dip in metalness towards the edge, but this is known.

And @lsscpp: setting the roughness on a fully diffuse shader makes it darker too, but never brighter… (in Arnold)

This is starting to look more and more like a bug?

You will be more visible to developers if you discuss this here:

In Cycles, Metallic apparently works correctly with Multiscatter GGX (Roughness = 0.5).

BTW, in case it’s not obvious: make sure that tonemapping is turned off.

You wrote the same to @CarlG in his thread. He is absolutely right in saying this has nothing to do with it.

Tone mapping can only scale and bend the resulting data, or rather remap it from one domain to another (raw to tonemapped); it has no effect on the calculation of the data, which is where we see a problem here.

True, but it can exacerbate differences (or hide them). If you’re going to start talking about mathematical differences you have to remove the multipliers.

Manual says:

“To compute this value for a realistic material with a known index of refraction, you may use this special case of the Fresnel formula: specular=((ior−1)/(ior+1))^2/0.08”

Perhaps IOR should be changed too.
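Plugging a few textbook IOR values into the manual's formula shows how an ordinary dielectric stays within the 0–1 Specular range while a highly refractive material overshoots it (a quick sketch, not Blender code; the material IORs are standard reference values):

```python
def specular_from_ior(ior: float) -> float:
    """Blender manual formula: specular = ((ior - 1) / (ior + 1))^2 / 0.08."""
    return ((ior - 1.0) / (ior + 1.0)) ** 2 / 0.08

# water -> ~0.25, glass -> 0.5 (the default), diamond -> ~2.15 (out of range!)
for name, ior in [("water", 1.33), ("glass", 1.5), ("diamond", 2.42)]:
    print(name, round(specular_from_ior(ior), 3))
```

Note how glass (IOR 1.5) lands exactly on the default Specular of 0.5, while diamond already pushes the value past 1.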

This is one reason why I still use the individual shader components. Principled might be great for beginners, but weaknesses such as this are why the proponents of making it the only real shader node are really jumping the gun.

This formula is meant to derive the appropriate Specular parameter if you happen to know the IOR. However, if the result exceeds one (a very highly refractive material), the shader wasn’t designed to represent it. Specular in its original implementation should be clamped between 0.0 and 1.0.

The IOR parameter is not part of the original shader and only affects light ray samples which enter the volume in a (partially) transparent material.

While this is important knowledge for setting up a realistic shader, it has no direct relationship with the OP’s issue. The issue he describes is that more light leaves the material than hits it. This cannot happen for any physically correct material that does not act as a self-emitting material (a light source). This fact is what energy conservation refers to.
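Numerically, energy conservation means the BRDF times the cosine term, integrated over the hemisphere, never exceeds 1. A minimal Monte Carlo sketch of that check for a Lambertian BRDF (an illustration of the white furnace idea, not renderer code):

```python
import random

PI = 3.141592653589793

def lambert_brdf(albedo: float) -> float:
    # A Lambertian BRDF is constant over the hemisphere: albedo / pi.
    return albedo / PI

def hemispherical_reflectance(albedo: float, samples: int = 200_000) -> float:
    """Monte Carlo estimate of integral over the hemisphere of
    f(w) * cos(theta) dw, with uniform solid-angle sampling
    (pdf = 1 / (2*pi); cos(theta) is then uniform in [0, 1])."""
    rng = random.Random(0)  # fixed seed for a reproducible estimate
    total = 0.0
    for _ in range(samples):
        cos_theta = rng.random()
        total += lambert_brdf(albedo) * cos_theta * (2.0 * PI)
    return total / samples

r = hemispherical_reflectance(0.8)
print(r)  # close to 0.8: a passive material reflects no more than it receives
```

For any physical albedo the result converges to the albedo itself, never above 1; a shader whose equivalent integral exceeds 1 is effectively emitting light, which is exactly the over-white sphere in the furnace test.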

I made some simple tests. I put a fully white Glossy shader into the material output.

With GGX and roughness 0 I get the same white color with dark edges.
With roughness 1 I get a grey material color.

With Multiscatter GGX the color stays white.
Additionally, with Auto Smooth at 9.5° the dark edges go away.

But with the Principled shader and roughness 1 or 0, the bright white is still there even with GGX. It seems it is using Multiscatter GGX even when GGX is selected, or something else is causing this.

IIRC there was a problem with the clearcoat layer in the Principled shader? I don’t know if it was fixed. Maybe it’s causing this?

Here is a problem with energy conservation in the clearcoat:

Maybe the specular has a similar implementation?

Setting specular to more than one should give non-physical results, since values above 1 are not physically possible.
This is similar to setting a color value to (2.0, 2.0, 2.0), which exceeds the white maximum of (1.0, 1.0, 1.0) and thus leads to something that is simply unreal (the light reflected will be more than the light falling on the object, so light will effectively be ‘emitted’ from it).
I am no expert in how shaders work under the hood, but I think this isn’t a bug or a problem in Cycles, since it actually gives you controls beyond physics; people interested in PBR only shouldn’t exceed the allowed values.
So my question is: how does this affect the PBR workflow for anyone?

Yes and no. I think you are mixing things up here. The specular value in the Principled shader is not a raw Fresnel value of 1 if you enter 1; it is an artistic value, based on the formula posted before.

Even 2, or a value as high as 5, is OK for metals, for example.

Here, look at Schlick’s approximation (middle of the page) for dielectrics. It’s a basic Fresnel formula.

To get a more artistic value, Disney divided the result by 0.08.

But… they convert the artistic value (in the Principled UI) back to a Fresnel value used inside the shader.
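The round trip from the UI value back to physics can be sketched like this (an illustration of the remap described above, not the actual shader code):

```python
import math

def f0_from_specular(specular: float) -> float:
    """Undo the Disney remap: F0 = 0.08 * specular (specular 0.5 -> F0 0.04)."""
    return 0.08 * specular

def ior_from_f0(f0: float) -> float:
    """Invert F0 = ((n - 1) / (n + 1))^2 to recover the implied IOR."""
    s = math.sqrt(f0)
    return (1.0 + s) / (1.0 - s)

f0 = f0_from_specular(0.5)
print(f0, ior_from_f0(f0))  # 0.04 -> IOR 1.5, the common dielectric default
```

So a UI Specular of 1 means F0 = 0.08 (IOR ≈ 1.8), which is still physical; the shader just isn't built to stay correct for arbitrary inputs beyond that range.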

For mathematical differences you should examine the raw data by right-clicking the pixel and checking the non-color-managed side. When things show up different with Filmic, it means the difference is pretty significant. If the white furnace test is done in sRGB, there wouldn’t be a visible difference (brightness 1 and 1.5 both map to 1), making it less obvious unless you start tweaking the exposure.

It has nothing to do with this issue. But no, I don’t think it should be changed. I use 0.5 for pretty much everything and tweak it artistically (sometimes with maps) when I need to, as intended. Having values in the 0–1 range with a linear response makes it immensely easier to tweak procedurally than having to deal with non-linear IOR ranges.

It shouldn’t really, unless you’re into theoretical stuff. We’re supposed to work with realistic albedo values obtained from cheat sheets, which kind of sidesteps the whole thing.

I don’t know about the Blender implementation, but from what I’ve read, Disney’s original shader uses the IOR for diffusely reflected light too, rather than a purely Lambertian model. So light penetrates the surface, bounces around a bit, comes back up to the surface (approximated, of course), and then the IOR at the boundary determines which way the light goes. With Lambertian diffuse (Oren–Nayar is a different beast), it scatters fully randomly in all directions. That leads to a darkening around the edges (unrelated to GGX multiscatter), which they then compensate for. Disney knows it’s not energy-conserving.