saturated HDRI emission

Hello.

I compared my own renders with a tutorial (linked below) and found that my HDRIs emit too much colour when the strength is cranked up to 20, 30… at that point I expected the light to turn white. Is this an effect of 2.74, a hardware trait, or can it be fixed with a setting?

Blend file: http://www.pasteall.org/blend/36539

Here the floor ends up very blue, and with one of my own environment maps containing a lot of grass, the entire ceiling turns green every time. That is physically correct, but the saturation breaks the realism. Trying to desaturate has not given me any useful results either, and I would rather cure the cause than cover the symptoms.

http://www.pasteall.org/pic/89380

Comparison, skip to the 13th minute:

Tested on a GeForce GT 635M and a Quadro K2000, with Blender 2.74 and three different environment maps.

Please check your blend file before uploading!!

Texture missing

…and btw: If that file is in .jpg format, it is most likely no HDR image!

Yeah, an HDRI format would be a start :slight_smile: In case this refers to a different image (like a background jpg, sorry, can’t test the file right now): judging by the file name, this might be an outdoor image, perhaps with a sun? I have problems with these too, because they are rarely “fully defined” - that is, the sun will often still be “fully white” at the lowest exposure. Where this happens varies, and I think a “one solution fits all” approach is impossible to come up with.

So I made a set of scaler nodes that multiply the value (brightness) of the image above an adjustable threshold, leaving hue and saturation untouched. It seems to work for what I’m doing now, but it might not work in all situations, and it probably should not be used for indoor HDRIs or when only the sky (no sun) is visible. It has a preview slider so you can isolate the area you are affecting. The top of the image is just a node group to rotate the HDRI (I’m only using equirectangular ones).

The top part of the image just shows what is inside the Util.RotateZ.Group.

Usage:
Select your HDRI in the Environment Texture.
Rotate so you can see the sun.
Turn on Preview mode [1] and set multiply to 100 for now. The sun should turn black.
Adjust threshold until only the visible disc of the sun is covered (if possible).
Turn off Preview mode [0], and rotate texture until you get the rotation you actually want.
Scale the multiply value down as needed, but it should still leave a bright highlight on even a slightly shiny material.
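To make the idea concrete, here is the per-pixel math the scaler group boils down to, sketched in plain Python. This is only an illustration of the technique, not the actual node group; the threshold and multiply values are made-up examples, and the real setup also has the rotation and preview parts:

```python
import colorsys

def boost_sun(rgb, threshold=0.9, multiply=100.0):
    """Multiply the value (V) of a pixel above `threshold`,
    leaving hue and saturation untouched - the same idea as
    the scaler node group described above."""
    h, s, v = colorsys.rgb_to_hsv(*rgb)
    if v > threshold:
        v *= multiply
    return colorsys.hsv_to_rgb(h, s, v)

# A sun pixel clipped to white becomes a proper HDR peak:
print(boost_sun((1.0, 1.0, 1.0)))  # -> (100.0, 100.0, 100.0)
# A darker sky pixel below the threshold stays untouched:
print(boost_sun((0.4, 0.5, 0.6)))
```

In the node editor the same thing is done with Separate/Combine HSV and a Math node comparing V against the threshold.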

The node setup is shown here:


Oh, and yet another thing:
The missing .jpg file you use is named “360Panorama[…].jpg”, which seems to indicate an equirectangular image. But you have set the environment texture to use “Mirror Ball” projection…


Other than that, it’s not surprising that a blue sky creates a bluish colour cast in the render, because that’s what happens in real life, too. It’s just that our brain is so well trained in what is supposed to look white that it does an automatic and mostly perfect white balance on what we see with our eyes.

In modern cameras the blue tint caused by sky light is dealt with by a (more or less successful) automatic white balance; before that, in the age of analogue photography, colour film was daylight-balanced by simply making it less sensitive to blue light.

Now, Cycles (to my knowledge) does not do any white balancing on the render, which means you have to do that manually afterwards (if needed) in an image editor of your choice.
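Manual white balance in post essentially means dividing each channel by the colour that should read as white. A minimal sketch of that idea, assuming linear-light RGB values (the reference colour here is a made-up sample, and this is not a Cycles feature):

```python
def white_balance(pixel, reference_white):
    """Scale each channel so that `reference_white` maps to neutral white.
    Assumes linear RGB; apply before any gamma / view transform."""
    return tuple(c / w for c, w in zip(pixel, reference_white))

# A blue-tinted patch sampled from the render that should be white:
ref = (0.8, 0.9, 1.0)
print(white_balance(ref, ref))              # -> (1.0, 1.0, 1.0)
print(white_balance((0.4, 0.45, 0.5), ref)) # a half-as-bright patch turns neutral grey
```

Image editors hide this behind an eyedropper tool ("pick neutral point"), but the underlying operation is the same per-channel scaling.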

…when cranked up to 20, 30… at that point I expected light to be white.
Why would a very bright blue light suddenly become white?

Thank you, but as I said, I used three different environment maps and it didn’t change anything. I wanted to know whether the same happens to someone else, ideally without using the exact maps I did - because changing them didn’t do anything for me.

Okay, so it’s not 32 bit. I chose to use a jpg because of the following reasons:

  1. an EXR was 200 MB+, it was slow, and it didn’t seem to make a difference either way
  2. the tutorial uses a jpg and the colour is fine!
  3. once the strength is raised to 20 or 50 or whatnot, the combined colour from the sky as well as the grass turns white in the viewport. The same thing happens with blown-out highlights in photography, so I hoped it would work the same way.

As for the mirror ball… that happened when experimenting, but I also noticed no visible difference between the two.

The whole point of an HDR image is to have a wide range of luminance levels, e. g. a very bright sun spot in an otherwise modestly lit scenery, which will give you a strong directional lighting quality for your render without having to increase the strength to insane levels.

Taking an LDR image and cranking up the brightness is absolutely not the same, as the brightness of that image is cranked up across the board. It is still an image with low dynamic range afterwards - only on a much higher level. And since you increase the brightness of the sky and grass areas as well, it’s no wonder these colours heavily influence your scene. Even if you could crank the brightness up so much that all colours are blown out to pure white: why use a texture as environment at all then? In that case you could just use a plain white environment, as even the feeble directional lighting qualities of an LDRI will be overpowered by the general brightness… :eyebrowlift2:
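The arithmetic behind this is worth spelling out: multiplying every pixel by the same strength leaves the ratio between the brightest and dimmest pixel - the dynamic range - completely unchanged. A tiny sketch with made-up illustrative values:

```python
# Dynamic range = ratio of brightest to dimmest useful value.
def contrast(bright, dim):
    return bright / dim

ldr_sun, ldr_sky = 1.0, 0.5       # clipped LDR values: sun is only 2x the sky
hdr_sun, hdr_sky = 5000.0, 0.5    # a real HDR capture of the same scene

strength = 50.0
# Boosting the LDR image does not change the sun/sky ratio at all:
print(contrast(ldr_sun * strength, ldr_sky * strength))  # -> 2.0, same as before
# The HDR version carries the directional punch in the data itself:
print(contrast(hdr_sun, hdr_sky))  # -> 10000.0
```

That 2:1 ratio is why the boosted LDR environment gives almost shadowless, heavily tinted light, while the HDR version casts clear shadows at modest strength.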

There are HDRIs out there that are far from being 200+ MB in size. And if those are not high res enough to also be used as a background image, well, just use your high res jpg for the camera and the low res HDR for lighting. Or render with transparent background and add a backplate image in post…

From top to bottom:
a) HDRI (strength = 2) as environment texture. Light direction is perfectly clear (shadows).
b) Same image(!) as LDRI (strength = 4). Light almost completely diffuse, strong blue tint.
c) A bit of both: LDRI for the camera, HDRI for everything else.


And another example: See how much “definition” the light loses by using an LDR version (bottom) of the very same HDR file (top)?