Looking for advice on fixing a lighting issue with large renders. I've simplified the scene below so the problem is visible. There's a single lamp in the middle of the room, distance 15, energy 0.8. I've used black lines to illustrate the issue – what looks to me like a light-falloff computation artifact: concentric rings of light spreading outward along the surfaces, showing up as overly dark steps in the falloff gradient.
Is there a way to get this light falloff more uniform? It's more visible in large renders than small ones (you'll need to view the full-sized image below to see it distinctly). I've tried all the different lamp falloff algorithms and none of them seem to help at larger resolutions.
Thanks – I’ll try your suggestions, but as far as I can tell, Blender only allows me to save a rendered image as JPG. I could animate one frame I guess, but what file format do you recommend?
Are you sure it's my display? I am using a 1920x1200 WUXGA panel, and the banding does change depending on the falloff option I use for the light; for instance, inverse-square produces a different compression of the bands.
The image looks fine without banding on my monitor. Blender does not produce banding, and the JPEG format is perfectly suitable for storing images without noticeable banding. So the only other factors to consider are your display card settings or your monitor. Check in your display settings that you are set for 32-bit color (16.7 million colors). If so, then the issue is with your monitor. I would have bet it was a cheap LCD monitor with only 6 bits of actual color resolution per channel – twisted nematic (TN) technology – but based on your specs I doubt it. That said, it is not entirely impossible that a WUXGA panel would still use TN technology, so… what brand and model is your display?
Note that changing the light falloff characteristics will necessarily change the band positions, because the bands are a screen color-space issue, not a lighting issue.
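To illustrate the point, here's a toy Python sketch (the falloff formulas are illustrative approximations, not Blender's actual attenuation code): the 256 output levels are fixed, but different falloff curves cross those level thresholds at different distances, so the band edges move when you change the falloff function.

```python
# Toy sketch: two different falloff curves quantized to 8 bits.
# The quantization thresholds never move, but each curve crosses
# them at different distances, so the band positions differ.
# Formulas below are illustrative only, not Blender internals.

def inv_square(d, energy=0.8, lamp_dist=15.0):
    return energy * lamp_dist**2 / (lamp_dist**2 + d * d)

def inv_linear(d, energy=0.8, lamp_dist=15.0):
    return energy * lamp_dist / (lamp_dist + d)

def band_edges(falloff, steps=2000, max_d=30.0):
    """Distances where the quantized 8-bit value steps to a new level."""
    edges, prev = [], None
    for i in range(steps):
        d = i * max_d / steps
        q = int(round(falloff(d) * 255))
        if prev is not None and q != prev:
            edges.append(round(d, 2))
        prev = q
    return edges

print(band_edges(inv_square)[:5])  # first few band edges, inverse-square
print(band_edges(inv_linear)[:5])  # different positions, inverse-linear
```

Both curves band, but at different places – which matches what you're seeing when you switch falloff options.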
My machine is a Dell 9100 laptop with a 1920x1200 WUXGA display and an ATI Mobility Radeon 9700 video card (128 MB). Color is set to "Highest (32 bits)" and everything in the ATI control panel is set to the highest possible quality, except AGP, which I have dropped to 4x.
I have a few reasons I’m skeptical that it is a display issue:
I don't see banding on other people's posted renders, or in any other application – except, for instance, when I bring the Blender image into Gimp.
The banding only occurs in low-energy light situations; if I boost the light energy and distance, the banding is gone, but of course that's not the scene I want.
The banding almost entirely goes away if I set the lamp falloff to Lin/Quad Weighted with the Linear and Quad values at 0. If I crank the Quad value up to 1, the banding is back and particularly bad. Again, only for a low-energy light.
Wouldn't my lamp energy be irrelevant if it were a display issue? It seems like my machine, or something I'm doing, can't handle the low-energy falloff smoothly, so quantized strata appear on the walls – in the diffuse or specular shading – at what look like mathematically regular intervals.
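For what it's worth, the low-energy observation is exactly what 8-bit quantization predicts: a dim gradient only has a small slice of the 256 output levels to work with, so each step is wider and easier to see. A quick back-of-the-envelope sketch (the 0.2 range is illustrative, not measured from your scene):

```python
def levels_available(lo, hi):
    """Distinct 8-bit values a float intensity range [lo, hi] maps onto."""
    return int(round(hi * 255)) - int(round(lo * 255)) + 1

# A full-range gradient can use all 256 levels; a low-energy gradient
# spanning only 0.0-0.2 gets roughly a fifth of them, so its bands
# are about five times wider on screen.
print(levels_available(0.0, 1.0))  # 256
print(levels_available(0.0, 0.2))  # 52
```

Boosting the energy spreads the gradient over more levels, which is why the banding disappears when you crank it up.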
OK, I see the banding. The bands are barely distinguishable, but they are there. I also see JPEG compression artefacts. Could it be the JPEG compression? Try outputting to PNG and see if you still get the banding.
I'm trying to figure out why changing the attenuation function would matter. Those values are computed with floating-point colors, so it shouldn't. Post your sample .blend project so we can take a look.
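A guess as to where it could still enter despite the float math: the final write to an 8-bit image channel. A minimal sketch of that collapse (toy linear ramp, not Blender code):

```python
# The float falloff values are all distinct; quantizing to an 8-bit
# channel collapses them onto a much smaller set of levels, and those
# levels are what show up as bands in the saved image.
floats = [0.8 * i / 1999 for i in range(2000)]     # smooth 0.0-0.8 ramp
quantized = [int(round(v * 255)) for v in floats]  # 8-bit write

print(len(set(floats)))     # 2000 distinct float values
print(len(set(quantized)))  # collapses to 205 levels
```

So the attenuation function itself can be perfectly smooth and the bands still appear at output time.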
Sorry to revive an old issue, but a reasonable test for whether it is your monitor or not is to move the image window and see if the banding moves with it.
I found, as others have, that turning on any sort of OSA or FSA in the render helps with the banding, so you could try rendering two versions – one with OSA and one without – and then combine them in an image editor, or possibly even in the node editor.