2.49b & 2.53 brightness changes when rendering animation

I’m using Blender 2.49b, and I’m rendering an animation from frame 300 to 440. Some random frames show a drastic change in brightness when rendered. When re-rendered, the problem continues…


The strangest thing is that when I change the number of AA samples and render those same frames, they render normally…


I am rendering the combined, Z and normal passes and using composite nodes to adjust the render afterwards. I’ve confirmed that the problem comes from the render, not the compositing.

I’ve tried rendering with 2.53 and the problem persists in the exact same frames.

(I’m sorry I can’t show the full render, but it is for work and until it’s finished I’d like to avoid posting it on the web.)

Does anyone know what is happening and how I can solve it?

On closer inspection, there are more frames with brightness changes; I missed them because the difference wasn’t so obvious… I’m getting a bit worried. :frowning:

I’ve re-rendered the whole animation with 5 AA samples and there are still brightness shifts… but at different frames… =(

At least you get a frame out; sometimes 2.53 skips a frame for me (just black). But I haven’t had that issue for some time now (a couple of months). Have you tried other releases?

It is hard to say without looking at the blend. You don’t have an animated light in your scene, do you?

Do you have places in the animation where faces are crossing each other or otherwise overlapping? I’m wondering if it’s a form of z-fighting.

I have animated objects, particles, and the camera lens and position. The lights are static, I’m not using AO, and there is no animation on the exposure value.
It’s the first time this has happened to me; I’ve been using Blender for a few years now and I have no clue what is happening here… the strangest thing is the anti-aliasing behaviour…

Oh, and it isn’t z-fighting; the whole image is darkened, not just some objects…

OK, apparently the problem has something to do with the Z-depth pass (I was using it to mix a color over the image).
I am using the Normalize node to normalize the channel, but between frames there is a huge difference… Any ideas why?

It isn’t the Z-depth pass… it’s the Normalize node, I’ve confirmed it now. :\ One solution is to skip Normalize and use the Map To node instead, but the result is a bit different (especially with a moving camera…). Is this a bug or am I doing something wrong?

The Normalize node can be very tricky to use effectively. Remember that it takes the entire range of values passed to it and then expands or compresses that range into values from 0 to 1.0. So the values being input are critical in determining the node’s output. The Z-depth buffer has data that can change a lot very swiftly, even frame to frame, so the range of values being input to the Normalize node can also change swiftly.

Example: The Z-depth buffer shows fast-moving particles against a relatively static BG. The value range of the BG is pretty stable, but the values for the fast-moving particles may be changing over a large range very quickly. Assume that the BG has a value range in the Z buffer of 0.1 to 0.5 and that doesn’t change much frame to frame. Depending on its motion vector, a particle may have a range of near zero to near 1 over just a few frames. Many particles moving quickly contribute to the fast-changing range of values in the Z buffer, so at any time the max range for the scene may change from, say, 0.1 - 0.7 to 0 - 1.0 to 0 - 0.5 from one frame to the next. The Normalize node takes all this changing data and makes it fit in the range of 0 to 1.0, so some values (such as the relatively stable BG) are being shifted along with all the fast-moving data. Since this is being used to modulate a color property, you get the frame-to-frame color shifts you see.

I’ve used values for the Z-buffer that are between 1 & 0 for this example, but in actuality I think they have a much larger range, which will only make the problem worse.
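The mechanism above can be sketched in a few lines of Python. This is just an illustration with made-up depth values, assuming the Normalize node stretches each frame’s own min–max range to 0…1:

```python
def normalize(values):
    """Per-frame normalize, like the compositor's Normalize node:
    stretch this frame's own min..max range to 0..1."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]

# One background pixel with a stable Z depth of 0.3, plus fast-moving
# particles whose depth range changes between frames (hypothetical values).
frame_a = [0.1, 0.3, 0.5, 0.7]   # particles reach out to 0.7
frame_b = [0.0, 0.3, 0.5, 1.0]   # next frame: particles span 0..1

bg_a = normalize(frame_a)[1]  # ~0.333: BG pixel after normalizing frame A
bg_b = normalize(frame_b)[1]  # 0.3: same pixel after normalizing frame B
# The "static" background value jumps between frames even though its
# actual depth never changed, hence the brightness shift.
```

The background pixel’s normalized value changes from one frame to the next purely because the particles changed the frame’s overall range.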

Hope this makes some sense.

One thing you might try instead of the Normalize node is to use some Math nodes (the Minimum and Maximum functions) to clamp the values rather than normalize them. This can prevent values > 1.0 and < 0, but otherwise does not change the entire range of values from the Z buffer the way the Normalize node can.
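A rough sketch of what chaining those Math nodes does per pixel (the Python here is only an illustration of the idea, not Blender code):

```python
def clamp(v, lo=0.0, hi=1.0):
    """Equivalent of a Maximum node followed by a Minimum node:
    min(max(v, 0), 1). Out-of-range values are pinned; in-range
    values pass through untouched."""
    return min(max(v, lo), hi)

# Hypothetical per-pixel Z values, some outside 0..1:
z_values = [-0.2, 0.3, 0.8, 1.5]
clamped = [clamp(v) for v in z_values]  # [0.0, 0.3, 0.8, 1.0]
# Unlike Normalize, the in-range values 0.3 and 0.8 are unchanged,
# so they stay stable from frame to frame regardless of what the
# fast-moving particles are doing.
```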

Many thanks, I think I understand the problem. I’ll probably go with a Map To node since it’s more predictable than Normalize.

Yep, I’ve seen Map To used to bring the Z-buffer values into a more manageable range. That’s probably a better way to go, depending on what you’re trying to do.
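The reason a fixed remap is more predictable: it applies an offset and scale that you choose once, so the mapping never depends on the contents of the frame. A sketch of that idea, with hypothetical near/far distances in Blender units:

```python
def remap(v, offset=0.0, size=1.0, vmin=0.0, vmax=1.0):
    """Fixed linear remap in the spirit of a map-value setup:
    out = (v + offset) * size, clamped to [vmin, vmax].
    The constants are the same every frame, so stable depths
    always map to the same output."""
    out = (v + offset) * size
    return min(max(out, vmin), vmax)

# Remap camera depths of roughly 2..12 units into 0..1:
# offset = -near, size = 1 / (far - near). Same formula every frame.
near, far = 2.0, 12.0
depths = [2.0, 7.0, 12.0, 20.0]
mapped = [remap(d, offset=-near, size=1.0 / (far - near)) for d in depths]
# [0.0, 0.5, 1.0, 1.0] -- anything past "far" is simply clamped
```

With a moving camera you may still need to animate the offset/size by hand, which is the trade-off versus Normalize doing it automatically (and unpredictably).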

The Normalize node is very predictable as long as you understand how it actually works, but it’s the nature of the input values that has to be considered very carefully. Otherwise it will do exactly what it’s supposed to, but not what you want it to do :wink: Dumb software, should be able to read minds, right? :smiley:


Dumb software, should be able to read minds, right?

lol, sometimes I wish it did