The Normalize node can be tricky to use effectively. Remember that it takes the entire range of values passed to it and expands or compresses that range into values from 0 to 1.0, so the values being input are critical in determining the node's output. The Z depth buffer holds data that can change a lot very swiftly, even frame to frame, so the range of values being fed into the Normalize node can also change swiftly.
Example: The Z depth buffer shows fast-moving particles against a relatively static BG. The value range of the BG is pretty stable, but the values for the fast-moving particles may be changing over a large range very quickly. Assume the BG has a value range in the Z buffer of 0.1 to 0.5 that doesn't change much frame to frame. Depending on its motion vector, a particle may sweep from near 0 to near 1 over just a few frames. Many particles moving quickly contribute to the fast-changing range of values in the Z buffer, so the overall range for the scene may jump from, say, 0.1 - 0.7 to 0 - 1.0 to 0 - 0.5 from one frame to the next. The Normalize node takes all this changing data and makes it fit in the range of 0 to 1.0, so some values (such as the relatively stable BG) get shifted along with all the fast-moving data. Since this is being used to modulate a color property, you get the frame-to-frame color shifts you see.
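Here's a quick plain-Python sketch of that effect, using made-up per-frame ranges: the BG pixel's raw Z value never changes, but because the frame's min/max does, its normalized value jumps around.

```python
# Hypothetical frames: a stable BG pixel at raw Z = 0.3, while fast
# particles change each frame's overall min/max (as in the example above).
frames = [
    {"bg": 0.3, "zmin": 0.1, "zmax": 0.7},
    {"bg": 0.3, "zmin": 0.0, "zmax": 1.0},
    {"bg": 0.3, "zmin": 0.0, "zmax": 0.5},
]

def normalize(v, lo, hi):
    # What the Normalize node effectively does: remap [lo, hi] onto [0, 1]
    return (v - lo) / (hi - lo)

for f in frames:
    print(round(normalize(f["bg"], f["zmin"], f["zmax"]), 3))
# The same unchanging BG pixel comes out as 0.333, 0.3, then 0.6 --
# that jump is the flicker when it drives a color property.
```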
I've used Z-buffer values between 0 and 1 for this example, but in actuality I think they have a much larger range, which only makes the problem worse.
Hope this makes some sense.
One thing you might try instead of the Normalize node is a pair of Math nodes (the Minimum and Maximum functions) to clamp the values rather than normalize them. This prevents values > 1.0 and < 0, but otherwise does not remap the entire range of values from the Z buffer the way the Normalize node does.
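For comparison, here's the same kind of sketch for the clamp approach. The function below stands in for chaining a Maximum node (with 0) into a Minimum node (with 1); note the stable in-range value passes through unchanged, so it can't flicker.

```python
def clamp(v, lo=0.0, hi=1.0):
    # Stand-in for Math nodes: Maximum(v, lo) followed by Minimum(result, hi)
    return max(lo, min(hi, v))

for v in (0.3, -0.2, 1.4):
    print(clamp(v))
# In-range values (0.3) are untouched; only out-of-range values
# (-0.2 -> 0.0, 1.4 -> 1.0) are pinned to the limits.
```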