Rendering a Z-Depth Render Layer

Hello guys, I hope it's not a stupid question, but how can I render a Z-depth render layer? If I enable "Z", it's just white.
I would like to render my picture with fog, but it takes a long time to render, so I would like to try doing it with the Z-depth instead.

Is only the Z-depth white, or is the whole picture white?

No, the picture renders fine, but the "Z" is white.

That is normal; a Z-depth image contains floating-point values that you can't see directly. If you want to visualize it, add a Normalize node and feed it into a Viewer.

I'm going to test it. You mean
Z -> Normalize -> Viewer?

Z -> Normalize -> Viewer
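
If you ever want to set that up from a script, here's a minimal Python (bpy) sketch of the same Z -> Normalize -> Viewer chain. The depth socket name and the pass toggle differ a little between Blender versions, so treat it as an example under those assumptions:

```python
import bpy

scene = bpy.context.scene
scene.use_nodes = True                      # enable the compositor node tree
tree = scene.node_tree

# Make sure the Z (depth) pass is actually rendered (2.8+; older versions
# expose this toggle on the render layer instead of the view layer).
bpy.context.view_layer.use_pass_z = True

rl = tree.nodes.get("Render Layers") or tree.nodes.new("CompositorNodeRLayers")
depth_out = rl.outputs.get("Depth") or rl.outputs.get("Z")   # socket name varies by version

normalize = tree.nodes.new("CompositorNodeNormalize")        # rescales the depth values to 0..1
viewer = tree.nodes.new("CompositorNodeViewer")

tree.links.new(depth_out, normalize.inputs[0])
tree.links.new(normalize.outputs[0], viewer.inputs["Image"])
```

With that in place, the Viewer/backdrop shows a visible greyscale depth map instead of plain white.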

Perfect! Thanks, mate.

Sorry, but how can I save/export it?

There is a File Output node; try that. You can also attach it to the Composite node and just render.


If I hit F12, Blender saves only the Z-depth to my folder.

Sorry, it's the first time I've rendered something in Blender with render layers.

Select the File Output node and open the N-panel. You should see an option for adding extra image sockets:
[screenshot: File Output node settings in the N-panel]

Just hook the image output into the new socket.
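
For completeness, the same thing as a script: a sketch that adds a File Output node and an extra socket, which is what the "Add Input" button in the N-panel does. The folder path and slot name here are just placeholders:

```python
import bpy

tree = bpy.context.scene.node_tree
rl = tree.nodes["Render Layers"]
normalize = tree.nodes["Normalize"]          # the Normalize node from the earlier step

file_out = tree.nodes.new("CompositorNodeOutputFile")
file_out.base_path = "//render_output/"      # placeholder folder, relative to the .blend file

# The node starts with one "Image" input; file_slots.new() adds another socket,
# just like the "Add Input" button in the N-panel.
file_out.file_slots.new("Depth")

tree.links.new(rl.outputs["Image"], file_out.inputs["Image"])
tree.links.new(normalize.outputs[0], file_out.inputs["Depth"])
```

Every connected socket then gets written out to that folder when you hit F12.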

That, and you could just hook the normalized depth pass to the Z socket of the Composite node, then use the Image Editor to save the final render in an image format that supports it, like OpenEXR.

Edit: I just saw your other thread, and realized that it won’t save the normalized z-depth when you save using the image editor.

You'll have to make sure the File Output node is set to multilayer OpenEXR and create a socket on it for each pass you want. If you plug in all your passes (including your normalized depth pass), you should get a single OpenEXR file that contains them all.
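
In script form that's just a format switch on the same node; a sketch, assuming the node and slot names used above:

```python
import bpy

tree = bpy.context.scene.node_tree
file_out = tree.nodes["File Output"]                   # the File Output node from before

# Write a single multilayer .exr containing every connected pass
file_out.format.file_format = 'OPEN_EXR_MULTILAYER'
file_out.format.color_depth = '32'                     # keep full float precision

# In multilayer mode, each input socket becomes a named layer in the file
file_out.layer_slots.new("NormalizedDepth")            # placeholder layer name
```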


“Z-Depth” is actually a numeric data channel consisting of a floating-point number that describes distance from the lens. It isn’t intended to be directly visualized, although of course you can set up a node-network that will do that.

Commonly, Z-Depth would be used to feed a “blur” filter – probably using a Curves node to allow you to precisely control how the blur takes place as a function of distance. (It might also be “piped” to control other things, such as Hue/Saturation, and/or to inject a color-cast gradient.)
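
As one concrete variant of that idea (and of the fog the original poster was after), here is a hedged sketch that remaps the normalized depth through an RGB Curves node and uses it as the factor of a Mix node with a fog colour. The curve shape and fog colour are assumptions you would tune yourself:

```python
import bpy

scene = bpy.context.scene
scene.use_nodes = True
tree = scene.node_tree
links = tree.links

rl = tree.nodes.get("Render Layers") or tree.nodes.new("CompositorNodeRLayers")
depth_out = rl.outputs.get("Depth") or rl.outputs.get("Z")     # version-dependent socket name

normalize = tree.nodes.new("CompositorNodeNormalize")
curves = tree.nodes.new("CompositorNodeCurveRGB")              # shapes fog density vs. distance
mix = tree.nodes.new("CompositorNodeMixRGB")                   # blends the render toward a fog colour
mix.blend_type = 'MIX'
mix.inputs[2].default_value = (0.75, 0.80, 0.85, 1.0)          # assumed fog colour

composite = tree.nodes.get("Composite") or tree.nodes.new("CompositorNodeComposite")

links.new(depth_out, normalize.inputs[0])
links.new(normalize.outputs[0], curves.inputs["Image"])
links.new(curves.outputs["Image"], mix.inputs[0])              # remapped depth drives the mix factor
links.new(rl.outputs["Image"], mix.inputs[1])
links.new(mix.outputs["Image"], composite.inputs["Image"])
```

Because Normalize maps the nearest depth to 0 and the farthest to 1, distant pixels pull the image toward the fog colour, which is far cheaper than re-rendering the scene with actual volumetric fog.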