Lately I'm building a composition in Fusion in which I want to combine Blender Cycles passes with Terragen passes, mainly the data passes: normal, world position, Z, vectors. Most of them are easy to match (it's usually just inverting the blue or red channel), but the main problem is Z depth.
Now, Z depth in Cycles is distance-based: each value is the radial distance from the camera, so the depth values curve around the camera.
In Terragen it is planar: depth is measured straight along the camera axis, on planes parallel to the sensor.
The mist passes in both Eevee and Cycles are distance-based.
In Eevee, the Z pass is planar, to my surprise.
And the most confusing part: building an override material to get a planar Z out of Cycles requires a Camera Data node, whose View Z Depth output provides the planar depth.
Anyway, the real problem is that I can't re-render 1200 frames just to get a planar Z depth.
I managed to find a workaround: I rendered a sphere centered on the same camera in Eevee, divided everything by the max Z value, then multiplied the result back with the Z pass from Cycles to straighten the curve. The result is good, but when you zoom in to pixel level you see artifacts here and there.
Is there a more precise method to convert distance-based Z depth to planar, based on the camera sensor/lens?
It should be rather easy to do with some simple math, and since it is done per pixel, there should be no artifacts. Unit circle calculations will help here: https://en.m.wikipedia.org/wiki/Trigonometric_functions
First you must calculate the angle between each pixel's view ray and the straight-ahead direction, then take the cosine of that angle (the ratio of sin to tan) and multiply the Z depth by it to project the radial distance onto the straight depth plane. Or something like this…
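The per-pixel idea above can be sketched with NumPy for a simple pinhole camera. This is an illustrative sketch, not anything from Blender's or Fusion's API: the function name and parameters (focal length and sensor width in mm, image resolution) are my own, and it assumes square pixels and a centered sensor. Each pixel's angle off the optical axis gives cos(θ) = focal / ray length, and multiplying the radial depth by cos(θ) yields planar depth:

```python
import numpy as np

def radial_to_planar_depth(z, focal_mm, sensor_w_mm, width, height):
    """Convert radial Z (distance from the camera origin, Cycles-style)
    to planar Z (distance along the view axis, Terragen/Eevee-style).

    Assumes an ideal pinhole camera with square pixels and the
    optical axis through the image center. `z` is a (height, width)
    depth array; focal length and sensor width are in millimeters.
    """
    # Sensor height follows from the image aspect ratio (square pixels).
    sensor_h_mm = sensor_w_mm * height / width

    # Pixel-center positions on the sensor plane, in mm, centered on the axis.
    xs = ((np.arange(width) + 0.5) / width - 0.5) * sensor_w_mm
    ys = ((np.arange(height) + 0.5) / height - 0.5) * sensor_h_mm
    xx, yy = np.meshgrid(xs, ys)

    # cos(theta) between each pixel's ray and the optical axis:
    # adjacent (focal length) over hypotenuse (full ray length to the sensor).
    cos_theta = focal_mm / np.sqrt(xx**2 + yy**2 + focal_mm**2)

    # Radial distance projected onto the view axis = planar depth.
    return z * cos_theta
```

A quick sanity check: a flat wall at planar depth d has radial depth d / cos(θ) per pixel, so feeding that through the function should give a constant d everywhere, with no sphere-render or normalization step involved.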