I rendered the Cycles > Camera Data > View Vector output by connecting it to an Emission shader, and I want to understand:
- What exactly do the RGB values represent?
- If they are rotations, wouldn't R and G alone suffice to cover a quarter-sphere (or less)?
- Why is B seemingly inverted (overall, not from the posted sample)?
I was thinking they might be rotations about the Camera origin / pinhole that point toward the sampled location in the scene. However, when I tried rotating an edge by the floats sampled from a corner of the image, the edge didn't quite align with that corner.
The edge starts at the Camera origin and extends through the exact center of the view plane; from there it is rotated about the Camera origin. The values were sampled from a 90 degree FOV (i.e. 16mm) 129x129 pixel render. I tried two bpy methods, each giving slightly different results:
```python
import bpy

# Apply the sampled floats as Euler rotation angles (radians), one axis at a time.
bpy.context.object.rotation_euler[0] = -0.57585
bpy.context.object.rotation_euler[1] = 0.57585
bpy.context.object.rotation_euler[2] = 0.58035
```
```python
import bpy

# Rotate by 1 radian about an axis built from the sampled floats.
bpy.ops.transform.rotate(
    value=1.0,
    axis=(-0.57585, 0.57585, 0.58035),
    constraint_axis=(False, False, False),
    constraint_orientation='GLOBAL',
    mirror=False,
    proportional='DISABLED',
    proportional_edit_falloff='SMOOTH',
    proportional_size=1,
)
```
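As a sanity check on an alternative hypothesis (an assumption on my part, not something I've confirmed in the manual): if View Vector is not a set of rotation angles but the *normalized camera-space direction* from the pinhole to the shading point, then the expected values at a corner pixel of a 90 degree FOV, 129x129 render can be computed with plain Python:

```python
import math

def view_vector(px, py, width=129, height=129, fov_deg=90.0):
    """Hypothetical camera-space ray direction through pixel (px, py)
    for a pinhole camera; assumes View Vector is a unit direction,
    not Euler angles."""
    # Half-width of the image plane at unit distance from the pinhole.
    half = math.tan(math.radians(fov_deg) / 2.0)
    # Pixel-centre coordinates mapped to [-1, 1], scaled by the half-width.
    x = (2.0 * (px + 0.5) / width - 1.0) * half
    y = (1.0 - 2.0 * (py + 0.5) / height) * half
    z = -1.0  # Blender's camera looks down -Z in camera space
    length = math.sqrt(x * x + y * y + z * z)
    return (x / length, y / length, z / length)

# Top-left corner pixel: components come out to roughly
# (-0.57585, 0.57585, -0.58035).
print(view_vector(0, 0))
```

If this hypothesis holds, it reproduces the sampled floats except for the sign of the third component, which would also explain why B looks inverted: a camera-space direction has a negative Z, while the rendered B channel appears positive.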