One of the things I do at my job is creating illusions. I use Blender a lot to R&D the designs/builds.
My next project’s scope is pretty big, so I measured out the physical space we’ll be building in and created a virtual representation of it. The intention, of course, is to work out the camera placement precisely and rely on that to design what we need to build.
The camera I’m using is a Canon 5D Mark IV. It has a full-frame sensor (36 x 24 mm), but apparently a crop factor of 1.74x when shooting in 4K. The 4K cropped sensor size doesn’t seem to be listed anywhere, so I tried to work it out myself.
If I’ve understood this correctly, to get the 4K sensor size I need to divide the full-frame sensor’s diagonal (43.27 mm) by the crop factor:
43.27 mm / 1.74 ≈ 24.8678 mm
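
For reference, here’s that arithmetic as a quick Python sketch (the 1.74x figure is just what I’ve read online, not something confirmed by Canon):

```python
import math

# Full-frame sensor dimensions of the 5D Mark IV (mm)
full_width, full_height = 36.0, 24.0
full_diagonal = math.hypot(full_width, full_height)  # ~43.27 mm

# Crop factor quoted for 4K recording (unverified)
crop_factor = 1.74

# My assumption: the 4K "sensor size" is the full-frame diagonal / crop factor
cropped_diagonal = full_diagonal / crop_factor
print(f"full diagonal:    {full_diagonal:.2f} mm")
print(f"cropped diagonal: {cropped_diagonal:.4f} mm")  # ~24.87 mm
```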
I punched the resulting value in as the sensor size in Blender and set the focal length to match the lens on the camera (16 mm).
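
In case it matters, this is effectively what I’m setting on the Blender camera (a minimal bpy sketch, assuming the camera data block still has the default name "Camera"):

```python
import bpy

# Assuming the camera data block is named "Camera" (the default)
cam = bpy.data.cameras["Camera"]

cam.sensor_fit = 'AUTO'      # default fit mode; I haven't changed this
cam.sensor_width = 24.8678   # the diagonal-based value from above (mm)
cam.lens = 16.0              # focal length of the physical lens (mm)
```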
But this is where I run into a problem.
What Blender’s camera sees and what the physical camera sees don’t match: Blender’s view is significantly wider. If I manually dial the sensor size down, it appears to match the real camera at around 20 mm.
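
Just to quantify the difference, here’s a rough check of the horizontal field of view each sensor width would give at 16 mm, using the standard 2·atan(w / 2f) formula (my own sanity check, not anything pulled from Blender’s docs):

```python
import math

def horizontal_fov_deg(sensor_width_mm, focal_length_mm):
    """Standard horizontal field-of-view formula: 2 * atan(width / (2 * focal length))."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

focal_length = 16.0  # mm, the lens on the camera

# My diagonal/crop-factor value vs. the width that visually matches the real camera
for width in (24.8678, 20.0):
    print(f"sensor width {width:>7.4f} mm -> FOV {horizontal_fov_deg(width, focal_length):.1f} deg")
```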
Is my math wrong? Or is Blender’s representation of camera values not reliably accurate?