Blender's camera focal length / sensor size values inaccurate?

One of the things I do at my job is creating illusions. I use Blender a lot to R&D the designs/builds.

My next project’s scope is pretty big, so I measured out the physical space we’ll be building in and created a virtual representation of it. The intention here, of course, is to be able to precisely work out the placement of the camera and rely on that to design what we need to build.

The camera I’m using is a Canon 5D Mark IV. It’s got a full-frame sensor (36 × 24 mm), but apparently it has a crop factor of 1.74× when shooting in 4K. The 4K cropped sensor size doesn’t seem to be listed anywhere, so I looked into figuring it out.

If I’ve understood correctly, to get the 4K sensor size, I need to divide the diagonal of the full-frame sensor (√(36² + 24²) ≈ 43.27 mm) by the crop factor:

43.27 / 1.74 ≈ 24.8678

I punched in the resulting value as the sensor size, and set the focal length to match the lens on the camera (16mm).
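(For reference, the same two values can be set from Blender’s Python console; a minimal sketch, assuming the default camera data-block name:)

```python
import bpy

cam = bpy.data.cameras["Camera"]  # default camera data-block name; yours may differ
cam.sensor_fit = 'HORIZONTAL'     # make the view angle follow the sensor width
cam.sensor_width = 24.8678        # the “sensor size” worked out above, in mm
cam.lens = 16.0                   # the lens’s focal length, in mm
```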

But this is where I run into the issue.

What Blender’s camera sees and what the physical camera sees are different. Blender’s view is significantly wider. I tried manually dialing down the sensor size, and it appears to match the real camera at 20mm.

Is my math wrong? Or is Blender’s representation of camera values not reliably accurate?

Your result is a diagonal, right? And AFAIK, you can only input either width or height for the sensor. Does 20mm (width) really look like a match, or would 21.996643 match better?
Given the aspect ratio (from resolution) and the hypotenuse (diagonal) you can calculate width and height.


Ah, nice catch! Yes, my math would’ve figured out the diagonal of the cropped sensor, so I’d have to calculate the width from that number and the aspect ratio (which I’ve tried to do, without success).

Just out of curiosity, do you know what the equation would be to solve for the width given the aspect ratio and diagonal?

20mm is still much closer than 21.996643, though (the latter value still looks too wide).

A co-worker suggested solving for the width by putting the unknown over the known width (x/36mm) and setting that equal to the ratio of the known diagonals:

x/36 = 24.8678/43.27
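Here it is as a quick Python check (same numbers as above):

```python
full_width = 36.0     # full-frame sensor width, mm
full_diag = 43.27     # full-frame diagonal, mm
crop_diag = 24.8678   # 4K diagonal from earlier (43.27 / 1.74), mm

x = full_width * crop_diag / full_diag  # solve x/36 = 24.8678/43.27
print(x)  # ~20.6896 mm
```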

This comes out to 20.6896, which is much closer, but still a bit wider than the shot in camera. I’m thinking of doing one more check on the placement of the virtual camera versus the physical camera to see if my placement was off.

Well, my thinking was this:

  • d² = w² + h²
  • A = w/h
  • h = w/A
  • d² = w² + (w/A)² = w² (1 + 1/A²)
  • w = sqrt(d²/(1 + 1/A²))

Where A is aspect ratio (horizontal resolution divided by vertical), d is diagonal, w and h are width and height. I took resolution to be 4096 by 2160 (to arrive at the width of 21.996643), not sure if these are accurate numbers for your camera.
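In Python, that works out like this (a quick sketch, assuming 4096 × 2160 is the right 4K resolution):

```python
import math

d = 24.8678      # cropped diagonal, mm
A = 4096 / 2160  # aspect ratio: horizontal / vertical resolution

w = math.sqrt(d**2 / (1 + 1 / A**2))  # width, from the last step above
h = w / A                             # height, from h = w/A
print(w, h)  # ~21.9966 mm by ~11.5998 mm
```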

I think using the width for the proportion would only work if the triangles are similar, which they aren’t necessarily.

Haha, thanks for writing that out. I do have to say you lost me in the last two steps in terms of how you got there. Also, unless I’m mistaken, they don’t seem to work (I tried plugging in values and got different results), but I’m not a mathematician.

Anyways, the proportion method does seem sound to me. With the same aspect ratio, the resulting triangles should always be proportional.

I wonder if the small discrepancy might be due to the fact that the real room (which is a built set) and camera setup is actually not perfect like I have it in Blender.

Yes, if the aspect ratio is the same for the full frame and the crop, then the triangles are similar and the proportion should hold.
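In this case, though, the two aspect ratios may well differ; here’s a quick check (a sketch, assuming the 4K crop really is 4096 × 2160):

```python
# Compare the two aspect ratios in play (assumed values from this thread):
full_frame_ratio = 36 / 24  # 3:2 stills sensor  -> 1.5
crop_ratio = 4096 / 2160    # DCI 4K video frame -> ~1.896

print(full_frame_ratio, crop_ratio)
# 1.5 vs ~1.896: not similar triangles, which would explain why the
# proportion method (~20.69 mm) and the diagonal math (~22.00 mm) disagree.
```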
I wonder, though, if the discrepancy is coming from the fact that you’re inputting the actual camera’s focal length. Given that you’re setting the sensor size to the cropped value, shouldn’t you be inputting the effective focal length?

Also, a 3D camera is always ‘picture perfect’; with a real camera/lens there will always be some distortion going on, which could be just enough to throw off the match.
I’ve seen this happen multiple times when tracking video footage.

Just a thought on this. :wink:


Right, I’m inputting the actual focal length. I wasn’t aware I needed to recalculate the focal length as well. Do you know how I would get the effective focal length?

Oh, I’m more hypothesizing here. If I’m not mistaken, the effective focal length would simply be the focal length times the crop factor. My reasoning is that since there’s no cropping going on in Blender and you’re emulating it by adjusting the sensor size itself, the focal length should be adjusted as well (instead of being implicitly increased by the cropping).
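One way to sanity-check this: an ideal pinhole camera’s horizontal field of view is 2·atan(sensor_width / (2·focal_length)), so scaling the focal length up by the crop factor (keeping the 36 mm sensor) should give the same angle as scaling the sensor width down (keeping the 16 mm lens). A quick sketch, assuming the 1.74× factor applies to the sensor width:

```python
import math

def hfov_deg(sensor_width_mm, focal_length_mm):
    # Horizontal field of view of an ideal pinhole camera, in degrees
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

print(hfov_deg(36.0, 16.0 * 1.74))  # full sensor + effective focal length -> ~65.8°
print(hfov_deg(36.0 / 1.74, 16.0))  # cropped width + real focal length   -> ~65.8°
```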


Holy smokes, you know, I think that’s it!

I multiplied the focal length (16mm) by the crop factor, punched in the resulting 27.84 as the new focal length, set the sensor size back to 36mm, and the result is probably the closest approximation to what I see in real life.

I don’t understand why this works when calculating the cropped sensor size didn’t. Maybe it means the math I did to arrive at the cropped sensor size is wrong.

Anyhow, thanks for the suggestion. Provided the math checks out, I think you helped solve it!


:partying_face: :partying_face: :partying_face: :partying_face: :partying_face: :partying_face: :partying_face: :partying_face: