Blender lens to Nikon FOV?

Hi all,
I just wanted to know if someone here knows how to convert the Blender lens/FOV units to match a photo shot with a Nikon reflex camera, for an architectural rendering?

Exactly matching an optical camera’s results would require that the 3D app use a lens simulation rather than just calculating a nominal “focal length” from a specified angle of view (sometimes called field of view or FOV). This means that the Blender camera and an optical camera may not agree even if each uses a lens of identical “focal length” in mm.

I’ve found it useful in some instances to construct a simple mockup of a real-world scene (if you have enough info to do this somewhat accurately), then compare the perspective effects of one virtual camera setting or another until things match fairly well. Since using a longer “focal length” flattens perspective and a shorter lens exaggerates perspective, you can usually narrow the value down to an acceptable range pretty quickly. A lot depends also on how absolutely accurate you need to be in the match.

Thanks, I will also try to use the Icarus camera calibration to compare all this.

Icarus Calibration can give you an estimate of the “taking” camera’s focal length (FL), but you already know that, don’t you?

The Reconstruction module might be helpful in prepping a model for Blender to use when setting up your match scene.

Here’s the result of a test I did using the Icarus Calibration and Reconstruction modules:


I used a photo I’d taken to test the operation of a plug-in I wrote a few years back for Pixels 3D (pre-OSX and OpenGL) to allow using lens settings in mm rather than the FOV parameter the app used. The photo (inset at lower right) was taken with a 35mm SLR using a 55mm lens (pretty standard combination).

I used the perspective line procedure (Single-image Calibration) in Icarus to set up the virtual coordinate system – it took a few tries to get it accurate, since the useful image area scanned from the photo isn’t that big – then used the Icarus primitives (Reconstruction) to build a matched model of the objects in the photo. This was exported as a .lwo file.

After importing the model into Blender I was surprised to find that no camera location was exported – just the meshes – but it only took a short time to set up a camera and get it into position. I used the original photo image as a BG to align the camera.

Note that to match the perspective of the photo image taken with a 55mm lens on 35mm film (image aspect ratio = 1:1.5) I had to use a 44mm “Lens” spec in Blender. I can’t say why there’s a disparity, because I don’t know how Blender’s “Lens” settings are calculated.

It was interesting that Icarus choked a bit on the perspective of my photo – because I intentionally set it up with the ground plane grid having single-point perspective (or as close as I could manage), the main perspective lines for the X axis don’t converge. This seemed to result in Icarus calculating a focal length in the millions of mm (!), but they do note in the docs that it’s best to use lines which are definitely convergent whenever possible. The virtual coordinate system and Reconstruction were still pretty accurate, though.

Thanks a lot for the answer !

Yes, I know the FL at which the photos were shot, but since they were taken with a digital reflex, the size of the sensor is smaller than Kodak 35mm film…

So we had to match the shots by hand!

If I could find out the FOV of the reflex camera, or even what the sensor size (width) is, it would be much easier…

I made an attempt to calculate the difference between the Nikon and Blender focal lengths a while ago.
nikon * x = blender
There are two problems:

  1. The digital sensor of the reflex camera isn’t 35 mm. 35 mm film had become a sort of standard for reflex cameras, but the Nikon digital ones use a smaller one. The FL for a Nikon D100 (my dad has that one) is multiplied by 1.5 to get the same FL as on a 35 mm camera. So if you shoot with, for example, a 50 mm objective, the true length compared to a 35 mm film reflex camera is 75 mm.
  2. Blender is neither of those!!! :eek:

I finally came up with this:
-If the pic was shot on 35 mm film or a sensor of the same size, then the Blender FL is the real FL multiplied by 0.914.
ex.
35 mm FL = 50 mm
50 * 0.914 = 45.7 in Blender

-So if it was shot with a digital Nikon (I think they all have the same sensor size), multiply the FL by 1.371, instead of doing FL * 1.5 * 0.914 in two steps. The 1.371 already includes the *1.5 crop factor.
ex.
FL on a digital Nikon reflex camera = 50 mm
50 * 1.371 = 68.55 in Blender
(50 * 1.5 * 0.914 = 68.55)

The best way would be to know the angle in degrees – that can’t be wrong.
Hope it will be useful and that it’s correct; I haven’t had time to try it very many times yet. :cool:

The numbers to remember: 0.914 and 1.371
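
For convenience, here’s a minimal Python sketch applying those two factors. Note that 0.914 and 1.371 are the empirical numbers from the post above, not official Blender constants, so treat the results as approximations:

```python
# Rough conversion from a real-world focal length to a Blender "Lens" value,
# using the empirical factors quoted above (not official Blender constants).

FULL_FRAME_FACTOR = 0.914  # 35 mm film (or same-size sensor) -> Blender "Lens"
NIKON_DX_FACTOR = 1.371    # 1.5 crop factor * 0.914, for Nikon DX-size sensors

def blender_lens_from_35mm(fl_mm):
    """Focal length used on 35 mm film -> approximate Blender 'Lens' value."""
    return fl_mm * FULL_FRAME_FACTOR

def blender_lens_from_nikon_dx(fl_mm):
    """Focal length on a Nikon DX digital body -> approximate Blender 'Lens' value."""
    return fl_mm * NIKON_DX_FACTOR

print(blender_lens_from_35mm(50))      # ~45.7
print(blender_lens_from_nikon_dx(50))  # ~68.55
```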

I was told that the ratio between the SLR and DSLR standards should be approximately 1.41…

But it seems hard to find information on camera sensors and DSLR standards, especially for calculating the FOV, which I think is the most useful parameter!

Thanks a lot Mazz !

Here’s a link to some info on various digital camera sensors:
http://homepages.tig.com.au/~parsog/photo/sensors1.html

Also, it wasn’t hard to look up tech info on Nikon digitals; there are pages of it on their site, model by model. delic, you didn’t say what type of Nikon, so you’ll have to look it up. The info I found was for the D2Xs and the D200, D80, and D40 models, including the sensor dimensions in mm.

I’ve also been crunching some numbers to try to determine how Blender correlates its FOV settings (i.e., “Lens” setting in degrees, “D” button) and the focal length (FL) in mm (“Lens” setting in mm). Using a chart I made to test some algos to determine an FOV to match a lens FL in another 3D app, I came across a situation that’s hard to resolve.

I set up a scene whereby I could calculate an FOV based on the target lens FL and a target aperture dimension (the standard method of determining FOV in optical systems), then test that FOV based on how it renders a specific target object. The calculated FOV worked perfectly, but Blender’s FL in mm did not match the FL of the lens I was calculating with. Specifically:

A 50mm lens imaging to a 35mm 1:1.33 motion picture aperture (dimensions = 24.873 x 18.655 mm) would require an FOV setting of 27.974 deg to work successfully in my test setup, which mimics an optical lens system with 1:1 magnification, basically an ideal thin lens simulation. Plugging this FOV value into Blender, the test renders exactly as expected. However, this FOV value is apparently equivalent to a 64.32mm lens setting in Blender, not 50mm. Try as I might, I couldn’t reconcile this difference.

I tested other aperture values/image aspect ratios (motion picture film specs, btw) and their corresponding FOVs for a 50mm lens, and got exactly the same results – the renders are fine, but not the correlated Blender FL in mm.
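
For anyone following along, here’s a small Python sketch of the standard infinity-focus angle-of-view formula. The figures in the posts above include a correction for 1:1 magnification, so they won’t match this simple version exactly:

```python
import math

def horizontal_fov_deg(focal_length_mm, aperture_width_mm):
    """Standard thin-lens angle of view with the lens focused at infinity:
    FOV = 2 * atan(width / (2 * f))."""
    return math.degrees(2.0 * math.atan(aperture_width_mm / (2.0 * focal_length_mm)))

# 50 mm lens imaging to the 24.873 mm wide 1:1.33 motion-picture aperture quoted above
print(horizontal_fov_deg(50.0, 24.873))  # ~27.9 degrees at infinity focus
```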

Does anyone know the math Blender uses to calculate a FL in mm from a given FOV in degrees? I’d like to clear this up, because I’m trying to figure out a system whereby a virtual lens’ FL can be expressed in pixels rather than mm, which might simplify matching FOVs from one digital rendering system to another. But to do this and test it properly, I need to know more about how Blender’s camera FL (in mm) settings are calculated, since they don’t seem to match my expectations.
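
One hypothesis worth testing – just a guess, not something confirmed from Blender’s code here – is that the “Lens” value in mm assumes a fixed virtual aperture width rather than your actual film or sensor width. Working backwards from the pair of numbers reported above gives a suspiciously round figure:

```python
import math

# If Blender converted between "Lens" (mm) and FOV (degrees) using a fixed
# virtual aperture width, that width could be recovered from one known pair.
# Observed pair from the test above: 27.974 degrees <-> 64.32 mm.
fov_deg = 27.974
lens_mm = 64.32
implied_width_mm = 2.0 * lens_mm * math.tan(math.radians(fov_deg / 2.0))
print(implied_width_mm)  # ~32 mm - a fixed 32 mm virtual aperture would explain the pairing
```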

Wow, this is unusual – Blender is such a great application, but it seems that a fundamental aspect of its functionality, the camera, is not very well documented at all.

After a great deal of searching in various blender community resources (code base, forums, etc.), I’m coming to the conclusion that no one really knows how the “Lens” setting for Blender’s camera is calculated. A number of forum posts indicate that it is known to be at best an inaccurate emulation of a real-world lens focal length.

I can say that from the tests I’ve made, it does not seem to correspond well to any real-world parameter of a lens, such as focal length in mm. The “Degrees” option (essentially FOV, aka field of view or angle of view) is very useful and tests as very accurate, but the corresponding “Lens” setting in (ostensibly) mm doesn’t match any combination of focal length, image aperture, and magnification that I can determine. So if you’re trying to match a real-world camera image with a Blender scene, don’t bother using the “Lens” setting’s default option; instead, calculate the required FOV and enter it in the “Degrees” option – it matches real-world parameters much more closely and will give predictable results.

Of course this requires knowing how to calculate the proper FOV to use in Blender, which is a matter of fairly straightforward math (a little algebra and trig), but it also requires knowing such things as the dimensions of the physical image-recording area of your camera, which, as has been noted, is not as easy for digital sensors as it is for film gauges, which are very well-documented. Digital sensor dimensions are also not as accurate, because the entire area of the sensor is often not used (for example, some Nikon Ds have three different pixel resolutions that can be chosen, resulting in slightly different “image areas” on the sensor that would take detailed knowledge of the sensor architecture to figure out).

So I’m looking into using the math I developed for calculating virtual camera FOVs to match real-world lenses used with optical film gauges, to generate a useful process for digital cameras, using pixels as the basic unit, rather than a physical unit such as mm.
The idea is to see if a “virtual FOV” can be used instead of a “physical FOV.” Since I currently don’t have a quality digital camera, if anyone would be interested in helping out by shooting some test shots as the project progresses, it’d be very helpful. The object is to make Blender much easier to integrate into a workflow that includes both film and digital imagery from many sources.
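
To illustrate the pixel-based idea, here’s a minimal sketch. The function names and the example sensor width (23.7 mm, roughly a Nikon DX chip) are assumptions for illustration, not measured values:

```python
import math

def focal_length_px(focal_length_mm, image_width_px, sensor_width_mm):
    """Express a lens focal length in pixels for a given capture resolution."""
    return focal_length_mm * image_width_px / sensor_width_mm

def hfov_deg_from_pixels(image_width_px, focal_px):
    """Horizontal FOV computed entirely in pixel units; equivalent to the
    mm-based formula, since the mm terms cancel out."""
    return math.degrees(2.0 * math.atan(image_width_px / (2.0 * focal_px)))

# Hypothetical example: a 3000 px wide image from a sensor assumed to be 23.7 mm wide
f_px = focal_length_px(50.0, 3000, 23.7)
print(hfov_deg_from_pixels(3000, f_px))  # same angle as 2 * atan(23.7 / (2 * 50))
```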

With this link we could match your values …

http://www.panavision.co.nz/main/kbase/reference/calcFOVform.asp

The link didn’t work for me for some reason. Is it some form of FOV calculator?

I dug into Blender’s code base and came up with some more info – it seems that the default “Lens” option is not actually a simulation of a real-world lens focal length in mm. Just what it is, no one really seems to know for sure :rolleyes:

However, the “Degrees” option in the “Lens” setting is a standard FOV setting, and that’s fairly easy to calculate for any real-world lens given some basic parameters. Currently I’m in contact with a Blender scripter who wrote a camera-matching script a while back that was more accurate than the Blender code, but still not as complete as it could be. I’ll be providing him with my equations and some other info, with the goals of producing a script to automate some aspects of scene matching in Blender, and an online “manual” with tips for making real-world and Blender-world scene matching easier and more accurate. This includes shots done with digital cameras, which are problematic because of the nature of their sensor chips.

Can’t say when all this will be done (I have a first draft ready now, but need to do some testing and web-page construction), but it’s my priority project these days.

If you’d care to help in the project, it’d be great to have a number of digital photographers/Blender artists trying out the system and helping confirm its accuracy.

Sorry for the wrong link, it’s been edited and is correct now.
Yeah, it’s an FOV calculator, but here’s another one in which you can enter the sensor size you want:

http://www.frankvanderpol.nl/fov_pan_calc.htm

Yes, I’m interested in testing that. The photographer is too…

I checked the calculator again, it’s working now, thanks.

A couple of things about how it works:

  1. It uses the math for calculating FOV based on a lens focused at infinity. This isn’t always the case, of course, so it won’t be as accurate as it could be in all cases. For example, close-up shots would be particularly prone to error.

  2. It reports HFOV and VFOV based on a portrait orientation, which is not often used for motion picture photography, so users will have to switch those figures. Not an error but perhaps a bit confusing.

  3. It uses sensor size (probably from published specifications) for the “film width” of digital cameras, which may not be all that accurate a description. There’s no data I’ve found that states that the entire sensor surface is used for imaging (in fact, there’s some evidence to the contrary), or that the dimensions given are for the actual image-forming area of the entire chip. This could introduce error in the FOV calculation.

My approach is to eliminate as much error as possible in the FOV calculations by adjusting for focus at other than infinity, and by determining the physical width of a sensor’s image area based on the images it actually forms rather than assumptions about how a chip is made and functions in a camera system (only the engineers would know that in detail).
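
For reference, here’s a sketch of the textbook thin-lens adjustment for focus at a finite distance. It’s the standard approximation, not necessarily the exact method described above:

```python
import math

def fov_deg(aperture_width_mm, focal_length_mm, focus_distance_mm=None):
    """Angle of view from a thin-lens model. With no focus distance given,
    assume focus at infinity; otherwise the effective image distance grows
    to f * (1 + m), which narrows the angle."""
    if focus_distance_mm is None:
        image_distance = focal_length_mm
    else:
        magnification = focal_length_mm / (focus_distance_mm - focal_length_mm)
        image_distance = focal_length_mm * (1.0 + magnification)
    return math.degrees(2.0 * math.atan(aperture_width_mm / (2.0 * image_distance)))

print(fov_deg(24.873, 50.0))          # focused at infinity
print(fov_deg(24.873, 50.0, 1000.0))  # focused at 1 m: a noticeably narrower angle
```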

Of course, since most of my approach is mathematical, I’d want a battery of tests run to confirm it in the real world.