what settings should i have the camera at so that i can place it at a human height for realistic (people perspective) walkthroughs?
It depends on the scale of the environment and the lens. If you’re using 35mm (the default lens), then maybe your camera should be 2 units above the XY grid (people are about 2 meters tall).
I have the scale of the scene at 1 unit to 1 foot, so my camera is six feet in the air. What should I put the lens at? 35 doesn’t look right to me.
Well, Blender is a European program, so if there were some kind of lens-to-unit relationship, I don’t think it would be based on feet. You’d be better off assuming the 35mm is a human perspective and placing the camera +2 units on the Z axis, which puts it at eye level. The scene would then be scaled accordingly.
I did a rough test of this, and I think 35mm at 2 units is a good place to start. Could be wrong, though.
I would say play around with it until you get good results…
Human vision can’t be quantified with a single standard “lens” value; there are many things that make up vision… you are better off approximating a normal camera lens, imho.
Uh. I thought (I could be wrong) the lens has nothing to do with distance. That number is mm, as in telephoto or wide. Human eyes are, or so I was told, about equal to a 55-60mm lens.
-Laurifer
Yep, correct: use a 50-55mm lens for a correct FOV.
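To see why these focal lengths get quoted as “normal”, here is a minimal sketch of the standard photographic relationship between focal length and horizontal field of view, assuming a full-frame 36mm-wide film back (an assumption; other formats give different angles):

```python
import math

def horizontal_fov(focal_mm, sensor_width_mm=36.0):
    """Horizontal field of view in degrees for a given focal length,
    assuming a 36mm-wide full-frame film back."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_mm)))

for f in (35, 50, 55):
    print(f"{f}mm lens -> {horizontal_fov(f):.1f} degrees horizontal FOV")
```

A longer focal length gives a narrower angle, which is why 35mm feels wider than what people usually call a “human” view.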
Hi !
The most realistic lens for simulating human sight was the HELIOS-44M-4, mounted on the well-known ZENIT photo camera. It is a 58mm.
I have one in my hands right now: you see through the lens exactly the same picture that you see with your other, naked eye!
For this experiment, the camera must be held vertically, so that you are not disturbed by the shift of the pentaprism.
The only difference is that humans have two eyes separated by about 65mm (an average distance called the stereoscopic basis), so the resulting field is slightly wider horizontally. But this detail is negligible.
Philippe.
I doubt that very much; your eye doesn’t see a rectangular “frame” like the camera does. It may be a close approximation, but not “exactly” what you see with the naked eye. Everybody’s vision is not exactly alike, so any lens from 50-60mm will give you a fair approximation of the average eye’s field of view (horizontally, that is; it’s severely cropped in the vertical dimension unless you’re working with a 1:1 aspect ratio).
<edit>The Blender default is actually a good optical match for an average eye. The 50-60 choice is more of a perceptual approach: we don’t focus our attention on everything that’s in our field of view. Just thought I’d mention that.</edit>
I remember that a 49mm focal length on 35mm format was a good approximation of (single) human eye vision.
Of course we’re talking about matching distortion/perspective, not aspect ratio; that would be a totally different issue.
nic
Ehm, really I think this is all kind of silly… you may be able to create an image with the same distortions your eye creates, or with the same focal length, but in the end your image will still be seen through an eye, which again has lens distortion and a focal length and everything… so a recreated “eye” will not look natural.
Do what MacGyver says… just play with it.
Hi !
It’s true that human eyes don’t have a square or rectangular field of vision.
Sorry, I answered too quickly! I remember now that 58mm is the true average focal length for 35mm film. The least distortion is achieved at this focal length.
Borrow a Zenit with a 58mm lens, and you will see for yourself! :o
But there is an important thing that must be taken into account:
The focal length must be converted to an equivalent focal length because of the difference in surface area between sensors!
The real focal length of the human eye is short, probably around 38mm.
It is the same problem encountered when comparing film and digital photography: the CCD sensor is generally smaller than the film surface, and to cover the same field, the focal length must be changed!
To get the same result with a different support size, you must use a different focal length.
It is nonsense to compare with a Blender virtual camera, because it has no real support size!
The solution seems to be the try and see method ! %|
Philippe.
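The equivalent-focal-length conversion Philippe describes is just a ratio of sensor widths (the “crop factor”). A minimal sketch, using a digital-camera example rather than the eye, since the eye’s effective “sensor size” is debatable:

```python
def equivalent_focal(focal_mm, sensor_width_mm, target_width_mm=36.0):
    """Focal length on the target sensor that gives the same horizontal
    field of view that focal_mm gives on sensor_width_mm.
    Default target is 36mm-wide full-frame film."""
    return focal_mm * (target_width_mm / sensor_width_mm)

# A 35mm lens on a ~23.6mm-wide APS-C sensor frames roughly
# like a 53mm lens on 35mm film.
print(equivalent_focal(35, 23.6))
```

This is why the same “focal length” means different fields of view on different supports, which is the poster’s point about comparing anything to Blender’s virtual camera.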
It does, but it’s in pixels, while the focal length seems to be in Blender units. Not an easy conversion to make.
Hence my comment about the 50-60 choice being a “perceptual” one suited for a 1.5 aspect ratio.
I don’t know much, but I know that the focal length of the human lens is variable, because the lens can change its shape (flatter -> rounder).
The horizontal angle of a human eye is about 120 degrees (hence also those 16:9 ratios).
Unfortunately, to make things even more confusing, the ‘lens’ parameter in Blender apparently isn’t a focal length in mm. It’s related to the angle of ‘field of view’ or something.
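For what it’s worth, older Blender versions reportedly treated the ‘lens’ value as a focal length over a 32mm-wide virtual film back, so the field of view falls out of the same formula as for a real camera. The 32mm figure is an assumption; check your version’s documentation before relying on it:

```python
import math

def blender_fov(lens_value, film_width_mm=32.0):
    """Field of view (degrees) if Blender's 'lens' value acts as a focal
    length over a 32mm-wide virtual film back (an assumption about older
    Blender versions, not a documented guarantee)."""
    return math.degrees(2 * math.atan(film_width_mm / (2 * lens_value)))

print(blender_fov(35))  # the default lens value
```

Under that assumption the default lens of 35 gives roughly a 49-degree angle, narrower than a 35mm lens on real 36mm-wide film.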
Ideally, when making renders and trying to make them look good, you don’t want the render to look exactly like the object. Rather, you want the render to look like a photo of the object.
Take for example a light bulb.
If you look at a light bulb, you see a white sphere, but if you take a photo of a light bulb, you get a white glow, with a halo and a lens flare.
While you could model and render the white sphere, it wouldn’t do you much good on the realism front. You’d see the flat image on the screen and your brain would think “Flat image of an object? Must be a photo!” and would be upset that it didn’t look like a photo of a light bulb.
This is why there are so many effects in CG that mimic camera artefacts: lens flares, focal blur, motion blur, etc. Even though these effects don’t occur when you look at an object in real life, a render without them looks fake.
This is what photorealism is all about: not just realism, but looking like a realistic photo.