camera size versus render size?

how do you relate the camera's view angle
to the render size and the scale X / scale Y values?

i mean, if you change the angle, how will this affect the render,
or vice versa?

???
do you mean the dimensions? like, if you render at 1500 x 1000 you get an aspect ratio of 3:2
but i don't know what you mean by "camera size angle"

yes, what's the relationship here?

say you have a camera with a 35-degree angle
but you render at 1200 x 800 pixels

how do you link these two sets of values?

it's almost as if the render changes the camera settings, or does it?

The size of the rendered image and the angle or focal length of the camera are completely unrelated. If you change your 1200x800 pixels to 600x400 without changing the camera settings you still get a picture of exactly the same scene, just at lower resolution.

Best wishes,
Matthew
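A minimal bpy sketch of that point (assuming a scene whose camera object is named "Camera"; the resolution and lens properties are standard Blender Python settings):

```python
import bpy
from math import degrees

scene = bpy.context.scene
cam = bpy.data.objects["Camera"].data   # the camera datablock

# change only the output resolution (and the % slider below it)
scene.render.resolution_x = 600
scene.render.resolution_y = 400
scene.render.resolution_percentage = 100

# the camera itself is untouched: same focal length, same view angle,
# so the render shows exactly the same framing, just with fewer pixels
print("focal length:", cam.lens, "mm")
print("view angle  :", degrees(cam.angle), "degrees")
```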

yes, but that still doesn't show what the relationship is!

like with
AspX and AspY, which control the packing of the pixels along the respective axes.

these also seem to affect what you get in the output render!
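If AspX and AspY here are the pixel-aspect values in the render panel, a rough sketch of what they actually change (plain Python, just the arithmetic):

```python
# render resolution and pixel aspect as set in the render panel
res_x, res_y = 1200, 800
asp_x, asp_y = 1.0, 1.0      # AspX / AspY; non-square values stretch the pixels

# the image still contains res_x * res_y pixels; the aspect values only
# change how wide the camera frame is relative to its height when displayed
frame_aspect = (res_x * asp_x) / (res_y * asp_y)
print("frame aspect ratio:", frame_aspect)   # 1.5 for square pixels here
```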

let me explain a problem i encountered while rendering

sometimes when i render as usual i get a very nice render/picture;
it doesn't look low-res or cheap.
but other times i get a sort of low-res render, and i'm not sure how to get a nice-quality one,
even though i didn't really change anything in the render settings

so why do you sometimes get a nice high-res render and other times a very low-res one,
with about the same render settings?

i tried increasing the sampling to maximum and changing material variables like diffuse, specular, etc.,

and i still get a cheap-looking, low-res render!
so what can be done in these cases to get a nice-looking, high-res render?

camera angle

what is this angle? is it the horizontal or the vertical angle?

Whichever is larger.

Best wishes,
Matthew

What do you mean by "other times I get sort of a low res render"? Are you saying that you render an image of 800x800 pixels and sometimes get an image of 400x400? Perhaps you changed the 100% to 50% or something like that, just below where you specify the width and height of the render.

The focal length of the camera controls how perspective-distorted the image is going to be. So if you place the camera very near a face and play with the focal length, you can see how it gets distorted with low values, or very little distorted with high values. A value of 65 is what human vision uses. Photographic cameras use very different values.
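A quick bpy sketch of that test (assuming the scene's active camera; the output file names are just examples):

```python
import bpy

scene = bpy.context.scene
cam = scene.camera.data              # active camera's datablock

# render the scene with a short, a normal and a long focal length;
# low values near the subject show strong perspective distortion,
# high values flatten it out
for lens_mm in (16.0, 35.0, 135.0):
    cam.lens = lens_mm
    scene.render.filepath = "//lens_%d" % int(lens_mm)
    bpy.ops.render.render(write_still=True)
```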

ricky, i don't know if this will help, but i did an experiment:

set up a camera looking straight down Z at a 2.0 x 2.0 BU plane lying at (0, 0)
set the render resolution to 400 x 400
look through the camera
adjust the camera height until the plane exactly fills the whole camera window
check the camera's distance from the origin
then, in side view, measure the angle of the camera icon (using a triangle and face angles)
repeat this for a few different focal lengths

the standard relationship between camera view angle and camera focal length is

tan(angle / 2) = image height / (2 * focal length)

apply this formula using
angle = camera icon angle
height = height of the plane = 2.0 BU
focal length = camera Z value
and it is consistent, as you would expect

then apply the formula using
angle = camera icon angle
focal length = focal length reported by Blender
to calculate the height

for 5 different focal lengths, this comes to 32.0 (+/- 0.1) every time

in my mind this 32.0 must relate to the 400 resolution value,
but i am not sure how

anyway, just some ideas; it needs more work!

32.0mm is the notional size of the film or CCD sensor in Blender’s camera. For any resolution.

Best wishes,
Matthew
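Putting the experiment and the 32 mm together: the relationship is just the formula quoted above with a fixed 32 mm "film" size. A small plain-Python sketch (the angle applies along the longer side of the render; here the render is square):

```python
from math import atan, degrees

SENSOR_MM = 32.0   # Blender's notional film/CCD size, for any resolution

def view_angle_deg(focal_mm):
    # tan(angle / 2) = image height / (2 * focal length), with height = 32 mm
    return degrees(2 * atan(SENSOR_MM / (2.0 * focal_mm)))

for focal in (24.0, 35.0, 50.0, 85.0, 135.0):
    angle = view_angle_deg(focal)
    # for the 2 x 2 BU plane test: the plane fills the frame when
    # tan(angle / 2) = 1.0 / camera_z, i.e. camera_z = focal / 16
    camera_z = focal / (SENSOR_MM / 2.0)
    print("%5.1f mm -> %5.1f deg, plane fills the frame at z = %.3f BU"
          % (focal, angle, camera_z))
```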

32.0mm is the notional size of the film or CCD sensor in Blender’s camera. For any resolution.
ok, so my theory was wrong

it's easy to demonstrate that no matter what resolution settings (X = Y) are chosen in the render panel, the 2 x 2 plane still fills the camera window and 32.0 mm is still the result. now i know what it means

anyway, at least we now know one of blender's fundamental constants :slight_smile:

is it true, then, that a bigger resolution simply crams more pixels into the image, but as far as blender is concerned the image is still the same size?

Exactly.

Best wishes,
Matthew
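In numbers (plain Python): the notional 32 mm film never changes, so a higher resolution just slices the same frame into smaller pixels.

```python
SENSOR_MM = 32.0   # the notional film size is the same at every resolution

for res in (400, 800, 1600):
    pixel_mm = SENSOR_MM / res   # how much of the film one pixel covers
    print("%4d px -> %.4f mm of film per pixel" % (res, pixel_mm))
```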

thanks for the formula; i was looking for it on the wiki and could not find it!

so the camera angle simply determines the angle of view seen by the camera,
and the number of points per line, vertically and horizontally, is determined by the render pixels (horizontal and vertical)
and the scale factors X and Y

so if i want a higher-res look, the only way i can see is to render at an appropriate ratio with large size values,
like 3000 x 2400 pixels, to get a nice render image?
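A small sanity check on the output size (plain Python mirroring the render panel fields; the percentage slider is the one mentioned earlier):

```python
res_x, res_y = 3000, 2400
percentage = 100            # the % slider below width/height
asp_x, asp_y = 1.0, 1.0     # pixel aspect (AspX / AspY)

final_x = int(res_x * percentage / 100)
final_y = int(res_y * percentage / 100)
print("output image  :", final_x, "x", final_y, "pixels")
print("display aspect:", (final_x * asp_x) / (final_y * asp_y))   # 1.25 here
```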

i'll make a test

but even that doesn't solve my problem.
the problem, as i explained, is that sometimes when i render a scene i get a nice rendered image,
and other times it seems impossible to get a nice-looking render;
it looks like a low-res image!

so basically the question is: why do you get a low-res-looking or a high-res-looking render with basically the same settings, depending on the scene or file you work with?

when i render some scenes, it's almost as if i cannot get a nice, high-res-looking render

it looks very low-res, and it doesn't matter if i increase the samples to maximum.
i can improve it a little if i add a Subsurf modifier at level 2 or 3, but even that doesn't give a nice render!

so where is the problem?

let me find an example of one model with which i struggle to get a nice-looking render;
i'll be back later

(two renders attached) here is an example of a model i made; i tried to render it at different sizes,
and if i don't apply EdgeSplit i get poor quality at all resolutions!

the worst is in 2.49

but when going to 1200 x 1000 with EdgeSplit, it's beginning to look a lot nicer!

i guess the picture size is reduced here, so it's difficult to see the real quality

the first pic is the one without the EdgeSplit modifier in 2.54,
and you can see distortion in the picture

EdgeSplit in this case seems to improve quality quite a lot more than using Subsurf!
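For what it's worth, a "low-res" look on a hard-surface model is usually faceted shading rather than too few pixels. A hedged bpy sketch of the usual fixes (assuming the object is selected and active; the values are only examples, and the anti-aliasing properties apply to the Blender Internal renderer):

```python
import bpy

obj = bpy.context.active_object

# smooth shading first, otherwise every polygon renders as a flat facet
bpy.ops.object.shade_smooth()

# EdgeSplit keeps sharp corners sharp while the rest shades smoothly
obj.modifiers.new(name="EdgeSplit", type='EDGE_SPLIT')

# Subsurf adds real geometry so curved parts stop looking polygonal
subsurf = obj.modifiers.new(name="Subsurf", type='SUBSURF')
subsurf.levels = 2            # viewport level
subsurf.render_levels = 3     # level used at render time

# anti-aliasing (the "Sample" setting) smooths jagged edges in the image
scene = bpy.context.scene
scene.render.use_antialiasing = True
scene.render.antialiasing_samples = '16'
```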

hard to say what the problem might be …

is it possible that changing the camera focal length might affect the quality of the image in some way (other than resolution)?

just guessing here

and this is one example,
but i've seen others like it where i get a low-res look and i'm not certain why.
ok, it's not every case; most of the time i get a nice render!

but i would like to know what to do in these special cases

i may have found that using EdgeSplit helps

anyway, if you have any other ideas to improve it, let me know!

the problem is that there is still a bug with EdgeSplit in 2.57 on the bug tracker,
so i hope it is corrected soon