Sky dome... can you explain this please?

I have a UV sphere. The normals are flipped to the inside. It has a 360 outdoor HDRI image hooked up as the texture.

The camera is inside the sphere, pointed to the Y direction.

If I scale the entire sphere, I thought that the image would appear to get further away, smaller. It does not.

But… if I scale only in the Y direction, I see what I expected.

Why is that?

whyY.bmp (33.2 KB)

Thanks,
Cal

Hi,
it depends on how you are mapping the texture. If you use an environment texture node with object coordinates it will work as you expect (or maybe the reverse!).
Scaling the sphere up will make the image look nearer, and scaling it down further away, because when you make the sphere bigger you are seeing a smaller portion of the image.

If you’re scaling the whole sphere up and your camera is sitting at the center, then every point of the sphere moves along the ray from the camera to that point, i.e. along your view vector to it, so the image doesn’t change at all from the camera’s point of view.
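A quick numeric sketch of that argument (plain Python, no Blender needed; the point `p` is a made-up sample point on the sphere surface, with the camera at the origin): uniform scaling about the center leaves a point's direction from the camera unchanged, while Y-only scaling changes it.

```python
import math

def direction(p):
    """Unit vector from the origin (camera at the sphere center) to point p."""
    n = math.sqrt(sum(c * c for c in p))
    return tuple(c / n for c in p)

p = (1.0, 2.0, 0.5)                      # a sample point on the sphere surface

uniform = tuple(3.0 * c for c in p)      # scale the whole sphere by 3
y_only = (p[0], 3.0 * p[1], p[2])        # scale only along Y by 3

print(direction(p))        # original view direction
print(direction(uniform))  # same direction (up to rounding): moved along its own ray
print(direction(y_only))   # different direction: the point appears to shift
```

Since the direction to every surface point is unchanged, the projected image is unchanged, no matter how big the dome gets.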

If you need to capture more of the scene into your camera you need to decrease its focal length.

3 Likes

I did not try it with the camera right in the center!

1 Like

Yeah, with a camera offset from the center you would see distortion, in the form of the opposite half of the sphere moving away. But from the sphere’s center every point moves along its own view vector, so its projection into the camera frustum doesn’t change.
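Extending the same sketch to an off-center camera (again with made-up sample coordinates): subtract the camera position before normalizing, and uniform scaling of the sphere now does change the viewing direction, which is the distortion described above.

```python
import math

def direction_from(cam, p):
    """Unit vector from camera position cam to point p."""
    d = tuple(pc - cc for pc, cc in zip(p, cam))
    n = math.sqrt(sum(c * c for c in d))
    return tuple(c / n for c in d)

center = (0.0, 0.0, 0.0)       # camera at the sphere center
offset = (0.0, 0.5, 0.0)       # camera shifted along Y, away from the center

p = (1.0, 2.0, 0.5)                    # a sample point on the sphere surface
scaled = tuple(3.0 * c for c in p)     # sphere scaled up by 3 about its center

print(direction_from(center, p))       # from the center: same direction
print(direction_from(center, scaled))  # before and after scaling (up to rounding)
print(direction_from(offset, p))       # from the offset camera: the direction
print(direction_from(offset, scaled))  # changes, so the image visibly shifts
```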

2 Likes

Or move the camera back and forth, which works too.