So I just found the right section to post my questions.
It's because the on-screen area it has to render is smaller. Fewer pixels to render.
That’s not true; it’s still calculating the same number of samples and pixels as if it were viewed at full zoom.
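To put it another way: display zoom just rescales pixels that have already been computed. Here’s a rough sketch of the cost model (purely illustrative numbers and function names, not Blender’s actual scheduler):

```python
# Illustrative cost model: total render work depends on the output
# resolution and the sample count, not on how the result is displayed.
def render_work(width, height, samples_per_pixel):
    """Total samples the renderer must compute for one frame."""
    return width * height * samples_per_pixel

# A 1920x1080 render at 128 samples per pixel...
full = render_work(1920, 1080, 128)

# ...costs exactly the same whether the render-result window shows it
# at 100% or scrolled out to 25%. Display zoom only rescales the
# already-computed image buffer.
assert full == render_work(1920, 1080, 128)
print(full)
```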
I’d have to play around with it; I’ve never experienced this before. Probably something to do with the image buffer, if it turns out to be true.
Because more of the render includes blank sky space, thus less complex things to render?
I think we need to clarify something here, because we’re getting answers with two fundamentally different sets of assumptions about the conditions Atom is describing. Atom, when you say “the render window is zoomed out” do you mean:
A) in the 3d viewport, the camera is zoomed out away from the object you’re rendering, such that the object is small relative to the total scope of the rendered image?
B) the window that displays the render result is “zoomed out,” meaning the actual render result image appears small relative to your screen, so you need to roll your scrollwheel to enlarge the image?
It could be any number of reasons. Perhaps the most render-intensive areas of your scene now take up less screen space (some shaders take longer to render than others), or maybe, as mentioned, your object is surrounded by sky and now occupies a lot less of the frame.
B man. It’s B.