Way to have Cycles not render empty space

I could not find any reference to this in the forums. When I render out my scene in Cycles, with all objects on their own render layers, Blender wastes a lot of time and resources rendering areas where there is nothing but transparency. Is there a way to tell Blender and Cycles not to render this empty transparent area and only render the objects that are visible on their individual layers?

At the moment there is no such option for Cycles; it’s just a consequence of the raytracing algorithm. There would have to be a preprocessing step to determine the empty parts of the image, and that’s one thing Cycles wants to avoid at all costs for now. One way to avoid too many calculations on empty parts of the image is to use the non-progressive integrator with a low number of AA samples, so empty areas will only get that many samples. Another way is to render parts of the image with border rendering and piece them together in postproduction.
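For reference, here is a minimal sketch of those sample settings done from Python. The property names are from the Cycles add-on API as I know it, and they have shifted between Blender versions (e.g. cycles.progressive was a boolean before it became an enum), so treat this as an illustration rather than gospel:

```python
import bpy

cycles = bpy.context.scene.cycles

# Use the non-progressive (branched path) integrator. On older
# builds this property is a boolean (cycles.progressive = False).
cycles.progressive = 'BRANCHED_PATH'

# Few AA samples: pixels that only ever hit the background stop here.
cycles.aa_samples = 4

# Many per-component samples where rays actually hit geometry,
# so surfaces still converge despite the low AA count.
cycles.diffuse_samples = 8
cycles.glossy_samples = 8
cycles.transmission_samples = 8
```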

I’m no programmer, but it seems like Blender could determine where said object/objects are and create 2D bounding boxes that would then dictate where the render focuses - sort of an automatic border render (something like the sketch below). I can’t border render right now because I am rendering hair using particle systems, and I find that when I use border rendering it messes up my hair in the file permanently. Thanks for the reply!
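That automatic border idea can actually be scripted. Here is a rough sketch of my own (not a built-in feature), assuming a current bpy API: it projects each object’s bounding-box corners into normalized camera space with bpy_extras.object_utils.world_to_camera_view and fits the render border around them:

```python
import bpy
from mathutils import Vector
from bpy_extras.object_utils import world_to_camera_view

scene = bpy.context.scene
cam = scene.camera

def set_border_around(objects, margin=0.02):
    """Fit the render border to the camera-space bounds of the given objects."""
    xs, ys = [], []
    for ob in objects:
        for corner in ob.bound_box:
            # Object-space corner -> world space -> normalized (0..1) camera view.
            # (Use '*' instead of '@' on Blender 2.7x and earlier.)
            world_co = ob.matrix_world @ Vector(corner)
            ndc = world_to_camera_view(scene, cam, world_co)
            xs.append(ndc.x)
            ys.append(ndc.y)
    scene.render.use_border = True
    # Keep the full frame so the outside stays transparent instead of cropped.
    scene.render.use_crop_to_border = False
    scene.render.border_min_x = max(0.0, min(xs) - margin)
    scene.render.border_max_x = min(1.0, max(xs) + margin)
    scene.render.border_min_y = max(0.0, min(ys) - margin)
    scene.render.border_max_y = min(1.0, max(ys) + margin)

# Example: fit the border around whatever is selected.
set_border_around(bpy.context.selected_objects)
```

One caveat: ob.bound_box only covers the mesh, not hair strands sticking out of it, so you would want a generous margin - and since this still goes through border rendering, the hair bug mentioned above would still bite.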

How do you know it wastes time on those areas if you are not a programmer? Transparent or not, rays still bounce there and must be calculated. Add DoF to that, and motion blur.

In camera view (Numpad 0), press Shift-B, then select the area of the camera view you want to render. It will show as a red dotted rectangle. Do the same but select the whole camera area to return to the full frame. There is also a Border checkbox in the render settings (under Dimensions), but I find it easier to use the shortcut.

Note: if you are rendering multiple layers, it will do the same for all layers selected.

It’s simple logic, storm_st - I can render using Border Render and it takes less time, yet the bordered area still renders the same. However, as I mentioned before, I cannot use Border Render because it causes problems with hair particles.

Use the Non-Progressive integrator with only a few AA samples, and many Diffuse, Glossy, etc. samples.
This way, the sky will render much faster.

It will still use indirect light from the areas outside of the border render, which is why it looks the same.

What you want is for any areas that would be transparent in the final output not to be rendered at all. However, for Cycles to know whether an area is just the background/world, it has to render it first :P

That’s why that was my original question. And no, Blender does not have to render that area first - that’s not how renderers work. That is why you can use a Border Render and the rest of the scene is not rendered, but your image still looks the same (which is what I meant - not time-wise). And as I said, if I border render, it is FASTER than having Cycles render the entire thing.

I could, of course, set up a border render and render out one series of images for one layer, then when that finishes, go back in, set another border render, and do it again for the next layer, and so on. This would cut down on my rendering time and give me the transparent background that I want, but it would not be ideal, as I would have to be present to start each series.

All I was wondering is whether or not Blender has a way to ignore the transparent areas when rendering, instead of running passes over areas that are going to be transparent - alpha - in the image. I can normally do this manually with Border Render, but cannot here because there is a bug with hair strands. I am not talking about the transparency of, say, a window; I’m talking about the alpha part of the image where there is truly nothing - such as wanting to isolate a box and composite it on top of another image (in case someone was thinking differently).
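Incidentally, the “have to be present for each series” part is scriptable. A rough sketch of my own, reusing the set_border_around helper from the earlier post; the layer names and object assignments here are made-up placeholders, and scene.render.layers is the 2.7x-era API (render layers became view layers in 2.8):

```python
import bpy

scene = bpy.context.scene

# Hypothetical mapping: render layer name -> the objects on that layer.
layer_objects = {
    "Character": [bpy.data.objects["Suzanne"]],
    "Props": [bpy.data.objects["Box"]],
}

for rl in scene.render.layers:
    objs = layer_objects.get(rl.name)
    if not objs:
        continue
    # Enable only the current render layer.
    for other in scene.render.layers:
        other.use = (other == rl)
    # Fit the render border to this layer's objects
    # (set_border_around as defined in the earlier sketch).
    set_border_around(objs, margin=0.05)
    # Render the whole frame range into a per-layer folder.
    scene.render.filepath = "//render/%s/frame_" % rl.name
    bpy.ops.render.render(animation=True)
```

Run unattended, that gives each layer its own tight border and output folder without babysitting the renders - though it still will not help with the hair/border bug itself.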

Of course it is; it has to render less. However…

Except that Cycles STILL traces rays in the hidden parts of the scene, as these affect what we do see. That’s why Cycles would have to render the whole image first if it wants to know what is ‘blank’. You could (if you wanted to :P) use BI to find this transparency and use that to drive where Cycles shoots its rays, but that’s up to you xD

DingTo’s method is quite close to what you want, as each AA sample would spend only a few samples rendering the blank sky, and more on the parts of the scene that contain actual materials.