Scale-Related Rendering Glitch

Hello. I am pretty new to Blender, so I am sorry if I am wasting your time with a simple question. Unfortunately, I have tried everything I could think of and I am still having a strange performance issue when rendering my animation. My scene has a small, relatively detailed model, and a large, fairly simple background environment. If I render this scene as-is, it takes unusually long to render. If I remove either the model or the background, it renders almost instantly. Interestingly enough, it also renders quickly if I scale either part (the model or the background scenery) to be the relative size of the other. What is causing this problem? Is there a simple way to make things work correctly? Thank you. :slight_smile:

Do you have AAO (Approximate Ambient Occlusion) enabled, and are you using raytracing?
AAO is not multithreaded; it can only use one thread for the occlusion calculation. This can greatly increase render time, especially in animations, since the cost is incurred on every frame.
Raytracing time depends on the size of the scene and the reflections in it, and it increases render time as well. You can try to speed it up by adjusting the octree resolution (details: )

For the AAO issue there is a workaround, at least for animations (this is only theoretical, as I have not found the time to actually test it):
You basically set up your blend file, make sure that Touch and No Overwrite are activated, then save it.
Now start Blender as a command-line app and instruct it to render your blend.
See for the CL interface.
Use the -t switch to limit the number of threads to one.
For each core you have, start a Blender instance from the command line using your saved blend.
If everything works, each instance should render one frame at a time on one core, maxing out all cores as long as there are frames left in the animation.
Be careful: memory use increases with each instance you start.
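The steps above could be scripted roughly like this. This is only a sketch: scene.blend and CORES=4 are placeholder assumptions, and the commands are only echoed so you can check them first; remove the leading echo to actually launch the renders.

```shell
#!/bin/sh
# Start one single-threaded background render per core. With Touch and
# No Overwrite enabled in the blend file, each instance should claim
# the next unrendered frame of the animation.
BLEND=scene.blend   # placeholder: path to your saved blend file
CORES=4             # placeholder: number of cores/instances to start

i=1
while [ "$i" -le "$CORES" ]; do
    # -b: run without the UI, -t 1: limit to one render thread,
    # -a: render the whole animation frame range.
    echo blender -b "$BLEND" -t 1 -a
    i=$((i + 1))
done
```

Each instance skips frames that already exist (Touch/No Overwrite), so the work spreads itself across the instances without any explicit frame assignment.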

As Musk stated, it very likely has to do with ray tracing and its associated octree settings. Optimal octree settings can only be found by trial and error, as they are not entirely predictable.

Otherwise you can use buffer shadows, which are much faster to render but are currently only available for spot lamps.

You may also have better luck adding a new scene (Link Objects), one each for foreground and background: delete the opposing objects in each scene, then composite them together with nodes, the VSE, or both.

The octree article you linked to was an absolutely perfect description of my situation! Thank you (times a hundred) for the valuable info. I haven’t had a chance to test anything yet, but that article seems to be exactly what I am looking for. Thanks again for your support and happy modeling. :slight_smile: