VFX Compositing - how do I speed things up?

Hi,

I’m just diving into compositing in Blender. I come from a Nuke and Fusion background and wanted to explore Blender’s compositing tools. While a lot of the tools seem adequate, things slow down fast because the comp reprocesses the full chain every time I make a simple change. I was wondering if there are some optimization techniques I’m missing. Most notably, the central issue seems to be the lack of disk caching, or of some way to skip re-rendering nodes that haven’t changed. Does anyone have any ideas on how to speed things up? Thanks.

You could check the options in the node editor’s properties region (hotkey N) under Performance, and the tooltips for those. If tooltip display is not on, you can enable it in User Preferences -> Interface.

I don’t think disk caching is a default behavior; it generally kicks in when memory use approaches the system limit. Re-rendering nodes after a change is necessary because edits affect the inputs of every node downstream of the edit. Preserving the data output from nodes upstream of the edit would require caching each node’s output for every render pass, pushing memory needs even higher and likely prompting heavier disk-cache activity. In that light, I think a pass through all the nodes after an edit is likely more efficient.
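To put a rough number on the memory concern above, here is the back-of-envelope arithmetic for caching every node’s output, assuming float32 RGBA buffers at 1080p (the frame size and node count are just illustrative):

```python
# Rough memory cost of caching one full-frame output per node
# (assumed: 1920x1080, 4 channels, 4 bytes per channel - float32 RGBA).
width, height, channels, bytes_per_channel = 1920, 1080, 4, 4
per_node_mb = width * height * channels * bytes_per_channel / 2**20

print(f"{per_node_mb:.1f} MB per cached node output")        # ~31.6 MB
print(f"{per_node_mb * 100 / 1024:.1f} GB for a 100-node comp")
```

At roughly 32 MB per node, a couple hundred nodes would need several gigabytes just for the cache, which is why a naive cache-everything approach pushes systems toward swapping.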

About the only streamlining tip I can offer is not to keep unused nodes hanging around while editing. While they are not part of the noodle evaluation, they are still touched by the rendering algorithms, eating a few clock cycles unnecessarily.

This is the default behavior of Nuke (and probably Fusion too). Nuke caches all nodes to disk based on hash values calculated from each node’s settings and the hashes of its input nodes. When the graph is evaluated and a node’s hash has not changed, it is not re-rendered but pulled from RAM or disk. This makes things much faster than rendering the whole graph just to apply a different grading op at the end, for example. The user sets the disk cache size, and when it becomes full, data that has not been used for a long time, or that is faster to recalculate, is discarded to make room for new cache data. Nuke gathers statistics on nodes’ average render times, and nodes that take longer to render get higher cache priority.
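The hash-based scheme described above can be sketched in a few lines of Python. This is a toy illustration, not Nuke’s actual code or API: the key idea is that a node’s hash covers its own settings plus the hashes of its inputs, so any upstream change automatically changes every downstream hash, while untouched subtrees keep hitting the cache.

```python
# Toy sketch of hash-based node caching (illustrative; all names are made up).
import hashlib

cache = {}  # hash -> rendered output; stands in for the RAM/disk cache

class Node:
    def __init__(self, name, settings, inputs=()):
        self.name = name
        self.settings = dict(settings)
        self.inputs = list(inputs)

    def node_hash(self):
        # Hash this node's own settings, then fold in the input hashes,
        # so upstream edits ripple into every downstream hash.
        h = hashlib.sha1()
        h.update(repr(sorted(self.settings.items())).encode())
        for node in self.inputs:
            h.update(node.node_hash().encode())
        return h.hexdigest()

    def render(self):
        key = self.node_hash()
        if key in cache:
            return cache[key]          # hash unchanged: pull from cache
        out = f"{self.name}{tuple(node.render() for node in self.inputs)}"
        cache[key] = out               # miss: render and store
        return out

# read -> blur -> grade
read = Node("read", {"file": "plate.exr"})
blur = Node("blur", {"size": 4}, [read])
grade = Node("grade", {"gain": 1.0}, [blur])

grade.render()                         # first pass renders all three nodes
grade.settings["gain"] = 1.2           # tweak only the last node
grade.render()                         # read/blur hashes unchanged -> cache hits
```

After the gain tweak, only the grade node is re-rendered; the read and blur results are served straight from the cache, which is what makes a “change one op at the end” edit cheap.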

What makes this efficient is the logic of how and what is cached. Nuke does not cache full frames, for example (although it can be forced to), but caches scanlines based on the visible area of the viewport, and only the channels the user is looking at. There is no need to cache channels that are not used by any node or looked at by the user. Disk space is cheap, and you can pull 2K frames at reasonable speed even from a standard HDD, compared to re-rendering everything, which can take seconds, minutes, or longer for a whole comp.

Caching is on the roadmap for development, just not a priority for the devs, who are focusing on the 3D tools. Even elsewhere in Blender, like UV/image editing, there is no straightforward system for caching alterations to images.

@kesonmis: I see your point about efficient caching when it’s done as selectively as you describe, but as I said, it isn’t Blender’s default behavior to do this. 3pointEdit’s comment seems to indicate it’s not an option, either. On the other hand, I’ve done a lot of Blender compositing at various frame sizes, 1080p being the largest in terms of pixel count, and have never encountered a Compositor node setup that requires minutes to process. 2K formats and above might, so I’ll defer comment on those. However, some of my noodles have incorporated a couple hundred nodes and employed a wide range of FX, and while it is frequently annoying to have to wait through the compositing steps for small changes, it’s not what I would consider a production showstopper. Much room for improvement, though.

The Blender compositor does not have many heavier ops; for example, there are no optical flow tools, no 3D tools, no nonlinear transform tools (controlled warps), and so on. Simple 2D ops are fast, but a lot of things that are useful in production are missing. And of course the resolution issue is another topic altogether.

This addresses a matter of design philosophy, I think. Blender’s Compositor is not meant to be a complete toolbox for VFX, competing with the likes of the apps mentioned above, but rather an adjunct to Blender’s main purposes: 3D modeling, rendering, and animation. The same goes for the VSE. Given the need to allocate scarce development resources efficiently and intelligently, I don’t see either post module in Blender striving for very high-end functionality, particularly since so many well-established tools already exist. But even so, they are extremely useful and put many capabilities into our hands that might otherwise be out of reach.

I remember seeing this on the to-do list several years ago. I’m surprised that it hasn’t been implemented yet. The only thing I can find on the subject is via OpenCL. Not sure if this will help you or not:

http://www.atmind.nl/?p=22