Performance in the compositor.

I’ve been playing with the idea of moving all my compositing work to the Blender compositor, but compared to other professional compositing software such as Nuke, After Effects, or even Combustion or Shake, it is slow, slow, slow. Like really slow. Unusably slow.

Just loading a clip and playing it back from a viewer in a UV/Image Editor is painful. I’m pulling the images from a dedicated high-speed RAID over Thunderbolt with a transfer speed of over 500 MB/s, or 1 gigabyte every 2 seconds. So my hardware is not the problem. I do this kind of thing in other software as a daily, nay hourly, part of my regular work.
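For what it’s worth, the raw numbers back that up. A quick back-of-the-envelope sketch (my own figures, assuming uncompressed RGBA frames; actual Blender buffer layouts may differ):

```python
# Back-of-the-envelope check (assumed figures, not measured): can a
# 500 MB/s RAID feed 1080p frames fast enough for real-time playback?

def frame_bytes(width, height, channels=4, bytes_per_channel=4):
    """Size of one uncompressed frame (default: RGBA, 32-bit float)."""
    return width * height * channels * bytes_per_channel

def max_fps(throughput_bytes_per_s, width, height, **kwargs):
    """Frames per second the disk alone could deliver."""
    return throughput_bytes_per_s / frame_bytes(width, height, **kwargs)

raid = 500_000_000  # 500 MB/s, the figure quoted above

float_fps = max_fps(raid, 1920, 1080)                       # 32-bit float RGBA
byte_fps  = max_fps(raid, 1920, 1080, bytes_per_channel=1)  # 8-bit RGBA

print(f"32-bit float 1080p: {float_fps:.1f} fps")  # ~15 fps
print(f"8-bit 1080p:        {byte_fps:.1f} fps")   # ~60 fps
```

Even in worst-case full-float frames, the disk can deliver about 15 fps of raw data, and compressed or 8-bit sources are far lighter than that. So the bottleneck is in the compositor, not the I/O.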

Here’s the setup:

Step 1) In the compositor, import a movie clip.

Step 2) Connect it to a viewer node.

Step 3) Display the viewer in a UV/Image Editor window.

Step 4) Ensure your “Timeline Playback Controls” have playback synced for “Image Editors” and scrub the timeline.

Step 5) Slowly shake your head at the playback speed.
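If anyone wants to reproduce steps 1–3 without clicking around, here is a rough sketch using Blender’s Python API (run inside Blender, e.g. from the Text Editor; the node type identifiers are from my memory of the bpy API and may differ between Blender versions, so treat them as assumptions):

```python
# Sketch of steps 1-3 via bpy. Only runs inside Blender; the clip path
# is a placeholder you would replace with your own footage.

def build_comp(clip_path):
    import bpy  # only available inside a running Blender instance

    scene = bpy.context.scene
    scene.use_nodes = True          # step 1: enable compositing nodes
    tree = scene.node_tree

    clip_node = tree.nodes.new('CompositorNodeMovieClip')
    clip_node.clip = bpy.data.movieclips.load(clip_path)

    viewer = tree.nodes.new('CompositorNodeViewer')  # step 2: viewer node
    tree.links.new(clip_node.outputs['Image'], viewer.inputs['Image'])

    # Step 3: the result appears as the "Viewer Node" image in any
    # UV/Image Editor; steps 4-5 are done interactively in the timeline.
    return tree
```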

What gives? Am I just not getting how to set this sort of thing up in Blender? Prefetching frames and playing them back in the Movie Clip Editor seems okay. This is the kind of performance I would expect in the compositor, but far from it.

You’re not the only one feeling the pain. I’d love to be able to ditch AfterFX or make Blender work at the level of the long-extinct Combustion (and I do believe Blender is way more flexible). But that would require some kind of background render/playback buffer/GPU acceleration that just doesn’t exist in Blender’s compositor (or maybe I just don’t know about it… I’d love to be wrong here). Seems like we’re light years away from any kind of real-time playback using nodes.

I’m glad it’s not just me. The Blender compositor is a great start, but it really needs some serious attention to the basics. It is so enticing because the areas where it has actually received attention are pretty impressive. In the 3D areas of Blender, it can fully replace high-end, very expensive 3D apps. For 2D, not so much.

This topic would be the first place I would start: get playback and rendering in the 2D compositor to operate in real time. Not a wild request, given that all the other 2D apps out there operate in real time.

Lukas Toenne is working on that problem in Blender now, with the help of Bastien Montagne I believe. Lukas is developing in stages; he should be in phase one or working on phase two. See http://wiki.blender.org/index.php/Dev:Ref/Proposals/Compositor2014_p1.1_TD.

That’s good news. Thanks for the info.

Yeah, they definitely know this is a problem.

Another huge deal-breaker is the canvas. For example, if you put a 200x200 pixel image over a 1920x1080 image and then try to blur the 200x200 image, it blurs, but the effect does not extend past the original borders of the 200x200 image. Because the only image plugged into the blur in this example is 200x200, the compositor assumes, up to that point in the node graph, that you are working on a 200x200 image. It needs to respect the render size in the same way you set a project size in AE.
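The effect is easy to demonstrate outside Blender. A toy 1-D illustration in plain Python (not Blender code; the tiny sizes stand in for the 200x200 layer and 1920x1080 canvas): blurring the small layer on its own clips the result at the layer’s edges, while placing it on the full canvas first lets the blur spill past the original borders.

```python
# Toy 1-D model of the canvas problem: a 3-px "layer" over a 9-px "canvas".

def box_blur(row, radius=1):
    """Simple box blur; samples outside the row are treated as 0."""
    n = len(row)
    out = []
    for i in range(n):
        window = [row[j] for j in range(i - radius, i + radius + 1) if 0 <= j < n]
        out.append(sum(window) / (2 * radius + 1))
    return out

small = [1.0, 1.0, 1.0]  # the small layer
canvas = [0.0] * 9       # the full-size canvas

# Blur first, then place: the glow is clipped to the layer's 3 pixels.
clipped = canvas[:]
clipped[3:6] = box_blur(small)

# Place first, then blur the whole canvas: the glow bleeds outward.
placed = canvas[:]
placed[3:6] = small
bled = box_blur(placed)

print(clipped)  # nonzero only at indices 3..5
print(bled)     # nonzero at indices 2..6
```

In Blender terms, the current behavior is the `clipped` case; respecting the render size would give you the `bled` case.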

Hopefully they’ll get all this fixed sometime. I certainly do my fair share of pestering them about it. :wink:

You are experiencing slow performance because you are performing the scrubbing task in the wrong window. To scrub video you want to use the VSE, not the Node Editor. Try switching to the Video Editing layout and adding your movie there. In the VSE I can scrub video as smoothly as in other video editing programs.

I am not denying there is a slowdown with your approach in the Node Editor, but there is also a workaround that already exists and was specifically designed for your task.

Yes, Atom is right (as usual): add the compositor scene strip to the VSE window and turn off OpenGL render in the preview. You may need to increase the VSE RAM cache in the User Preferences.

The compositor is not designed for real-time playback. If you want that, you need to either render out the comp to an image sequence and play that back in the sequencer, or insert the compositor scene as a scene strip in the sequencer from another scene. The latter will be slow the first time because it caches the frames, but will play back smoothly afterwards.
Never used AE, but from what I hear its RAM cache does exactly that (i.e. under-the-hood rendering).
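For the scene-strip route, something like this should work via bpy (run inside Blender; scene names and the exact `new_scene` signature are assumptions based on my reading of the 2.7x API, so check them against your version):

```python
# Sketch: add one scene's compositor output as a scene strip in
# another scene's sequencer. Only runs inside Blender.

def add_comp_strip(target_scene_name, comp_scene_name):
    import bpy  # only available inside a running Blender instance

    target = bpy.data.scenes[target_scene_name]  # the scene holding the VSE
    comp = bpy.data.scenes[comp_scene_name]      # the compositing scene

    if target.sequence_editor is None:
        target.sequence_editor_create()

    # The sequencer renders the comp per frame and caches the result,
    # which is why the first playback is slow and later ones are smooth.
    return target.sequence_editor.sequences.new_scene(
        name="comp", scene=comp, channel=1, frame_start=1)
```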

Whoa, was only a few minutes late :wink:

Yeah. I’m so pissed that Atom beat me to it… :wink:

That may work, but to be fair, coming from a professional compositing point of view, the compositor DOES need to be sped up. After Effects, Nuke, Fusion: they all work in near real time. But they definitely know, and they will work on it. :slight_smile:

Horses for courses, of course, but still, node-based compositing functionality is sadly lacking in the VSE’s design.
I just wish the VSE and compositor would talk to each other. As of today they seem like a dysfunctional marriage.
I love the compositor as a node-based way of mixing and affecting images; it’s a very flexible and powerful way to create complex effects in no time. And it’s because of that flexibility that I’d love to composite and alter video clips using defocus, color split, blur, color ramp nodes or the keying and tracking tools from the compositor in some kind of buffered real-time preview, that’s all.
I’m well aware that Blender is foremost a 3D content creation application. But when I look at the workflow of Nuke, Mistika, Smoke and all the big compositors, it seems like what they do in those high-priced pieces of software is almost doable in Blender…

While complaining about speed: it would be pretty sweet to actually leverage the OpenGL renderer for integrated compositing. It’s great and real-time; it just has some drawbacks with alpha. But I wish I could define it as a scene source. I can render it out, so why not port it to the compositor?

“You are experiencing slow performance because you are perfoming the scrubbing task in the wrong window. To scrub video you want to use the VSE not the Node Editor.”

Not what I am trying to do. I am trying to composite in real time. That would include pulling keys, transforming layers, applying color corrections, rearranging color channels, etc.

With wildly complex node trees I might need to play back at lower frame rates while the results are cached in RAM, but eventually I should see real-time playback. I cannot, seemingly, pipe the output of a comp into a Video Sequence Editor window.

Playing back video in real time is easy. I want to composite.

“The compositor is not designed for real-time playback.”

Sounds like time for a redesign, then. If you are going to begin work on a compositing app, it should work well for people who are professional compositors. I’m not suggesting anything that isn’t the most basic functionality of AE, Nuke, Shake, Combustion or Autodesk Composite (which is now free).

I cannot agree with cegaton enough. “I just wish the VSE and compositor would talk to each other. As of today they seem like dysfunctional marriage.” Kind of like the image processing filters available in the compositor but not in the material editor. All image processing nodes should work with all image inputs.

Blender is on the cusp of becoming the biggest VFX app out there, or at least it could be. And the roadmap shows it is really headed in a very exciting direction.

It would just be nice to see some attention placed on some areas where it is already sooooo close.

Caching the compositor is a real challenge, as every node output may require a full 32-bit float frame. How many buffered branches are enough? I guess you could flush the portion of the tree that is modified, but depending on the branching you could be back to re-rendering the whole thing.
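To put rough numbers on that (my own figures, assuming full-float RGBA buffers; a real cache could be smarter about which outputs it keeps):

```python
# Rough cost of caching every node output as a full-float 1080p frame.

def buffer_bytes(width, height, channels=4, bytes_per_channel=4):
    """One 32-bit float RGBA buffer for one node output."""
    return width * height * channels * bytes_per_channel

def cache_gb(width, height, frames, node_outputs):
    """Total RAM to cache every node output for a whole shot."""
    return buffer_bytes(width, height) * frames * node_outputs / 1e9

one = buffer_bytes(1920, 1080)                    # ~33 MB per buffered output
shot = cache_gb(1920, 1080, frames=250, node_outputs=10)

print(f"one 1080p float buffer: {one / 1e6:.1f} MB")  # ~33.2 MB
print(f"250 frames x 10 outputs: {shot:.0f} GB")      # ~83 GB
```

Even a modest 10-second shot with ten buffered node outputs lands around 83 GB, which is why you can’t just cache everything and have to pick which branches to keep.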

Also, dynamic scalability is an issue, as many coordinate values are not relative and would give incorrect values when scaled.

I’m a Nuke compositor and a Blender user. I work with Nuke every day and I can tell you that Nuke isn’t a real-time application at all!
In comparison, Blender is slower indeed, but it’s not so bad. Compositing, like 3D, is a heavy process. Do not hope to see a real-time compositing application any time soon.
But the commercial software you’re talking about has a cache, which I believe Blender doesn’t: once a frame or a range of frames is computed, you can play them back.

In fact it was recently redesigned to meet the production needs of TOS. :slight_smile:

If you are going to begin work on a compositing app, it should work well for people who are professional compositors.

Uh…Blender’s node editor IS used by professional compositors (whatever that may mean).

I’m not suggesting anything that isn’t the most basic functionality of AE, Nuke, Shake, Combustion or Autodesk Composite (which is now free).

These are all valid points and there’s no disagreement. We all want the compositor to develop, but we’re also thankful for the tool we’ve got right now :slight_smile: