Choose sequence editor strip as input into node group?

Is it possible to route a sequence editor strip through a node setup? Like say a bleach bypass node group to do some magic bullet type filters on selected sequence strips?

Then you wouldn’t have to rely on someone coding suitable sequence plugins when nodes are so much easier?

I know you can import video or image sequences into nodes but that doesn’t allow you to make use of the editing features in the sequence editor at the same time.

Sadly, it’s only possible the other way round for the moment.
You can set up your nodes and use a sequence as input in the node editor, then add that scene in your sequence editor.
But depending on your node setup, the preview in the Sequence Editor might be no fun.

There is a patch for this but you’ll likely not see it in an official release (anytime soon anyway) because it doesn’t fit into Ton’s vision for the future of Blender. Unfortunately the patch seems to be closed so you can’t even look at it. You may be able to contact Brecht Van Lommel (I believe it was his baby) and see if you can gain access if you feel like compiling your own build.

It would be really cool to add colour correction to sequence strips via composite nodes. I asked for this some time ago and got the same answer; in fact someone said to pre-render. But this means having a different .blend for every shot! Yuk.

@ David:
Pipe your renders through any colour node (Mix, RGB Curves, etc.), then enable Do Composite and Do Sequence. That’s the only direction it’ll flow.

Can you be a little more specific, ie step by step? I tried but failed…

Just render your stuff out and then use it as an image texture.

Yups, I already pre-render the node output(s) and put them in the sequencer when I want to join the clips, etc. But having node output(s) directly in the sequence editor would be uberCool. An “Output to Seq_clip” node would be pretty cool. You could create multiple trees, or put multiple outputs wherever you want in your node hierarchy, and put/mix them in the sequencer.

Unless I’m missing something in what you’re saying, zemmuone, you can already do exactly what you’re describing.

Just set up your node structure and be sure to click “Do Composite” on the “Anim” tab. Then go to the VSE and add a scene. The scene you choose will appear as a sequencer strip, and, since you have “Do Composite” clicked, the scene output is routed through the node compositor. Of course, you’ll also need to click “Do Sequence” to get the VSE output to render.
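For anyone scripting this rather than clicking through the UI: in current Blender the “Do Composite” and “Do Sequence” buttons are exposed to Python as render flags, and a Scene strip can be added from a script. A minimal sketch (the property and method names below are from the modern bpy API, which postdates this thread, so treat them as assumptions), runnable only inside Blender:

```python
import bpy

# Scene whose output should go through the compositor
# ("Do Composite" in old Blender; use_compositing today).
shot = bpy.data.scenes["Scene"]
shot.render.use_compositing = True
shot.use_nodes = True  # ensure a compositor node tree exists

# Editing scene: render via the sequencer ("Do Sequence").
edit = bpy.data.scenes.new("Edits")
edit.render.use_sequencer = True
edit.sequence_editor_create()

# Add the compositing scene as a strip; when rendered, its frames
# are pulled through the shot scene's node tree.
edit.sequence_editor.sequences.new_scene(
    name="Shot 1", scene=shot, channel=1, frame_start=1)
```

This mirrors the manual steps above: one scene carries the noodle, the other carries the edit.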

The main problem with routing nodes through to the VSE is speed. Depending on the noodle you have set up, it can take quite a while to build a single frame. Proxies can work well in that case.

Patel, RamboBaby, David thanks for your replies.

No doubt Durian will bring more NLE / VFX inspired additions to blender for dealing with DV, colour correction / look and such like.

Uhmmm, maybe I’m just missing the link between the “scene” and the node - is the “scene” I add in the VSE the same as the composite node output?
I tried, and it kinda works, but it becomes messy when I add more than one composite node…

The way it works is like so:

  • In case you didn’t already know, Blender has the ability to have multiple Scenes in a single .blend file.

  • Now, let’s assume you’re working with multiple video sequences (it works with unrendered 3D scenes, too, but updates in the VSE are very slow). Create a Scene for each sequence (and give it a logical name). In each Scene, put together your composite network to adjust that sequence however you like.

  • Now create another Scene (I tend to name this one “Sequencer” or “Edits”) and punch over to the VSE in that scene. This is your editing scene. Add a Scene strip for each of your compositing scenes and get to editing. The only downside to working this way is performance. It’s nearly impossible to scrub when using Scene strips because Blender must fully render the frame before showing it in the preview. In order to improve performance, optimizing the compositor can help, but what Blender really needs is for Proxies to be extended to Scene strips (create low-resolution and perhaps opengl-only “quick-renders”) to allow for realtime playback in the VSE.
    Aside from the performance issue, this really works quite well. The key here is that the Node Compositor is designed to work as fast as possible on a single image whereas the VSE’s design is meant to work on sequences of images. Think of it as a low-level vs. high-level kind of thing.

What might be nice is a special class of scene strip, or a “convert to composite scene” option for image sequences and movie files. With an operation like that, Blender would take a given sequence and replace it with a new Scene in the .blend file that has a composite network with the selected sequence/movie as the input node. Basically it would be a kind of macro to automatically create what I’ve described above.
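That macro idea can be sketched in Python against the modern bpy API. This is a rough illustration, not a tested add-on: the function name is made up, the node type identifiers and strip properties are assumptions about the current API, and it only runs inside Blender:

```python
import bpy

def convert_to_composite_scene(edit_scene, movie_strip):
    """Replace a movie strip with a Scene strip whose scene pipes
    the same movie through a fresh compositor node tree."""
    # New scene holding the composite network for this clip.
    comp = bpy.data.scenes.new(movie_strip.name + "_comp")
    comp.render.use_compositing = True
    comp.use_nodes = True
    tree = comp.node_tree
    tree.nodes.clear()

    # Movie clip as input, wired straight to the composite output;
    # the user then inserts colour-correction nodes in between.
    inp = tree.nodes.new("CompositorNodeMovieClip")
    inp.clip = bpy.data.movieclips.load(movie_strip.filepath)
    out = tree.nodes.new("CompositorNodeComposite")
    tree.links.new(inp.outputs["Image"], out.inputs["Image"])

    # Swap the strips in the editing scene.
    seqs = edit_scene.sequence_editor.sequences
    new_strip = seqs.new_scene(name=comp.name, scene=comp,
                               channel=movie_strip.channel,
                               frame_start=int(movie_strip.frame_start))
    seqs.remove(movie_strip)
    return new_strip
```

The point is just to show that the “macro” is a handful of data-block operations, not new core functionality.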

Oh ok - I didn’t know about this scene thing :smiley: sorry! I’ll try it asap. Anyway, I’ve used the node compositor with profit, but I realized too that it’s missing a) resolution independence and b) proxies (work on a downsampled version until you want the final render). Other than that, it’s a great tool - the only open-source node-based compositor afaik.