Is there some plugin/extension to make Blender a better compositing/editing program?

Well, that’s the question.

I see Blender as a hugely powerful program, and I love it, but mainly for simulations/compositing/editing.

So I’m wondering if there are some cool plugins/extensions that are specific to compositing and editing.

Or maybe some optimized build for compositing and video editing, with some more useful nodes in the compositor.

Cheers and thanks.

Standard build - COMPOSITING NODES. Much coolness to be had there, just not as simple as the sequencer effects.

I know about the composite nodes; maybe I didn’t explain myself well :slight_smile:
I mean plugins/extensions for the compositing nodes, and also for the sequence editor. But someone already gave me a link with some plugins for the sequence editor, so now I mean plugins for the composite nodes only :slight_smile:

Here is the link for the sequence editor plugins:


@ the OP: any particular features or effects you don’t see but would like to? You’ll have to qualify “better” in more concrete terms.

Better would mean works like After Effects.

No, you are pretty much stuck with what’s already in Blender.

Yeah, that’s it Atom…well thanks by the way :smiley:

Hang on, why is AE better? The workflow is different; it really depends on what you need to do. Sure, there are more tools there, but I can get a long way with just Blender!

Have you seen the new compositing book for Blender? There are threads about it here.

There really are so many reasons why After Effects is better than the VSE. Have you tried it?

Text effects alone set it apart. What about Card Dance, expressions, sub-comping… the list goes on and on.

Blender does sub-comps (after a fashion), and you can simulate many effects with actual 3D. I got sick of buying plugins for AE nine years ago. BTW Atom, great work on your script. Do you think that Blender 2.5 will get functionality closer to AE or an Adobe-studio-type package?

Well, I for one would love to see render layers output to GIMP, and GIMP be able to feed back into the compositor and sequence editor directly. I like all the great filters and scripts that GIMP has to offer, and these would be a boon to the Blender compositor.

If you think about it, most elements of After Effects are already in Blender. What I imagine is a way to create a special kind of 2D scene, like a subcomp, which could be mapped to a texture or a sequence in the VSE. This subcomp would have, among other things, a timeline, masking, vector animation and the paint tools. Every time I have to use After Effects for some blurred text animation or simple 2D stuff, I think about how much of this is already available, just not unified. Having node-based and timeline compositing in one package would be great for motion graphics.

Reading the AFX touters’ comments here makes me think they haven’t really put much time into developing a skillset with the Blender Compositor and VSE. If you’re happy with what AFX provides, that’s cool, but don’t expect Blender to be a carbon copy of AFX, even though it can probably do many of the same tasks (I’m not that familiar with AFX except by reputation), just with a different workflow paradigm.

For example, “node-based & timeline” integration can be done by using the Scene input option in the VSE – if a Compositor node tree in a Scene is enabled for output (Do Composite), then its output can be placed in the VSE like any other strip element, as a “Scene.” The same Scene/Compositor output can even be duplicated and the strips time-shifted relative to one another. Every .blend can have multiple Scenes, and thus multiple Compositor setups, all fed into the VSE for “final assembly” with the VSE’s suite of tools for manipulating the strip blending and transitions between.

Putting the VSE output through the Compositor isn’t possible afaik, but why would you need to? Incorporating any movie or image sequence (i.e., the usual “strip” content in the VSE) can be accomplished using the Image input node.
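For anyone who prefers to wire this up with a script rather than by hand, here is a rough sketch of the Scene-strip workflow described above, written against the modern Blender Python API (bpy). Note this is a sketch under assumptions: the scene names “CompScene” and “EditScene” are hypothetical, and the API shown (e.g. `sequences.new_scene`) is from Blender 2.8+, not the version current in this thread, where “Do Composite” was a render-panel checkbox.

```python
import bpy

# Hypothetical scene names -- substitute your own.
comp_scene = bpy.data.scenes.get("CompScene") or bpy.data.scenes.new("CompScene")

# Equivalent of enabling "Do Composite": render this scene
# through its compositor node tree.
comp_scene.use_nodes = True
comp_scene.render.use_compositing = True

# A second scene holds the VSE timeline for "final assembly".
edit_scene = bpy.data.scenes.get("EditScene") or bpy.data.scenes.new("EditScene")
seq = edit_scene.sequence_editor_create()

# Drop the compositor's output into the VSE as a Scene strip,
# placed like any other strip element.
seq.sequences.new_scene(
    name="CompOut", scene=comp_scene, channel=1, frame_start=1
)

# The same Scene/Compositor output can be duplicated and the
# copies time-shifted relative to one another.
seq.sequences.new_scene(
    name="CompOutShifted", scene=comp_scene, channel=2, frame_start=25
)
```

This only runs inside Blender’s embedded Python interpreter, but it mirrors exactly the manual setup: one Scene per compositor tree, all fed into a single VSE for final assembly.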

By way of an example, all the text overlay effects and a good deal of color manipulation in the final sequences of this montage of scene snippets were accomplished entirely with Blender’s Compositor (including all the scene transitions, which could just as easily have been done in the VSE). In many cases all the elements were incorporated in a single .blend, using multiple Scenes to provide independent rendering environments for the various elements (for example, a perspective camera for the models and an ortho camera for the “flat” text overlays). In the final sequence, the three rows of rotating objects were composited together from three different Scenes, each containing a duplicate of one “master” row but with the timing of the rotations and color shifts modulated, and a single camera linked between Scenes to preserve the perspective in the composite.

The tools are there.

First off, I don’t want to carbon copy anything into Blender. I want an improved workflow for motion-graphics related tasks. This is different from compositing so maybe I am in the wrong thread.

Thanks for posting the example and the workflow you used. It is new to me and opens up possibilities. This is exactly what I meant: the tools are there, but in my opinion they could be made to work better together. The text in your video is a good example of something that is often needed. But if you want to add more effects, like in Ben Dansie’s video, I assume the setup gets convoluted and inflexible. The stretching of the panels, the slight animated blur. I am not a fan of Adobe products by any stretch, but this is where AE shines. And again, the tools are mostly there in Blender. I have to admit that I have not tried to recreate this in Blender, so to provide something useful I will look into it when time permits. You seem to have some experience in this regard, so any input would be appreciated.

PS: Just to put things more in perspective, I am a freelancer and I use Blender and other OSS tools on Linux. Believe it or not, I do 90% of the work I make a living with using software that was given to me for free. The other 10% is motion-graphics-related stuff that I have to use Windows and AE for, which I detest. So please do not think that I don’t appreciate open source, or that I want to turn Blender into something else. I only want to give my opinion on the challenges I face and what would enhance the workflow for me.

Someone here (at the forum) has set up .blend files that provide you with specific effects. The scene is arranged so that all you have to do is plug in the texture, used either as a mapped plane object (for resizing, overlays, etc.) or fed directly into the compositor. I guess you could then append these effect scenes to your master comp as needed.

BUT Blender travel mattes s u c k! There’s just no other way to put it. There are good reasons for this and tedious workarounds, but there ya go.

Them’s fightin’ wurds :wink: :D. Really, though, please expand on what you mean by “travel mattes,” as I know the term (“traveling mattes”) from pre-digital filmmaking and it’s a huge catch-all for any type of articulated (i.e., not static) image-masking-and-compositing techniques, some of which Blender can do very well, others less so, but it all depends on what is specifically needed. A simple wipe-dissolve uses a traveling matte, as do higher-tech methods like blue- and green-screen, though with digital tech a lot of the actual “matte” work (i.e., creation of intermediate masking elements) can be eliminated for these techniques.

Sorry, I mean a multi-node animated shape. Hooks suck for this. Sure, simple shapes are easy, but roto is harrrrrdddd.

I’ve only had to resort to roto in some very small instances, and it is complex to do in Blender, but that’s not surprising given that the Compositor is, as I see it, designed more as a tool to use in conjunction with the 3D functionality than for strictly 2D work such as articulated mattes. But with some creativity you can do a lot that provides an alternative to roto work: Render Layers, specialized Materials for rendering matte-black and pure-white mattes (similar to what was done in the early days of motion control with movies like Blade Runner, only using virtual instead of real models), Matte nodes and other “keying”-type matte generators, even such “low-tech” stuff as dropping some virtual gobos into a scene. The thing is, mattes are nothing more than masking techniques, and they can be accomplished in many ways, the variety of which expands as you get to know more about the Compositor’s capabilities.
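One of the roto alternatives mentioned above, rendering your own matte from a Render Layers pass, can be sketched as a compositor node setup in script form. This is a minimal sketch using the modern bpy API (newer than the Blender version in this thread); the view layer name “ViewLayer” and object index 1 are assumptions, and it must be run inside Blender.

```python
import bpy

scene = bpy.context.scene
scene.use_nodes = True
tree = scene.node_tree
tree.nodes.clear()

# Render Layers brings in the rendered image plus its Object Index pass.
# "ViewLayer" is the default view layer name in recent Blender versions.
rl = tree.nodes.new("CompositorNodeRLayers")
scene.view_layers["ViewLayer"].use_pass_object_index = True

# ID Mask turns an object index (assumed to be 1 here, set per object in
# its Object properties) into a clean matte -- no hand-drawn roto shapes
# needed for objects you rendered yourself.
id_mask = tree.nodes.new("CompositorNodeIDMask")
id_mask.index = 1

mix = tree.nodes.new("CompositorNodeMixRGB")   # composite over a background
out = tree.nodes.new("CompositorNodeComposite")

tree.links.new(rl.outputs["IndexOB"], id_mask.inputs["ID value"])
tree.links.new(id_mask.outputs["Alpha"], mix.inputs["Fac"])
tree.links.new(rl.outputs["Image"], mix.inputs[2])
tree.links.new(mix.outputs["Image"], out.inputs["Image"])
```

The same pattern works for the matte-black/pure-white Material approach: render the matte elements to their own Render Layer and feed that output into the Fac input instead of the ID Mask.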

Sorry, I meant key shapes, for transferring changes to background images. I hate RVKs. I wonder if 2.5, with the new animation changes, will improve this. I would really prefer linear keyframes and editable curves. Of course 3D can behave just like 2D; it’s just a matter of application. It would be great to edit changes in real time on the output view.

And I think that John Dykstra may have beaten the Blade Runner guys to motion control on A New Hope, several years earlier. But I’m more interested in flat compositing for video.

Thanks for the hint, I tried searching but didn’t find it. Do you remember any keywords that would help me find this?

I am currently working on a project that I will wrap up in one or two weeks. I will then illustrate one of the situations where I have to resort to AE. Maybe some better workflow will come out of it.

Looked hard but couldn’t find it. I’m sure it was someone offering preset .blends for a range of functions that you would just plug into your project via append. It was hosted at a private website. It would be a great idea for a thread: a repository of basic Compositor or VSE effects/setups, kinda like the node thread from ages ago.