3-point editing in the VSE - the basic missing functions

Yeah, the easiest workaround is to use Resolve. :slight_smile:

Would you agree that if these 3 things were implemented, Blender would be able to do basic 3-point editing?

  1. Specify into what scene/sequence media from the File Browser should be imported (import into a scene other than the current one).
  2. Specify which scene's sequence the Sequencer should use (a sequence from a scene other than the current one).
  3. Specify which sequencer window the Timeline view should link to/control (individually controlled playheads).

I believe that with these few functions implemented, the Easy Logging script could be adjusted so there would be no need to switch scenes all the time; everything could be controlled from the same scene's UI. Am I wrong? Am I missing something?
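For what it's worth, the data side of this is already reachable from Python; what is missing is the UI to display and control it. A minimal sketch, assuming a second scene named "Source" holds the source strips (the scene name is just a placeholder):

```python
import bpy

# Placeholder scene name; the point is only that another scene's
# sequencer data is already reachable through the Python API.
source_scene = bpy.data.scenes.get("Source")
if source_scene and source_scene.sequence_editor:
    for strip in source_scene.sequence_editor.sequences_all:
        # Timing data from a scene other than the current one.
        print(strip.name, strip.frame_final_start, strip.frame_final_end)
```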

I also suggested a scene-based UI color, so when you switch to the source scene the entire interface changes color. You can scrub a source in the File Browser, if I recall correctly. Perhaps a File Browser could act as the bin view to select a source clip before trimming in the source scene. But source/record switching does suck. Maybe the best thing would be to improve the Movie Clip Editor to accommodate metadata and an audio view?

Here's an example:


Why do you prefer the Movie Clip Editor over a Sequencer linked to an offscreen scene containing the source strip?

Because my suggestion has at least a snowflake's chance in hell. Seriously, I'm only suggesting an addition to existing tools, not a total depsgraph do-over. There are real issues with scenes coexisting temporarily.

Good to know. Thanks.

Sorry for being terse. It's just that I've been having this argument about the VSE as an NLE for years and years. Hence my handle :wink:

My aim is to narrow down the simplest way to make Blender a functional 3-point editor (what specific UI/API functions are missing?) so we can write up a proposal for rightclickselect.com or set up a bounty like the one they set up for OpenToonz.

After all these years of focusing on the subject, did you make a proposal like that? Do you think that is a dead end too?

For me the main advantage Blender has over any NLE out there is that it is scriptable (as long as the API functions are implemented for what you want to do). No other NLE has this option to my knowledge.
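As a trivial example of that scriptability, the edit can be driven from the Python console; the path, channel and frame values below are placeholders:

```python
import bpy

scene = bpy.context.scene
# Create a sequence editor for the scene if it doesn't have one yet.
if scene.sequence_editor is None:
    scene.sequence_editor_create()

# Add a movie strip and trim it non-destructively (like setting in/out points).
strip = scene.sequence_editor.sequences.new_movie(
    name="MyClip",
    filepath="/path/to/clip.mov",   # placeholder path
    channel=1,
    frame_start=1,
)
strip.frame_offset_start = 24       # skip the first 24 frames (the "in" point)
strip.frame_final_duration = 120    # keep 120 frames after the in point
```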

You should DM Nikos from the Easy Logging add-on mentioned above. He has already spoken to Ton, and I believe there might be some work proposed for 2.8.

I already sent him a PM, but no reply yet.

Just for reference, I've made two screenshots.

One using a sequencer as source:



This one needs the Timeline to control the playhead of the source Sequencer, and the source Sequencer should play material from a different scene (containing that specific source strip) than the current scene.

And one using Movie Clip Editor:



Here the Timeline window should lock to the playhead of the Movie Clip Editor. The Timeline window could also be used for setting in and out points. And the playhead of the Movie Clip Editor should be able to play independently of the current scene.

The advantage of using MCE seems to be that it already can play clips outside of the sequencer.

Yes, that's true. The MCE loads clips as scene-specific data-blocks called Movie Clips; they have a start frame and offset defined in the MCE for use in the compositor, but not in the VSE. That is mostly good, but it would be better to set additional metadata there, like in/out points, keywords, color correction, volume adjustment, etc. Then it would be useful to time and trim the shots in the VSE (new "edit-timeline"-relative metadata) for use in the compositor of another scene.
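A rough sketch of what that extra metadata could look like today, using custom properties on the Movie Clip data-block; the in/out and keyword properties are hypothetical, not built-in fields:

```python
import bpy

# Assumes at least one clip has been loaded in the Movie Clip Editor.
clip = bpy.data.movieclips[0]
print(clip.frame_start, clip.frame_offset)   # the existing MCE timing fields

# Hypothetical logging metadata stored as custom properties on the clip.
clip["in_point"] = 48
clip["out_point"] = 172
clip["keywords"] = "ext, day, wide"
```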

This setup with 3 duplicated windows and User Preferences > Interface > Global Scene unchecked actually allows you to do 3-point editing from the "same" UI, without changing scenes. The only really annoying part is the big fat edges of the windows, so the request would come down to a windowless mode… :slight_smile:

This way the sequencers play local sequences and the Timeline controls also operate locally, so it is actually possible to control "source" and "record" individually. Easy Logging provides the in/out functions, etc.
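For reference, the same setup can be scripted with the 2.7x-era API (property names may differ in newer versions; the scene name is a placeholder):

```python
import bpy

# Turn off User Preferences > Interface > Global Scene (2.7x API).
bpy.context.user_preferences.view.use_global_scene = False

# With Global Scene off, each screen/window keeps its own scene, so a
# "Source" window and a "Record" window can play back independently.
bpy.context.screen.scene = bpy.data.scenes["Source"]
```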


(One small annoyance is that Blender does not save the positions and sizes of the windows correctly on Windows)

You'll hate this, but I found the same thing about the window layout years ago. Because it relies on the user moving floating windows around, I never pursued it.

For me this 3-window layout is a proof of concept of a working 3-point editing solution, without having to switch the UI/scene back and forth between source and project scenes. The Movie Clip Editor delivers much slower playback than the Sequencer video playback, and the latter is also better integrated with the Sequencer workflow.

So if a scene selector combo box were added to the Sequencer, the Timeline controls could be linked to individual Sequencer windows (local playback), and drag-and-drop were added from the File Browser to the Sequencer preview, then I believe the rest of the functions could be added by cherry-picking from all the VSE scripts around (including ripple functions).
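To illustrate the cherry-picking idea, here is a hypothetical "insert" helper, assuming two scenes named "Source" and "Record" and a movie strip set as the active strip in the source sequencer (all names are placeholders):

```python
import bpy

def insert_from_source(source_name="Source", record_name="Record", channel=1):
    """Copy the trimmed active source strip into the record sequencer
    at the record scene's playhead, keeping the source in/out trim."""
    src_scene = bpy.data.scenes[source_name]
    rec_scene = bpy.data.scenes[record_name]
    if rec_scene.sequence_editor is None:
        rec_scene.sequence_editor_create()

    src = src_scene.sequence_editor.active_strip   # the trimmed source clip
    new = rec_scene.sequence_editor.sequences.new_movie(
        name=src.name,
        filepath=src.filepath,
        channel=channel,
        frame_start=rec_scene.frame_current,
    )
    # Copy the in/out trim from the source strip.
    new.frame_offset_start = src.frame_offset_start
    new.frame_final_duration = src.frame_final_duration
    return new
```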

I think it's wise; there are already some open-source editing projects like Shotcut or Kdenlive that are far more advanced for video editing than the VSE. The good thing about the VSE is that it helps a lot with CG animation projects, but not so much as a true editing application.

If I had to make a big movie mixing video and CG, I might use other software for editing and use the VSE only for EDL import, shot management, review, etc. Apart from 3-point editing, there are many features missing before the VSE is complete for editing.

Devs are already struggling with making Blender a good 3D application; they can't make every module (VSE, Text Editor, Compositor) as good as a dedicated application. Take the Compositor: it's already very powerful and very useful as it is, but a dedicated application like Natron is far more efficient on bigger projects, when you have time to switch between applications.

It's very interesting to see how you set up storyboards, unrendered and rendered side by side. And by using the channel function, all of them play in sync with the audio. Looks like a clever setup.

I guess you do most of your timing when editing the storyboard (in sync with audio), or do you use the unrendered scenes for that?
Are you using unrendered strips linked to scenes for the middle screen, or are they rendered and inserted as video strips?
What is your workflow?
What tools/functions are you missing to ease your workflow?

Hey, thanks!
Indeed, most of the editing decisions are made at the storyboarding stage (done in Storyboard Pro); we then just extend or shorten strips a bit at the end when animation is done.
Linked scenes can be useful when setting up characters and cameras in the shot, or when we want to depart from the storyboard and try different things without intermediate rendering, but that's not a common case.

I'm currently reworking the whole thing to be more efficient and to avoid being forced to link scenes into the edit. That was useful at the time to be able to open the shots from the edit easily, but it also forced us to link every asset into the edit file, which increased loading times.

Now each shot is encapsulated in a metastrip, and it's possible to switch between full render, OpenGL or playblast render. The filepath to the shot .blend is a custom property of the metastrip, so there is no need to link the scene into the edit anymore. Also, everything can be rendered from the edit file, locally or on the render farm (CGRU/Afanasy).
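If it helps anyone, this is roughly what the custom-property approach looks like in Python; the property name and path are just examples, not our actual pipeline code:

```python
import bpy

# Store the shot .blend path as a custom property on the metastrip
# instead of linking the shot scene into the edit file.
edit_scene = bpy.context.scene
meta = edit_scene.sequence_editor.active_strip   # assumed to be a metastrip
meta["shot_filepath"] = "//shots/sh010/sh010.blend"

# Later, a render or "open shot" operator can read it back:
shot_path = bpy.path.abspath(meta["shot_filepath"])
print("Shot file:", shot_path)
```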

About missing tools: with Python, everything we need is quite doable, it just takes time to add it. I'd like to find a good workflow to also do storyboarding inside Blender. It's quite similar to what we have now for setting up shots, but when storyboarding you want to try many things, and that creates a lot of shots that aren't needed in the end. So it may have to work a bit differently.

What I found would be great to add, but can't be done in Python, is more on the color grading side. Maybe a few more blending modes for strips: having screen, overlay, dodge or others would be handy alongside add and multiply.
I'm using the wipe effect to create some vignetting or to add some color blends to shots in the grading/finishing stage. It's not perfect, but it works; with a few more blending modes it would be awesome.

Sadly, when adding adjustment layers to color correct, or for these kinds of vignetting effects, the sequencer playback isn't realtime anymore, even when using proxies and only one color modifier. It would be awesome to improve playback speed under these circumstances, but I guess it may become too much work for software that's supposed to do 3D in the first place. And from what I've heard, the first VSE implementation was made in a hurry without a strong design in mind, which is why even small things can be hard to add even now.

Having more strip modifiers and fewer effect strips would be another good thing to do, as right now everything is a bit mixed between effect strips, the filter panel and modifiers. There are a few things like this that are not blocking anyone, but that would push image manipulation a bit further while staying on the realtime side, as opposed to the compositor, which is meant for heavy image manipulation.

For me the VSE is a great tool for 3D animation and is quite complete (as long as you can do a bit of scripting to automate things). On these low-budget / small-team projects we are using only Blender, and we don't lose time switching between applications.
Maybe under different circumstances, like a feature-length movie or a VFX movie, it would be easier to use a true editing application and use the VSE only for sequence management or quick review. But that's another scale of project, so having more dedicated applications is logical.

If I need to do pure video editing, I prefer using other software like Shotcut or Kdenlive; they are better suited for that task (at least you can play/pause with the spacebar instead of Alt-A). They've got shot management, fast playback and all the tools needed for editing. The VSE, on the other hand, has a few unique features that make 3D animation a breeze. For me they are two different tools with different purposes.

Sorry for the long post, I can't stop myself when I'm on a cool subject. I think I should go to sleep now :smiley:


Thank you for sharing your insights. Lots of good info here.

When you're thinking about doing the storyboards in Blender, are you thinking Grease Pencil or 3D previz?
If you're thinking 3D previz, would you add scene/camera strips? In that case, I read that it is not advised to use the same scene for the Sequencer as the scene you use as a scene strip, so this is another case where the option to let the Sequencer/3D View access scenes outside the current scene could be useful, if you want the edit cameras in the same UI as the Sequencer, that is.

This way the 3D View could work as a "source monitor", locally linked to a Timeline window with in/out markers and an insert/overwrite-into-sequencer button.

On the missing color grading options, maybe exporting an EDL and conforming the edit in Resolve could be a workflow?

For a vignette in the VSE I just use a mask; they are very fast to render in the VSE preview. Also, I don't get bad frame dropping when there are many modifier layers. I am surprised to read that you are having performance issues with proxy playback; that should not be the case.
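One way to set that up through the strip modifiers, assuming a mask data-block called "VignetteMask" has already been drawn (the strip and mask names are placeholders):

```python
import bpy

# Attach a Mask modifier to the active strip and point it at an
# existing mask data-block to get a vignette.
strip = bpy.context.scene.sequence_editor.active_strip
mod = strip.modifiers.new(name="Vignette", type='MASK')
mod.input_mask_type = 'ID'
mod.input_mask_id = bpy.data.masks.get("VignetteMask")
```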

A digression: now that Storyboard Pro has been mentioned, I just noticed some people are working on storyboarding software here (it's very young and many things implemented in the UI are missing their functions): https://github.com/wonderunit/storyboarder