I’m planning to implement (and hopefully improve on) an algorithm for dynamically rescaling music from within a video editor. It will be based on the paper “Scalable Music: Automatic Music Retargeting and Synthesis”, which you can download, along with a demo video, here: http://graphics.ethz.ch/publications/papers/paperWen13.php
The plugin needs to provide the following functionality to the user while editing:
Select different parts of a video/song which should be cut out
Select different parts of a video which should stay synchronized with the music
Display an automatically generated segmentation of the music, possibly by means of colouring the waveform
(Optionally also edit the segmentation manually)
Create a manual segmentation for a video/song
A few other settings, which should be no big problem
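At its core, the paper's retargeting step treats beats as nodes in a graph, adds jump edges between acoustically similar beats, and searches for a beat path whose total length matches a target duration. Below is a minimal, hypothetical sketch of that idea (not the paper's actual algorithm), assuming equal-length beats and a precomputed table of allowed jumps:

```python
# Hypothetical sketch of the retargeting core: BFS over (beat, steps)
# states to find a beat path of a given target length. Beat indices
# and the `jumps` table are placeholders for real analysis output.
from collections import deque

def retarget(num_beats, jumps, target_len):
    """Return a beat-index path of length target_len that starts at
    beat 0 and ends at the last beat, or None if impossible.
    `jumps` maps a beat index to extra beats it may transition to
    (in addition to its natural successor)."""
    start = (0, 1)          # (current beat, beats played so far)
    prev = {start: None}    # predecessor map for path reconstruction
    queue = deque([start])
    while queue:
        beat, steps = queue.popleft()
        if beat == num_beats - 1 and steps == target_len:
            path, state = [], (beat, steps)
            while state is not None:
                path.append(state[0])
                state = prev[state]
            return path[::-1]
        if steps >= target_len:
            continue
        for nxt in [beat + 1] + jumps.get(beat, []):
            if 0 <= nxt < num_beats and (nxt, steps + 1) not in prev:
                prev[(nxt, steps + 1)] = (beat, steps)
                queue.append((nxt, steps + 1))
    return None

# Example: 8 beats, a jump from beat 4 back to beat 3 lets the music
# loop once to stretch from 8 beats to 10.
print(retarget(8, {4: [3]}, 10))
```

The real system additionally weighs jump edges by acoustic similarity and respects the synchronization constraints above, but the graph-search structure is the same.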
After all the settings are done, the plugin should be able to return the computed music signal and somehow import it into the project.
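For the final import step, Blender's Python API does let an add-on create a sound strip in the Video Sequence Editor directly. A minimal sketch, assuming the retargeted signal has been written to an audio file (the function name and output path are placeholders):

```python
# Hypothetical sketch: importing the computed music signal back into
# Blender's Video Sequence Editor as a sound strip.
def import_retargeted_audio(filepath, channel=1, frame_start=1):
    """Add the rendered audio file as a sound strip in the VSE."""
    import bpy  # Blender's Python API; only available inside Blender
    scene = bpy.context.scene
    if scene.sequence_editor is None:
        scene.sequence_editor_create()
    return scene.sequence_editor.sequences.new_sound(
        name="RetargetedMusic",
        filepath=filepath,
        channel=channel,
        frame_start=frame_start,
    )
```

`sequences.new_sound()` returns the new strip, so the add-on could then set its frame range or channel to match the edit.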
Can a plugin like this be implemented for Blender, i.e. does the Python API provide enough functionality for all this interactivity?
Advice would be much appreciated!