Music Video Animation System

Hi,

today I want to share with you the very first real test video from my Animation System, which I have been working on for a couple of years.
I think I was around 11 or 12 when I discovered Animusic. I was fascinated by their idea of self-playing instruments and wanted to make videos like that myself and visualize my own music. The idea stuck with me over the years; when I discovered Blender I tried to realize it, without much success. I got very excited when I found a script called MIDI-Driver for Blender 2.49b. It allowed me to load MIDI files, which contain the data needed to trigger animations based on the played notes, and place the animations in the timeline. I played with it for a while, but with the release of Blender 2.5 and the change of the core structure the script unfortunately didn't work anymore. There were a few workarounds to do the animation in 2.49b and then import it into 2.5 and above.
But it was cumbersome, and being 14 at the time and sometimes pretty lazy, I didn't pursue it any further.

But the idea remained in my head over the years, and when I was introduced to Unreal Engine a couple of years ago, I decided to try to develop a system that would let me create these videos inside the engine.

I would do the modeling, texturing and animations of the instruments in Blender and trigger the respective animation for each note when it plays inside Unreal. The raw data was already stored in my MIDI files: which note plays (note number), how strongly or softly it is played (velocity), when it plays and how long it lasts. So in theory it seemed simple, but as I didn't have any real programming experience it very quickly became very hard.
I had a lot of setbacks until a friend of mine, who is a software developer, helped me and wrote the essential parts for processing the data in the engine. With the help of a lot of YouTube tutorials I finally built a combination of my music software and Unreal Engine: some parts are triggered in real time from my music software, and part of the data is processed inside the engine.
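
Just to give an idea of the raw data I mentioned: a note event basically boils down to a note number, a velocity, a start time and a duration. Here is a rough Python sketch of how something like that can be pulled out of a MIDI file with the third-party mido library (not my actual code, and the file name is only an example):

```python
# Rough illustration only (not my actual code): reading the raw note data
# from a MIDI file with the third-party "mido" library.
import mido

def extract_notes(path):
    notes = []        # each entry: note number, velocity, start time, duration
    open_notes = {}   # note number -> (start time, velocity) of a note still sounding
    now = 0.0
    for msg in mido.MidiFile(path):   # iterating yields messages with delta times in seconds
        now += msg.time
        if msg.type == 'note_on' and msg.velocity > 0:
            open_notes[msg.note] = (now, msg.velocity)
        elif msg.type in ('note_off', 'note_on'):   # note_on with velocity 0 also ends a note
            if msg.note in open_notes:
                start, velocity = open_notes.pop(msg.note)
                notes.append({'note': msg.note, 'velocity': velocity,
                              'start': start, 'duration': now - start})
    return notes

# e.g. notes = extract_notes('drums.mid')
```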

If anyone is interested in how it works, I am very happy to explain it in more detail.

So last week it was finally there: my very first video!
I am very happy to have finally achieved my goal and I want to share it with you now.

A few still images:



The models and the animations of the drum sticks and of the drums when they are hit were made in Blender; the triggering and rendering are done in Unreal Engine. I used Megascans textures, which fortunately work quite well without having to manually UV unwrap the models. For future projects I will still do the unwrapping, but for this first test video the automatically created UV maps looked good enough, so I left them as they are.

There are already new ideas in my head and I am very excited to create new videos. I'll keep you posted as soon as there is something new to show. Until then enjoy the video and have a wonderful week!

Happy blendering!


So the sound/animation of the drums is triggered when hit by a stick, or does the stick follow the "music"? I guess I'm just wondering what the catalyst is… what is the master behind the puppet?


Hi

The master behind the puppet is my music software, Ableton.

There, all the trigger messages for every instrument component (Stick Left, Stick Right, Drum 1, etc.) are recorded for the full song. I then play the song inside Ableton, and the trigger messages are sent to Unreal Engine during playback.
Inside the engine the incoming trigger messages are sorted and then sent to the individual instrument components, so each stick and every drum gets its own trigger messages every time its animation has to play.
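
To give a rough idea of that sorting step (this is only a simplified Python sketch, not the actual engine-side code, and the component names are just examples), it basically amounts to a lookup from the incoming message to the matching component:

```python
# Simplified illustration of the sorting step (not the actual engine code).
# Component names such as "StickLeft" are just examples.

class InstrumentComponent:
    def __init__(self, name):
        self.name = name

    def trigger(self, velocity):
        # In the real system this would start the matching animation.
        print(f"{self.name}: play hit animation (velocity {velocity})")

components = {
    'StickLeft':  InstrumentComponent('StickLeft'),
    'StickRight': InstrumentComponent('StickRight'),
    'Drum1':      InstrumentComponent('Drum1'),
}

def handle_trigger(message):
    # message is something like {'target': 'Drum1', 'velocity': 96}
    component = components.get(message['target'])
    if component:
        component.trigger(message['velocity'])
```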

The sticks have some additional data that is generated beforehand and loaded into the engine to control their movement; it is processed when a trigger message arrives.

The song is rendered as a complete audio file beforehand and just plays along, so no audio is generated in real time inside the engine.

Thank you very much for your interest; I hope that answers your question.

the one issue with this is that the movement of, say, a drumstick starts .before. the midi note, and that lead time is not always the same… this obviously goes for every instrument. it takes longer for a drummer to strike the floor tom than the snare. of course a human adjusts for this to stay in the pocket, but that doesn't work when you just feed a midi stream to (live) software.

a script .could. be written to preparse a midi file, and figure out the start times for whatever animation is needed, perhaps even using alternate channels within the midi that would be fed to Unreal's animation things, leaving the original midi/timing for the audio path. a 'smart' script could even adjust the timing based on the prior note(s) (moving from floor tom to snare, as opposed to repeated strikes on the floor tom)
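
something along these lines is what i mean (just a rough python sketch, the drum names, travel times and strike time are all made up):

```python
# rough sketch of the preparse idea (all numbers and names are made up).
STRIKE_TIME = 0.10   # the up/down strike motion itself, seconds

# how long the stick needs to travel from one drum to another, seconds
TRAVEL = {
    ('floor_tom', 'snare'): 0.25,
    ('snare', 'floor_tom'): 0.25,
    ('snare', 'snare'): 0.0,
    ('floor_tom', 'floor_tom'): 0.0,
}

def animation_start_times(hits):
    """hits: list of (midi_note_time, drum) in playing order."""
    starts = []
    prev_drum = None
    for note_time, drum in hits:
        travel = TRAVEL.get((prev_drum, drum), 0.0) if prev_drum else 0.0
        # start early enough for the cross-over plus the strike itself,
        # so the actual hit still lands on the midi note time
        starts.append((note_time - travel - STRIKE_TIME, drum))
        prev_drum = drum
    return starts

# e.g. animation_start_times([(0.0, 'floor_tom'), (0.5, 'snare'), (1.0, 'snare')])
```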



Hi,

that sounds very interesting and definitely something I can think about for future updates of the system.
If you have the time, could you maybe go a little further into what you mean?

Right now a script precalculates the movement of the sticks so that a stick always arrives at the drum just before the "Hit" animation reaches its lowest point, where the stick hits the drum. The script looks at how far apart the next note is from the current one and calculates the time the stick has available to move from its current position to the next one. I call that the preposition time. It then places trigger messages into the timeline that start the movement at exactly those points in the song where the stick has just enough time, using the preposition time, to arrive at the next position.
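
Very roughly simplified, the precalculation does something like this (a Python sketch of the idea only, not the actual script; the hit lead time and the maximum movement duration are just example values):

```python
# Simplified sketch of the preposition-time idea (not the actual script).
HIT_LEAD = 0.08   # example: time from the start of the "Hit" animation to its lowest point
MAX_MOVE = 1.00   # example: longest a repositioning movement is allowed to take

def place_movement_triggers(note_times):
    """note_times: times (in seconds) at which the notes/hits happen, in order.
    Returns (trigger_time, preposition_time) pairs: when to start moving the
    stick towards the next position and how long that movement may take."""
    triggers = []
    for current, nxt in zip(note_times, note_times[1:]):
        # time the stick has available to move to the next position
        preposition_time = max(0.0, min(nxt - current - HIT_LEAD, MAX_MOVE))
        # start the movement just late enough that the stick arrives right
        # before the hit animation reaches its lowest point
        trigger_time = nxt - HIT_LEAD - preposition_time
        triggers.append((trigger_time, preposition_time))
    return triggers

# e.g. place_movement_triggers([0.0, 0.5, 1.0, 2.5])
```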

So, just to make sure I understand what you mean: let's say there are multiple points during the song where I move from the left drum to the center drum and have 2 seconds to move from one drum to the other.
Do you mean that the preposition time would not always be the same but would vary very slightly? Like one time it would be 2.01 seconds and another time 1.99?

Or do you mean something different?

hmm… okay, so you are using the midi note as the .start. of the animation. what is the animation time it takes between two left drum hits? let's call that X.
the time between two center drum hits is probably also X or close to it. same with two right drum hits.
but a left hit then a center hit should be .longer. than X, because not only do you have the up/down strike motion, you also have the cross-over time to the other drum. while you can vary speed to make up for that (and have to with very rapid notes), a human also takes advantage of other movements, like twisting the arm, or hitting the last L drum off center, to prepare for the C or even R drum hit. there's also a pause before moving the stick if the beats are slow enough. but if you watch a drum solo, where they ramp up speed, you'll see some interesting movements in order to keep hitting in time. :slight_smile:

now, if we're playing 16th notes in a L L C L C C R L (looping) pattern… does the time to get between each drum and hit the drum always add up to the same number? or are you using a pre-hit 'pause' to adjust for that, etc.? the objective is that the 'hit' itself should be 'on time' with the midi, or >consistently< delayed so you can delay the audio the same amount to line up again.

what you mentioned (the 1.99 one time, 2.01 another) would be .in addition. to the above, as that "humanizes" the data (people are not machines, and do not play in perfect time… even our greatest drummers). if your midi is already "humanized" then you wouldn't need to add that into your animation too. Ableton does have some humanizing settings in it; if those are turned on, don't worry about that until later. the Animusic videos you referred to were not humanized, that music was created on the computer, and the midi was left with strict timing, but the timing of the original midi → animation was something they had to work on as well. :slight_smile: