Sound Bake & MIDI Sync Animations

All done with Animation Nodes and a few base meshes, with Instances:

I am now waiting for a MIDI Node that would read a MIDI file, create a series of controls with F-Curves derived from the MIDI notes and tracks so I can play instruments. I can do this with the old Blender 2.49 script, but this would be a great addition to AN.

Here’s a picture of two keyboards playing chords and melody using this method:

I may render this, it’s the first guitar solo from Comfortably Numb - I played this on my TMK88 connected through an iRig into my iMac and input into Reason DAW.

Cheers, Clock. :stuck_out_tongue:

Update! :wink:

I have now written a Python script :eek: that reads a MIDI CSV File - yes for now I have to convert the file to CSV format and there is a very nice little programme on the 'net to do that…

The script examines the MIDI file and creates an empty for each note played in each track, and keyframes the movement of the empty in line with when the script sees “note on” and “note off” events in the MIDI file. I have tested it quite a lot, but it is still a prototype and really needs to be built into AN, or the UI, so the user can select the CSV file from their machine. I am currently testing my AN MIDI node tree on this new script’s output to make sure it all hangs together.
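A minimal sketch of the pairing idea, assuming a midicsv-style line layout (Track, Time in ticks, Type, Channel, Note, Velocity) and a fixed tick-to-frame factor - illustrative only, not the actual script:

```python
def pair_notes(lines, frames_per_tick=0.025):
    """Pair Note_on/Note_off events per (track, note) into
    [on_frame, off_frame] spans. A Note_on with velocity 0
    counts as a Note_off; 0.025 frames/tick corresponds to
    480 PPQN at 120 BPM and 24 fps."""
    spans = {}     # (track, note) -> list of [on_frame, off_frame]
    pending = {}   # (track, note) -> index of the still-open span
    for line in lines:
        f = [x.strip() for x in line.split(",")]
        if len(f) < 6 or f[2] not in ("Note_on_c", "Note_off_c"):
            continue
        key = (int(f[0]), int(f[4]))          # (track, note number)
        frame = int(f[1]) * frames_per_tick   # fractional frames are fine
        if f[2] == "Note_on_c" and int(f[5]) > 0:
            spans.setdefault(key, []).append([frame, None])
            pending[key] = len(spans[key]) - 1
        elif key in pending:
            spans[key][pending.pop(key)][1] = frame
    return spans
```

In Blender, each key of the result would then get its own empty, keyframed at the on and off frames with `keyframe_insert()`.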

If anyone can point me in the right direction to build it into the UI I should be most grateful! :eyebrowlift:

Here’s a screenshot of the control empties from Procol Harum’s “Homburg”, which I played on my TMK88 through blah, blah, blah. The selected one is the Hi-Hat (I think…). :eyebrowlift2:

and a bit of the MIDI CSV File:

Cheers, Clock.:stuck_out_tongue:

Awesome work @clockmender I’m looking forward to see where this goes and hope that some musically inclined Blender artists join your efforts to bring a midi-sync signal to generative animation!!!

Do you have the Python script that converts the MIDI into CSV? I want to give it a try with Sonic Pi 3 and AN!

I think it is written in C++? You can find it here:

It works well for me and produces a clean file that Python can read easily and quickly - being a CSV file, you can easily see the structure of the events in the MIDI file. Writing a script to read the raw MIDI file directly would probably be a huge pain in the ass, whereas reading the CSV file in Python is easy, with not very much code. My script read a multi-channel CSV file and made and keyframed 102 separate control empties, total song length 6350 frames at 24fps, in under 2 seconds…

Cheers, Clock. :eyebrowlift:


@JWise - thanks for the kind comments sir! - hopefully there will be enough interest to get this project really moving.


OK, so I have now got the script to:

  1. Read and action the note velocity as well as just note-on, note-off.

  2. Read and action the Header information: Tempo, Channel Names and PPQN (Pulses Per Quarter-Note). It makes up channel names if they do not have a channel “Title”.

  3. Calculate Beats Per Minute (BPM) and bar length.

  4. Have a variable, user-defined “easing” to slope the F-Curves so the piano keys (for example) don’t just bang down.

  5. Take care of the situation where a note is played immediately after another - so the timings for Note-off[event-1] and Note-on[event-2] may be the same - this caused Blender to mess up the F-Curves, so I sorted that out…

  6. Create the Group names for the controls, so they can be fed straight into the AN Node Tree.

  7. Set the frames precisely so the end of the MIDI sequence is exactly in line with the Sound - well, to the nearest 0.0001 of a frame anyway.
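A minimal sketch of how the “easing” in point 4 might shape each key’s F-Curve (the 0-to-1 press value and the function name are illustrative assumptions, not the actual script):

```python
def eased_points(on_frame, off_frame, easing=1.5):
    """F-Curve points (frame, value) for one key press:
    0.0 = key at rest, 1.0 = fully pressed. 'easing' (in frames)
    slopes the attack and release so the key doesn't just bang
    down and snap back up."""
    return [
        (on_frame - easing, 0.0),   # start moving just before note-on
        (on_frame, 1.0),            # fully down on note-on
        (off_frame, 1.0),           # held until note-off
        (off_frame + easing, 0.0),  # eased release
    ]
```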

I am going to keep testing with ever more complex MIDI files to see if I can break it. The next step is to vary the tempo (and therefore BPM) throughout the MIDI file and action this in the animation. The PPQN won’t change, but the pulse time per quarter-note will (130 bpm = 461538 micro-seconds per quarter-note, as a “for instance”). Any tempo changes are stored in the MIDI file where they occur, so I should be able to read these and adjust the frame translation from the MIDI pulse value.
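The tick-to-frame translation with tempo changes could be sketched like this (hypothetical code, assuming the tempo map is read out of the file as a sorted list of (tick, micro-seconds-per-quarter-note) pairs):

```python
def bpm(tempo_us):
    """MIDI tempo is micro-seconds per quarter-note,
    so BPM = 60,000,000 / tempo (461538 us -> ~130 BPM)."""
    return 60_000_000 / tempo_us

def ticks_to_frame(tick, tempo_map, ppqn, fps=24):
    """Sum the seconds elapsed in each tempo segment up to 'tick',
    then convert to a (possibly fractional) frame number."""
    seconds = 0.0
    for i, (start, tempo_us) in enumerate(tempo_map):
        end = tempo_map[i + 1][0] if i + 1 < len(tempo_map) else tick
        span = min(tick, end) - start
        if span <= 0:
            break
        seconds += span * tempo_us / 1_000_000 / ppqn
    return seconds * fps
```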

Cheers, Clock. :stuck_out_tongue:

Update: :stuck_out_tongue:

Oops - YouTube screwed up my video…

I’ll try again later.

Cheers, Clock.

OK, here is a video for review:

Let me know what you think of this please. :eyebrowlift:

Cheers, Clock. :stuck_out_tongue:

Here’s an example of a set of controls built by the script:

I have changed the “Empty Draw Type” to “Single Arrow” and checked “Display Name” for each empty (all done in the script). You should also be able to see the Group Name - “Channel_2” - for the selected empty. The MIDI file did not have Channel Names, so I add these in the script using “Channel_” plus the MIDI channel number; otherwise the group takes its name from the MIDI file.

The top set are the drums; the “missing” lines are controllers that are not needed to animate the keys. You can also see a section of the F-Curve for the selected control - this has near-vertical sides as the notes are very short in the MIDI file, particularly for the drums. Longer notes mean I can use a larger “easing” value to slope the sides of the F-Curve a little more. You can also see that the keyframes are not on integer frame numbers - you can do this in Python, but not in the UI, or should I say I haven’t found a way to do it in the UI yet…

So now I have got all the various bits working - time for some serious testing on many MIDI files to try and break it. :eek:

Hope all this makes sense. I am going to try to build the script into an Animation Node, with Input boxes for all the parameters in the script - this I have never done before so it may take some time. :spin:

Cheers, Clock. :stuck_out_tongue:

I am really struggling to make a new node with my script in it…

I think it is going to take a while to get my head around what is allowed and what causes Blender to kick AN into touch.

Has anyone got any pointers, has anyone done this before? Help gratefully received.

Cheers, Clock.


I have made a Node that creates controls with F-Curves derived from a CSV-format MIDI file - I am very pleased with myself. :D:D:D

I will continue testing and error trapping before I declare it “Fit for peer testing”, thence onto “Fit for purpose”. :eyebrowlift:

Cheers, Clock. :stuck_out_tongue:

I think I said some time ago that you are doing wizardry here and I can only repeat that :slight_smile: Awesome advanced stuff. Keep it up!

Why thank you - I am building a new node to calculate the indices as a one-off operation… may take some time as I haven’t a clue what I am doing.

Cheers, Clock.

OK, so now with a great deal of help from Jacques I have made this new node. To get it to only execute once in the project, I have put it in a new node tree and used Group Nodes to pass the index file from tree to tree:

And the animation Node Tree (or part thereof):

Cheers, Clock. :smiley:

So here is my latest project: :evilgrin::evilgrin:

And a closer view:

They all “gesture” to the tune “The Entertainer” by Scott Joplin, each puppet represents a note on the piano and they are arranged in Octave Sets rising from the lower tier, note C on the right as they stand, B on the left. I must apologise to the late Mr. Joplin for this project! :yes:

I will render it to a video, once I can stop laughing, and transfer it to my server for processing. :eyebrowlift:

Cheers, Clock. :stuck_out_tongue:

PS. Seriously - I can now animate almost anything using my new MIDI Nodes.

So now I have revised nodes and get a much better performance: :eyebrowlift:

Two different tunes in the same file, two setup groups and one execute group. Revised nodes work well, although I am currently working on a way to get rid of the control objects and store F-Curves in a new Class - wish me luck, this stuff isn’t easy! :no:

Once happy with these nodes I will put them on GitHub, for peer testing, if you are interested that is :stuck_out_tongue:

Cheers, Clock.


I forgot to say this node tree runs at 1.2ms - 149 notes played over 5760 frames…

Here are three manuals playing Toccata & Fugue: :eyebrowlift:

So why have I bothered to show this? You may well ask, and the answer is because this piece has hundreds of tempo changes and several key signature changes in it. I have made some experimental code, which I am not prepared to post on GitHub yet, that takes all these tempo changes into account. There are some errors that I still need to iron out, but I am getting there! The piece is 12 mins 52 secs long and has 111 notes played, including the pedals - I need to make some pedal meshes when I have some time; right now I need sleep. :eek:

Cheers, Clock.

PS. Note the node tree execute time of just a smidge over 2ms…


Here’s just a small section from the MIDI files of all the tempo changes:

You can tell I am happy with life and making good progress, because I have gone silly again! :stuck_out_tongue:

Here is my Erkinator (Cross between an “Erk” and a Terminator - long story). He has featured in previous projects, generally up to no good, or playing guitar, playing drums, or being rude, etc. He is watching the organ play Thomas Arne’s Gavotte, well one of them. I have used my new MIDI nodes to achieve this.

Anyway, I am going to “cook” the animation and post it online over the next few days. The piece is only 2 mins long and I played it into Reason - I now have version 10, sooooo good. My Mk1 MIDI Sync system created the necessary controls to drive the three manuals (“Swell”, “Great” and “Choir”) and the pedals. There are 75 different notes played across the four channels and it can be played with just two hands and two feet.

Here are some of the Nodes:

My next challenge is to get Erkinator to play it - this may take some time and considerable brain power. :yes:

Cheers, Clock. :cool:


If you are wondering why there are no “Stops” on the organ - it’s because I had to pull all the stops out to get this finished on time… sorry, old English joke!

Latest Test Video - Erki is enjoying some “leaves” :eek: whilst watching the organ play itself, next major goal is to get him playing this tune:

This is a good test for my MIDI Nodes - very fast play rate and multiple channels. Thanks to Jacques for the Sound Bake Node that was used to make the columns on top of the organ. :eyebrowlift: Oh, and thanks to Thomas Arne for the tune.

Cheers, Clock. :smiley:

More Progress,

So now I have the alternative method working correctly - this method uses a new Blender CollectionProperty Class to store F-Curves in a list that the AN node tree can use to animate objects.

This test piece uses the Curves to rotate the keys and also to raise the string dampers. I have introduced a new factor - the “Spacing” - that my node uses to space out consecutive notes so they look like they were played twice. Yep, if I had two consecutive notes of the same value (C4 for example) in the MIDI file, the key was not released in the animation between them, so now I can separate them by a number of frames.
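A rough sketch of how such a “Spacing” pass might work on one key’s list of [on_frame, off_frame] spans (hypothetical names, not the node’s actual code):

```python
def apply_spacing(spans, spacing=2.0):
    """Where a note's off-frame meets (or overlaps) the next
    note-on of the same key, pull the off-frame back by 'spacing'
    frames so the key visibly releases before the repeat."""
    spans = sorted(spans)
    for prev, nxt in zip(spans, spans[1:]):
        if nxt[0] - prev[1] < spacing:
            prev[1] = nxt[0] - spacing
    return spans
```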

I am now happy both systems are working; I just need to do some more testing and bug checking. :yes:

I have also realised that I can use this system to animate things not necessarily tied to music. So I could create a MIDI file from Reason where I press a key every time I want an object to tilt, for example, then use my nodes to animate it rather than keyframe what could be hundreds of sequences - I just “play” the sequence on my keyboard and my nodes do all the animation in one quick process… Now there is a thought for complex animation sequences. :stuck_out_tongue:

Cheers, Clock. :smiley:


Here’s the new node:

And the animation nodes: