We're In The MIDI & Audio & DAW Real-Time!

Well, hello again to the WIP section and Happy New Year! I suspect some of you have been wondering what I have been up to lately, or you may just be pleased I have been quiet for so long. :rofl:

I, along with my new French friend (yes, we have both got over Hastings and Waterloo :crossed_swords: and are good friends across the Channel now), have been working on real-time MIDI in Blender. I thought I would share where we are just now, so here are some images and some explanatory waffle:

This one shows a strange spiky thing, which turns one way if you press key B7 and the other way if you press C8 - not at all thrilling really, but it all happens in real time from my connected MIDI keyboard. The keys on the virtual keyboard change colour as you play them, using my Material Nodes in AN.

This one shows the latest real-time system in operation, driving my virtual keyboard from my real one. We have added a function to automatically create the keyboards in either 88- or 61-key flavours. We have also added a system to rename a set of meshes so they match your keyboard, making key assignment dead easy.

This one shows some of the new nodes driving a virtual keyboard from a MIDI file. I have rebuilt my system from a little while ago, so it is easier to use, more understandable and more efficient.

This one shows the full node tree from above, with the two methods of reading MIDI files, i.e. controls or baked curves. Both now run much better than before - not that they were crap, you understand, just that they could be improved.

This one shows a virtual drum kit being played by a real keyboard; this is rendered. We hope to be able to run this system in 2.8, with EEVEE, once it is more stable - “it” being Blender 2.8…

This one proves I am not b*********g and can really do this stuff. That’s Blender and Reason being driven simultaneously with one keyboard and one iRig interface. :crazy_face:

The new system is all written in Animation Nodes and comprises 10 new nodes at the moment; this may increase as we add more functionality - we have our sights set on real-time audio as well. :scream:

So, I shall keep you updated as progress is made. The system is not fit to be made available yet, but we hope to have it on my Old-Git-Hub at some point later on. Also, just now you have to install Python modules and other stuff into Blender’s Python directory and other places, all by hand, and it’s a pain in the rectum to do this. We hope to have a complete install script at some stage. :woozy_face:

Well, that is why I have been so quiet of late - so much work here and many things I just did not understand at my old age, but I am getting there with all this new Python Modules Black Magic stuff. :mage:

Cheers, Clock. :beers:

14 Likes

Some progress: we believe we now have a system that will read multiple MIDI interfaces; we are just testing that to destruction as we speak. We have also started work on the Audio System, which will use audio input to animate in real time. A trial system is working in France; I hope to have it tested soon on my iMac.

Being slightly weird, I thought to myself: “Maybe I can animate a mechanical device using my MIDI keyboard”, so I did (maybe I should try with one of my human characters as well):

The digger arms are controlled by 6 keys - C5, C#5, D5, D#5, E5 and F5 - to raise, advance, rotate, etc. the bucket, driving through an IK chain with my own Bone Nodes (if I can say that without you all tittering like small schoolboys, or schoolgirls :rofl:). Here is the node tree:

You will note that I can press multiple keys to achieve compound movements, and the harder I press the keys, the faster the bucket moves; this uses the Velocity value from the MIDI input.
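To give a flavour of the velocity part, here is a minimal sketch of the idea (not the actual node code - the function and the numbers are made up for illustration; MIDI velocity runs 0-127):

```python
# Sketch only: scale a per-frame rotation step by MIDI note-on velocity,
# so harder key presses move the bucket faster.
def bucket_step(velocity, base_step=0.02):
    """Rotation increment (radians per frame) for one control key.

    velocity  -- MIDI note-on velocity, 0-127
    base_step -- step at full velocity (a hypothetical tuning value)
    """
    return base_step * (velocity / 127.0)
```

Pressing several keys at once just means several of these steps get applied in the same frame, which is where the compound movements come from.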

We have also got the MIDI Sustain Pedal and Modulator inputs working, so they persist as you play notes, etc. This meant we had to make a new node to store the parameters from the MIDI Control Channels; we may need to add further options later, but it’s a good start:

So here the blue curly bit “grows” with the Modulation Wheel input and the green cube rotates as the Sustain Pedal is operated by my foot. :foot: The two text items are fed with the values from the Parameters node. The yellow curly bits grow as the notes in the 5th octave are played; this uses Bezier curves with Bevel Objects, and the End Value is animated by the MIDI velocity of each note played.
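For the curious, the Parameters node boils down to something like this sketch (not the node’s exact code; the CC numbers are the standard MIDI ones):

```python
# Sketch: persist MIDI Control Change values between notes. pygame.midi
# events arrive as [[status, data1, data2, data3], timestamp]; Control
# Change messages have status 0xB0-0xBF, with data1 = controller number
# (1 = Modulation Wheel, 64 = Sustain Pedal) and data2 = value.
MOD_WHEEL, SUSTAIN = 1, 64

controller_state = {}  # controller number -> last value seen (0-127)

def update_controllers(midi_events, state=controller_state):
    for (status, data1, data2, _d3), _time in midi_events:
        if 0xB0 <= status <= 0xBF:   # Control Change, any channel
            state[data1] = data2
    return state
```

Because the dict is only ever updated, the sustain and modulation values persist however many notes you play in between.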

Let me know if you are interested in using/testing this system and I can let you know what you will have to do to get it working. Soon we will be able to load these nodes onto my Old-Git-Hub, on the usual “User Beware” principle for new code.

For now it is for official Blender and Animation Nodes releases only; we hope to support Blender 2.8 and AN 2.1 once they are both stable enough together. I know Omar is working very hard on AN 2.1 and is doing a great job there.

Cheers, Clock. :beers:

PS. All the time Blender is running with these live animations, I am also running the Reason DAW with a Moog synth (pretty CPU-intensive) to make sure the system works fast enough for live animation to live music, even on my oldish iMac. One limitation appears to be very complex meshes and over-use of Subdivision modifiers, but I guess you don’t need complex meshes for these types of animations.

Well, I just thought you were catching up on your “honey do” list, out flying around :airplane: and spending quality time with Jack and Jim.
This looks fantastic and what a great use of YOUR nodes too :notes: :musical_score:

1 Like

Thank you very much and doing this keeps me off the streets where I might get into all sorts of trouble. :gun::oncoming_police_car::policewoman::wine_glass::woman_cartwheeling::woman_judge::woman_in_lotus_position:

I have got some work done on an Audio Init node:

It reads the inputs/outputs from my machine to populate the “Choice” boxes and other sockets on the node; I have not got as far as reading the stream yet though. My old brain gave up for the night after this progress, before I got the input read.
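For those following along, the device scan is roughly this (a sketch of my approach, not the node verbatim):

```python
# Sketch: enumerate audio devices with the sounddevice module to populate
# the node's input/output "Choice" boxes.
import sounddevice as sd

def audio_device_choices():
    inputs, outputs = [], []
    for dev in sd.query_devices():
        if dev['max_input_channels'] > 0:
            inputs.append(dev['name'])
        if dev['max_output_channels'] > 0:
            outputs.append(dev['name'])
    return inputs, outputs
```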

Cheers, Clock. :rofl:

3 Likes

I now have output from the “sounddevice” module driving the animation of two cubes, one for each channel:

This one is reading input from my built-in mic, which means I need people chatting away near my Mac for it to work. I will try feeding music in once I get home and can connect my phone to the audio input port of my Mac.
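The guts of it look something like this (a sketch, assuming a two-channel input stream; the real node wires the values into AN sockets rather than a list):

```python
# Sketch: read the mic with sounddevice and keep one RMS amplitude per
# channel, which the animation then reads each frame.
import numpy as np
import sounddevice as sd

levels = [0.0, 0.0]  # latest amplitude per channel

def callback(indata, frames, time, status):
    # indata has shape (frames, channels); RMS per column = per channel
    rms = np.sqrt(np.mean(indata ** 2, axis=0))
    levels[0], levels[1] = float(rms[0]), float(rms[1])

stream = sd.InputStream(channels=2, callback=callback)
stream.start()
```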

Next stage would be to get the sound and split it into frequency ranges, but I have no idea where to even start with this objective… :confused:

Anyway this is progress for my ageing brain! :brain:

Cheers, Clock.

1 Like

Now I have an Amplitude Splitter:

It splits the signal into intensity chunks to drive mesh objects, etc. The latest version I am working on has a “Significant Figures” input to stop me getting 28 decimal places!!! I don’t use round(num) because that’s no good for numbers larger than 1, or 10 for example. This helps to smooth the animations. I am now running my animation for this on “Always” => 0.04 Time Diff, so it mimics 25 fps animation.
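In case it is useful to anyone, the significant-figures trick amounts to this (a sketch of my reading of it, not the node verbatim):

```python
# Sketch: round to N significant figures rather than N decimal places,
# since round(num, n) alone misbehaves for numbers larger than 1.
from math import floor, log10

def round_sig(num, sig=3):
    if num == 0:
        return 0.0
    return round(num, sig - int(floor(log10(abs(num)))) - 1)

# round_sig(1234.567) -> 1230.0;  round_sig(0.0012345) -> 0.00123
```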

Cheers, Clock. :beers:

PS. Thanks for the Likes!

This is mental and very cool - though I’m not sure if or how you could use it in a project to create a video or interactive art piece (like a game without the game kind of thing).

1 Like

I’d really love to see this in action.

1 Like

This is awesome - I was always wondering if this was possible. What’s the interface between system MIDI devices and Python? Is there a default Python package which gives you access to the MIDI devices and their state?

Awesome work. I can’t wait to see more.

2 Likes

Also, I’m sure there are plenty of FFT implementations available in Python; it seems like you could probably get that to work for getting frequency-domain info about the input signal.
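Something along these lines, perhaps (a NumPy sketch, assuming a mono float signal and a known sample rate; the band edges are arbitrary):

```python
# Sketch: take the real FFT of a chunk of samples and sum the magnitude
# per frequency band, giving low/mid/high levels for driving objects.
import numpy as np

def band_levels(samples, sample_rate,
                bands=((0, 250), (250, 2000), (2000, 8000))):
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    return [float(spectrum[(freqs >= lo) & (freqs < hi)].sum())
            for lo, hi in bands]
```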

1 Like

Thanks for the messages; I will respond in full later, just off out for lunch - a perk of being retired! :shallow_pan_of_food:

Basically I use PyGame, but only load pygame.midi in AN for the MIDI, and sounddevice for the audio; you need to get and load the Python modules yourself. These must be loaded into Blender’s Python, not the system Python, and you need to load pip first for B 2.79, then get wheel files (blahblah.whl) from PyPI for these to work. I have done this on my Macs, but my colleague in this project has it working on Windows and Linux.
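In case it helps while I write up the details, the pygame.midi side boils down to this (a bare-bones sketch, not the AN node code - in AN the polling happens per node-tree evaluation rather than in a loop):

```python
# Sketch: list MIDI devices, open the default input, and poll for events.
import pygame.midi

pygame.midi.init()
for i in range(pygame.midi.get_count()):
    # (interface, name, is_input, is_output, is_opened)
    print(i, pygame.midi.get_device_info(i))

inp = pygame.midi.Input(pygame.midi.get_default_input_id())
while True:
    if inp.poll():
        # each event is [[status, data1, data2, data3], timestamp]
        for event in inp.read(16):
            print(event)
```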

I will video it working on my Keyboard and Guitar in a few days. :camera_flash:

More details later!

Cheers, Clock. :beers:

EDIT:

21:20 local time: Not been home too long, far too tired and “ratted” to explain all this now, I’ll make a better effort tomorrow… :crazy_face:

1 Like

The original brief was to create a system that visualised MIDI input in real time, so I play the old TMK88 and objects animate in real time. The question of how one might do this was posted on the Animation Nodes GitHub before Xmas.

I responded, and the OP there and I have solved the problem and got a system working. The next part was to also do this with audio, and again we have got this working now, though it is not fully developed in either case. I think the OP on GitHub wanted to project animations whilst playing live in his band.

We aim to have this as a fully self-contained package with an install script at some time, but for now there is some pain to endure to get any system like this operational.

The first thing is to install pip in Blender’s Python directory, so first you have to find it. On a Mac you just go to the Blender app, click “Show Package Contents” and then go down to the Python/bin directory, where you will not find pip. So go here, get the required get-pip.py file and save it to your disc somewhere sensible.

Then in a terminal type the following:

./python3.5 [path to]/get-pip.py

With your chosen location as the [path to] bit - now you have pip! The ./ makes sure you use the local Python command, not your system one…

Now you will need PyGame: get the wheel file for your operating system from PyPI and save it to disc.

Then install using this:

./pip install [path to]/[pygame .whl file]

This will install PyGame in your Blender Python site-packages directory. Repeat the procedure for sounddevice; it is available for all flavours here.

Note! You cannot build from source into the Blender Python directory, as there is no Python.h file in there, so wheel files are the only option. The alternative is to install the same version of Python as Blender’s as a system Python, build the source packages there, and then move the site-packages files across; that is even more pain than it is worth, so I don’t recommend trying.

Then you can either wait for us to publish our work on my GitHub page once we are happy with it (we are not entirely happy with it yet, so it is not there yet), or you can read the manuals and do it yourselves. That comment, I know, is not too helpful, but please understand we are not going to publish anything before it is reasonably tested and de-bugged.

I am going on holiday for three weeks starting 22 Jan 2018, so our nodes will not be on GitHub before I get back home. Unfortunately, Mrs. C will not let me take my iMac, TMK88 and one of my guitars with me to Vietnam, so I cannot complete the work before I get home again.

There are still some issues, for example:

  1. If you stop the PyGame MIDI “Read” function and then try to start it again, it will not work; you have to exit Blender, and then it will work next time.

  2. If you stop the SoundDevice “InputStream” and then restart it, all seems well, but when you quit Blender, you may notice that it just hangs - sometimes for a few minutes, sometimes for longer than it takes me to get bored watching it - so I Force Quit, and that cures things for the next time you start Blender.

I am not sure if we will be able to solve these issues, but the system is still workable, as simply shutting down Blender cures all ills. We have also noticed that complicated mesh objects are not a good idea if you want them to animate in real time, so don’t do that.
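For reference, the sort of defensive teardown we may try is below (a sketch only - I have not verified that it actually cures either issue):

```python
# Sketch: close and quit everything explicitly instead of relying on
# Blender's shutdown to release the MIDI and audio back-ends.
import pygame.midi

def shutdown(midi_input=None, audio_stream=None):
    if midi_input is not None:
        midi_input.close()
    pygame.midi.quit()          # release PortMidi before any re-init
    if audio_stream is not None:
        audio_stream.stop()     # stop the sounddevice callback first...
        audio_stream.close()    # ...then release the stream itself
```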

I have also written a node that translates MIDI note IDs into guitar string/fret positions for both 6-string guitar and 4-string bass; this will be included in the package once all testing is complete.
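The core of the mapping is simple enough to sketch (standard tunings assumed; the node’s real logic has more to it):

```python
# Sketch: each open string has a MIDI note ID, so the fret for a note is
# just the difference, and one note may fit several string/fret pairs.
GUITAR_6 = [40, 45, 50, 55, 59, 64]   # E2 A2 D3 G3 B3 E4
BASS_4   = [28, 33, 38, 43]           # E1 A1 D2 G2

def note_positions(note_id, tuning=GUITAR_6, max_fret=22):
    return [(string, note_id - open_note)
            for string, open_note in enumerate(tuning)
            if 0 <= note_id - open_note <= max_fret]

# note_positions(64) -> [(1, 19), (2, 14), (3, 9), (4, 5), (5, 0)]
```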

So this is where we are to date. I hope to be able to publish the nodes before the end of Feb this year, and I will post a little video from my phone of this working if I get time before I go on holiday. Mrs. C has lots of jobs for me before we go away, by the way; how surprising is that?

Cheers, Clock. :beers:

I think this is all I will get done before my holiday:

So, the new button on the “MIDI Load Keyboards, etc.” node now builds a fretboard according to the size you put in and the scale for the nut relative to the bridge. I then use this to control my little “Pinky” so it covers the correct fret on the correct string for the notes I play on my keyboard. I am then going to use the Audio system to vibrate the correct string to the sound being made!

The node doesn’t add the strings yet; that’ll have to remain on my ToDo list for a while longer. These are to be animated with Hook modifiers, which are added now, but not animated yet.

Cheers, Clock. :beers:

PS. Working out fret positions is easy - you take the bridge-to-nut length and multiply it by the 12th root of 0.5, which gives the distance from the bridge to fret 1. You then repeat with the length to fret 1 to get fret 2, and so on for all your fret locations. I said it was easy! :joy: :rofl:
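In Python terms the iteration looks like this (a sketch; the 648 mm scale length in the comment is just an example):

```python
# Sketch: each fret's distance from the bridge is the previous distance
# multiplied by the 12th root of 0.5.
RATIO = 0.5 ** (1 / 12)   # ~0.9439; 12 applications give exactly half

def fret_distances(bridge_to_nut, frets=24):
    distances, length = [], bridge_to_nut
    for _ in range(frets):
        length *= RATIO
        distances.append(length)  # fret 1 first, measured from the bridge
    return distances

# fret_distances(648.0)[11] -> ~324.0 (the 12th fret, at half the length)
```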

Thanks for telling me the fret-positioning formula! I knew there was one but couldn’t work it out myself!

1 Like

Greetings from Hà Nội :wave:t3: Yes, it’s not well publicised. I thought about it years ago, before the internet, and decided I needed to find the number that, when multiplied by itself twelve times, equalled 0.5, since the twelfth fret is halfway along the bridge-to-nut length - so it must therefore be the 12th root of 0.5. In Python you write it as

0.5**(1/12)

In case you want to script it like I did. I just iterated this 25 times (1 for the nut) to get my frets from the bridge. Sorry for any typos; posting from my phone is not so easy. :writing_hand:t3:

Cheers, Clock. :beers:

OK, some progress now that I am back home again:

I have now built a node that plays guitar from the controls built from MIDI files by my MIDI Controls Node (see my website for details). That node produces a series of empties animated to the MIDI file; I am using these to drive the new node, which works out which string and fret to play for the given note and moves my “Finger” object to the correct fret and string. This object is rigged with an IK chain.

It also changes the material of the string that is being played - “Snot Green” in this case. This shows that a string is being played even though the finger might not move; it is just raised if the string is played “open”. Next is to vibrate the string being played a little, and perhaps have a plectrum, or another finger, plucking the string.

All this in between bouts of jet lag and getting to know Jim and Jack again after three weeks away from them. I am also working on some nodes to store variables for another Blender Artist. Such a busy life - no time for the “Honey Do” list just now, which will not go down well…

Cheers, Clock. :mantelpiece_clock: :beers:

1 Like

Update:

Now I have the six-string version working, and I have incorporated an “Octave Shift” function to move the notes up or down the guitar. For test purposes here I used a bass line and shifted it up 2 octaves so it fitted the lead guitar. I must get a lead solo converted to animation at some point; I might use the Pink Floyd “Comfortably Numb” one I had my robot play some while ago under my old system.

I have done some work on plucking the correct string; there is still a way to go here though, as this involves a movement over several frames. I did this before by combining the FCurves of all the notes played to drive the plectrum, but I am not sure this is such a good idea now.

I will look at playing chords next on this side of things; this is going to be challenging, as is holding the strings with multiple fingers rather than just one… My thoughts are to use the nearest finger to the required fret/string position, but I have not worked out a spec for how to do this yet.

We are still on track to have our live-play MIDI and Audio system on my GitHub by the end of this month; some final testing to go and all should be ready.

Cheers, Clock. :guitar:

So, here is a rendered view:

I have now attached all the bits to a pivot, so I can incline the guitar any way and it still works. The Text Object outputs the String used, Fret used, Note ID and Note Name as the animation progresses. I might render a quick video of this…

Cheers, Clock. :mantelpiece_clock:

3 Likes

Test video of the new node:

I know it’s a sax playing, but I did not have time to change this to a guitar. :guitar::smile:

The node seems to work well and I can find no bugs yet! I will upload all these new MIDI and Audio nodes to my GitHub later this month.

Cheers, Clock. :beers:

1 Like

I have got the plectrum on the correct string being played now; I just need to work out how to get it to pluck the string… :confused:

Cheers, Clock. :mantelpiece_clock: