We're In The MIDI & Audio & DAW Real-Time!

The plectrum is now plucking the strings: the animation runs over the three frames after each note is played, and the strings vibrate while being played, using the nodes shown here:

Some serious testing is now due, but I am pleased with progress! :smiley:

Cheers, Clock. :mantelpiece_clock:

1 Like

Just LOVE talking to myself here! :rofl:

Here is the next video test:

We now have the plectrum animated by the Play Guitar node and vibrating strings done by standard AN nodes. :guitar::vibration_mode:

Cheers, Old-Man-Clock. :older_man::mantelpiece_clock::beers:

EDIT:

Two pictures with a body:

Primitive modelling, but it serves a purpose. :smile:

Naah, the audience is just speechless.
Love what you are doing here; connecting Blender with sound stuff is one of my main interests.
Makes me giddy :monkey:

I recently dreamt that somebody added a complete DAW to Blender, with a node synth, sampler, sequencer etc… but then I woke up and it wasn’t real.

1 Like

Hmmmmmm, tempting… do you have any ideas for a specification, or what you might like in one? :yum:

Cheers, Clock. :beers:

I was thinking about merging LMMS, Audacity and Blender.

LMMS is a DAW without audio editing:

https://lmms.io/

Audacity is a well-known audio editor:

https://www.audacityteam.org/

Both projects are GPL 2 too, which is a bonus license-wise.

It’s not a walk in the park, but it is doable and something I may give a try.

About this project: it looks interesting, but unless we get to use the code you will have to keep talking to yourself. Unless you plan to sell it, I don’t see why there is so much delay in giving even an experimental release. You can also include all the dependencies with the add-on: just use a virtual env and then copy-paste the libs into the addon folder. I did that with PIL so the user won’t have to install PIL to use my addon.
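To illustrate the bundling approach described above, here is a minimal sketch of how a vendored copy of a library can be made importable from inside an add-on; the folder name “libs” is hypothetical, not taken from either project:

    # Sketch only: make a "libs" folder (copy-pasted from a virtual env's
    # site-packages) importable by the add-on. Folder name is hypothetical.
    import os
    import sys

    LIBS_DIR = os.path.join(os.path.dirname(__file__), "libs")

    if LIBS_DIR not in sys.path:
        sys.path.insert(0, LIBS_DIR)

    # The bundled package can now be imported as usual, e.g.:
    # from PIL import Image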

Thank you for the comment, however, I have no plans to sell this ever! Delay is caused by two things:

  1. I had a three week holiday away from it all.
  2. Too much else to do, especially for Mrs. Clockmender…

I said in an earlier post that I will get the nodes up to my Git-Hub by the end of this month, and I am still on track to do this. I also explained in an earlier post how I got the libraries loaded. As I understand it, there are different libs for different operating systems; I have only used the Mac ones as I am Mac-based.

Making an Add-on out of this is way outside my knowledge just now, so any help you can give there would be greatly appreciated. I will have to do a lot of research I think…

I have used LMMS in the past on my Ubuntu machine, but these days I just use Reason on my iMac with a connected keyboard and guitars through an iRig interface, so am out of date here also.

Anyway, must get on and get this stuff tested a little more…

Cheers, Clock. :beers:

Fun fact: I am an avid Reason user, well, avid enough to pay for the real thing and its upgrades, although not the latest one; I was not as tempted by that. I love Reason.

The woman is always top priority, of course.

There is no hurry; you are the boss. I am more than happy to wait and try your code.

Reason is not the only thing we share: I am also an iMac user, but in my case my Mac is triple-bootable, with 3 partitions, one for macOS, one for Windows 10 and one for the latest Ubuntu. The reason for this insanity is that I develop a commercial version of Blender with a custom GUI that I hope to release soon, and I wanted to support all 3 OSes. If your aim is to bring DAW-like capabilities to Blender, I am definitely interested in helping.

We also share (I know it gets creepy) the fact that we both depended on PyGame. I made a project ages ago, Vriareon.

Long story short, I own this beauty:


[For anyone not familiar with synths] The Andromeda A6 is by far the most powerful non-modular, all-analogue synth, with features unheard of even for digital or software synthesizers. However, it misses one important element that I rely on as an ambient composer: multi-point envelopes. Basically, modulation (think of it as animation of a property) when a note is pressed.

So I made a small project that allowed me to receive MIDI from the A6, enrich it with multi-point envelopes that I could draw via a custom GUI I designed, and then send the MIDI back to trigger the sounds. It all used Pygame because it was easy. The challenge was that the Andromeda, being a monster synth, has over 800 parameters, so it cannot use the classic MIDI CC; instead it uses MIDI NRPN, a similar protocol but with messages double the size, for more parameters.

Ugly GUI I know, and the code is probably uglier, but it was not meant as a serious project, just to satisfy my curiosity.
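For anyone curious what that looks like in code, sending an NRPN with pygame.midi boils down to four consecutive CC messages; this is only a rough sketch, and the parameter and value numbers are made up, not real A6 addresses:

    import pygame.midi

    pygame.midi.init()
    out = pygame.midi.Output(pygame.midi.get_default_output_id())

    def send_nrpn(output, channel, param, value):
        """Send a 14-bit NRPN parameter change as four CC messages."""
        status = 0xB0 | (channel & 0x0F)                      # Control Change on this channel
        output.write_short(status, 99, (param >> 7) & 0x7F)   # NRPN MSB (CC 99)
        output.write_short(status, 98, param & 0x7F)          # NRPN LSB (CC 98)
        output.write_short(status, 6, (value >> 7) & 0x7F)    # Data Entry MSB (CC 6)
        output.write_short(status, 38, value & 0x7F)          # Data Entry LSB (CC 38)

    # Hypothetical parameter number and 14-bit value, just to show the call shape.
    send_nrpn(out, channel=0, param=300, value=8192)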

I am definitely interested in bringing DAW-like capabilities to Blender, and obviously I prefer using Python. So yeah, I am definitely interested if you aim for something similar. Nothing too ambitious; your project seems to be pretty close to what I imagine. You seem to be going down the route of a Reaktor for MIDI of sorts, which I like.

I love guitars but alas I never committed to learning to play one, other than using my synths (I also have a Yamaha Motif ES6) to fake playing guitars.

Turning something into an addon is a walk in the park; the tricky part is the GUI. I can turn it into an addon for you, if you promise to comment your code so I know what I am doing. Designing a GUI will take longer.
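For reference, the add-on wrapper itself really is small; a minimal sketch (the module and function names below are placeholders, not the actual layout of either project) looks something like this:

    # Minimal Blender add-on skeleton, sketch only: "clock_midi_nodes" and its
    # register_nodes()/unregister_nodes() functions are placeholder names.
    bl_info = {
        "name": "Clock MIDI Nodes (sketch)",
        "author": "Clockmender",
        "version": (0, 1),
        "blender": (2, 79, 0),
        "category": "Animation",
    }

    # from . import clock_midi_nodes   # hypothetical sub-module holding the node classes

    def register():
        # clock_midi_nodes.register_nodes()
        pass

    def unregister():
        # clock_midi_nodes.unregister_nodes()
        pass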

1 Like

Now I have some time away from my “Honey Do” list…

Yes I love Reason, I have been using it for some time now and like you have a fully paid legitimate copy! What a goody-two-shoes I am. :smile:

I threw this together last evening:

Four new nodes had to be written; they are shown here and do various things, like setting the FPS value, animation length, etc. All works well, and it actually plays small sound files using the PyGame music libs, via the Script node.

But there are issues: 1) getting PyGame to recognise AIFF, WAV, or M4A files, no it won’t have it; AIF and MP3 are OK, grrr. 2) It will only play one file at a time, so multiple notes, forget it! I think we have to work out a specification for how the hell it is going to play sounds first, then I can take the code further.
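For what it is worth, here is a rough sketch of the two PyGame playback routes, assuming the module is already importable inside Blender and using made-up file names; pygame.mixer.music streams one file at a time, whereas pygame.mixer.Sound objects can be layered on separate channels, which might be one way around the one-note limit:

    import pygame

    pygame.mixer.init()

    # Route 1: the music streamer, simple but only one stream at a time.
    pygame.mixer.music.load("note_C3.mp3")       # hypothetical file
    pygame.mixer.music.play()

    # Route 2: Sound objects, several can play at once on different channels,
    # but PyGame only decodes uncompressed WAV and OGG for these.
    note_c = pygame.mixer.Sound("note_C3.wav")   # hypothetical files
    note_e = pygame.mixer.Sound("note_E3.wav")
    note_c.play()
    note_e.play()                                # plays over the top of note_c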

PM me with an email address and I can send you the nodes I wrote; I will also send you the MIDI and Sound nodes I have written, not that they are commented particularly well just now… I will put that on my ToDo list for a later date. One option for communicating: I use Discord to talk to various people I collaborate with on various projects. If you would like to use this instead of email, it is far easier to pass files; let me know and I can pass over my Discord details and we can use that method. I think this would be a good thing to develop; I have MIDI working in Blender, but this takes it to another level, and maybe others would also collaborate on this?

On the guitar front, I am a bass man really and have been playing that since the early ’70s. I am now on my fifth home-made fretless version. I also have an electric lead and an acoustic. On the keyboard front I use a StudioLogic TMK88, linked through an iRig to Reason.

Cheers, Clock. :beers:

PS. Excuse typos, I am in a hurry just now…

I am replying myself, because I like doing that. :stuck_out_tongue_winking_eye:

Here is a really bad video of the system, working really poorly:

I think radical re-thinking is required! :rofl:

Cheers, Clock. :beers:

impressive!

You can reach me via Discord on my private server here: https://discord.gg/4pmTmd

This is going to be the home for my commercial project, Ephestos; I have also decided to have a free version of it.

Code-wise, I can upload it to GitLab for you, where I am active, and maintain it there, in case you don’t know how to use Git.

Sound-wise, I am not sure how Blender fits into this; I know that it can handle audio files but I have never tried it myself. But because I will be providing a free version of Ephestos, which is a fork of Blender, I can implement the features you may need to tame the true power of Blender.

The free version of Ephestos is called “Ephestos Core Entry” and can be found here. I am mentioning it because I am developing a secondary GUI API, both in C but mostly in Python, which makes it ideal for your situation, since you want a more customized GUI. It’s not quite ready for public consumption. The GUI is called Hecate and used to be called “Morpheas”; it has been used in a commercial plugin, and it now has a different name because its drawing is done in C, integrated with the Blender source, unlike Morpheas which was mostly Python.

In any case I will be making an announcement in the near future.

Ephestos Core Entry can be found here

https://gitlab.com/Kilon/ephestos_ce

Hecate can be found in scripts/modules/ephpy/hpy; the main file is core.py and it is fully commented.

https://gitlab.com/Kilon/ephestos_ce/tree/ephestos_ce/source/blender/ephestos/scripts/modules/ephpy/hpy

1 Like

I have re-written the two previous nodes I had for creating Controls, or F-Curves, so now one node does both:

The node tree to use the F-Curves in an animation is also much simpler.

It stores all the data in dictionaries now (yes, I learned how to use them…), so it is much more efficient and uses far less code. You can have multiples of this node, one for each channel in the MIDI file.
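Roughly speaking, and judging by the bake code further down, each channel’s dictionary maps a note name to a list of (frame, value) pairs; a small sketch with made-up numbers:

    # Sketch of a per-channel event dictionary, with invented frame numbers
    # and velocity-style values.
    eventD = {
        "C3": [(24, 0.8), (30, 0.0), (96, 0.6), (102, 0.0)],
        "E3": [(48, 0.7), (54, 0.0)],
    }

    for note, events in eventD.items():
        for frame, value in events:
            print(note, frame, value)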

I have also taken all the common functions out of the MIDI nodes and put them in a separate .py file, so I only have to update one file if I alter a function. So yes, I have learned how to call external functions in nodes as well; what a busy chap I have been. :smile:
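In practice that just means each node file imports from the shared module; a tiny sketch, with a hypothetical file name and helper:

    # midi_functions.py - hypothetical shared module holding common helpers.
    def time_to_frame(seconds, fps):
        """Convert a time in seconds from the MIDI file to a frame number."""
        return round(seconds * fps)

    # Inside one of the node .py files:
    # from . midi_functions import time_to_frame
    # frame = time_to_frame(note_start, bpy.context.scene.render.fps)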

Cheers, Clock. :beers:

PS. The previous release of the “Live” Audio and MIDI nodes is now on my GitHub, but bear in mind the nodes have changed in the last few days, so I need to upload and make a new release in the next few days.

2 Likes

OK Urgent Request!!!

Does anyone know how to get a list of the new icon names in 2.8? Preferably with a picture against each name.

Please post here if you know…

Cheers, Clock. :beers:

EDIT:

Little teaser…

Yes it’s 2.8 and AN2.1 AND my nodes…

Well Done Clock, pleased to see it working in 2.8. :stuck_out_tongue_winking_eye: Here’s a picture of things as they are to date:

I have still to cure the issue relating to the Bake to F-Curve function; it appears that something has changed in Blunder, so my F-Curves don’t get written properly, but the Controls function works perfectly and I am sure I will get the bake sorted at some stage… It is strange that all my complex build code works in 2.8 as it did in 2.7 except for this one thing. Here is the code, just in case someone can help, or make some suggestions of what I might try:

            # Clear out all F-Curve Data
            self.notes.clear()
            self.removeFCurvesOfThisNode()
            eventD.clear()
            # Read Channel from MIDI file and populate eventD dictionary
            self.writeEvents()
            # This function creates an abstraction for the somewhat complicated stuff
            # that is needed to insert the keyframes. It is needed because in Blender
            # custom node trees don't work well with fcurves yet.
            def createNote(name):
                dataPath = "nodes[\"{}\"].notes[{}].value".format(self.name, len(self.notes))
                item = self.notes.add()
                item.noteName = name

                def insertKeyframe(value, noteIndex, frame):
                    item.value = value
                    item.noteIndex = noteIndex
                    self.id_data.keyframe_insert(dataPath, frame = frame)

                return insertKeyframe

            # Process EventD to make F-curves.
            for rec in eventD.keys():
                ind = getIndex(str(rec))
                addKeyframe = createNote(str(rec))
                indV = True
                for i in eventD.get(rec):
                    frame = i[0]
                    val = i[1]
                    if indV:
                        addKeyframe(value = 0, noteIndex = ind, frame = frame-self.easing)
                        indV = False
                    print(val,ind,frame)
                    addKeyframe(value = val, noteIndex = ind, frame = frame)
            self.message1 = 'Channel '+str(self.chnN)+' Processed, Notes '+str(len(eventD.keys()))+' Events: '+str(sum(len(v) for v in eventD.values()))
            self.message2 = 'Complete'

It just won’t write the correct MIDI note values to the F-Curves; the correct data has been written to the event dictionary (eventD) and is being fetched back at the correct time… :confused:

Next I need to get PyGame and SoundDevice modules loaded into Blender 2.8 Python and check the “Live” stuff works OK.

Cheers, Clock. :musical_keyboard: :musical_note: :beers:

PS. I have uploaded all the new 2.8 code to my Old-Git-Hub under a new branch for Blender 2.8, AN 2.1. :up:

2 Likes

Yes Clock. I do, but I am not going to tell you. :rofl::rofl::rofl:

OK, I had a little idea about using MIDI controllers to help in the animation process, so I came up with this new node, the big snot-coloured one:

It takes a start key and uses the next three octaves (well, not every note, but a lot of them). They are used to move the selected objects, or bones, in any local axis, either up or down, to rotate them backwards or forwards, and to scale them up and down. Some keys are used to advance the timeline, some are used to change the Offset, and one is used to insert keyframes on all the selected objects, or bones, at the currently selected frame, depending on the status of some checkboxes.
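Conceptually it is just a lookup from the incoming note to an action applied to whatever objects or bones are selected; a rough sketch of the idea (the note names and assignments below are invented, not the node’s real layout):

    import bpy
    from mathutils import Vector

    OFFSET = 0.05  # stands in for the node's adjustable Offset value

    def handle_note(note_name, objects):
        """Apply the action assigned to this note to every object in the list."""
        if note_name == "C3":                  # nudge up along Z
            for obj in objects:
                obj.location += Vector((0.0, 0.0, OFFSET))
        elif note_name == "D3":                # nudge down along Z
            for obj in objects:
                obj.location -= Vector((0.0, 0.0, OFFSET))
        elif note_name == "C5":                # keyframe the current positions
            for obj in objects:
                obj.keyframe_insert(data_path="location")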

I did a quick check using an integer input and all seems to work well. Tomorrow I will copy the node to my iMac, plug in the old TMK88 and play. I found that running AN “Always” with a 0.1 second minimum elapse time was good.

If you hold one key down, the objects keep moving until you let go; to nudge them by very small amounts, you can change the Offset value. I am going to investigate feeding in the Mod Wheel; this gives values of 0 to 127 depending on how far you turn it. This may be useful for getting precise movements/rotations of objects.

I did a quick test with all the finger bones of one of my Virtual Concubines (Sophie from my Heaven’s Angels WIP) and found I could screw both her hands up into a fist and keyframe the whole lot in just a few seconds, even using my integer input; it will be much faster with the keyboard. :cloud_with_lightning: (is this the best lightning emoji we have, ’cause it’s crap)

I’ll get this onto the Old-Git-Hub later this week, or next, depending on my testing, workload and levels of alcohol consumption. :cocktail:

Cheers, Clock. :cocktail::cocktail::cocktail::cocktail::cocktail::cocktail::cocktail::cocktail:

PS. Hick, I shuposhe you fink zhat’s fun-hick-ny

oh man, this is just amazing. cheers.

I should absolutely try that addon asap :slight_smile: Good Job.

I have not got as far as making a proper add-on yet, but the procedure will be to load the PyGame Python module into Blender’s Python directory, then load the nodes into a standard Animation Nodes install. All is explained in the readme on my Git-Hub. For now it is only Blender 2.79, although I now have a Blender 2.8/AN 2.1 branch for some of my nodes, and I will mod the code for this latest node to include it.

I have still to test PyGame in Blender 2.8, but I understand that 2.8 has pip, or pip3, installed, which makes the process much easier in that you don’t have to install pip first. Earlier posts here describe what I had to do to get PyGame installed on 2.79, BTW. Other testing is going well for 2.8/AN 2.1; I am going to stop work on the 2.79/AN 2.0 branch shortly to concentrate on 2.8.
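For anyone wanting to try the same, here is a rough, untested sketch of installing PyGame into Blender 2.8’s bundled Python from Blender’s own Python console, assuming bpy.app.binary_path_python points at the bundled interpreter:

    import subprocess
    import bpy

    py = bpy.app.binary_path_python                      # Blender 2.8's bundled Python
    subprocess.run([py, "-m", "ensurepip"], check=True)  # make sure pip is available
    subprocess.run([py, "-m", "pip", "install", "pygame"], check=True)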

Another chap I am in contact with has a MIDI controller; we may well try to use this as an alternative to a keyboard. The knobs on it will give more accurate control of objects and input values. I can mimic this using my keyboard’s Mod Wheel, so I will also test that later today.

One thing: when testing new node code, I now run Blender from a Terminal on my Mac, so I can see error messages and print output to check how these nodes are working; that’s my “Top Tip” for today! Here’s an example:

I just “cd” to the directory containing the Blender executable (not the .app bundle on a Mac) and use ./blender to start Blender. It also tells you straight away if there are any problems in your node code!

Thanks for the kind comments guys! I appreciate the support.

Cheers, Clock. :cocktail:

I tried a port to 2.8, but hit two problems:

This code does not work inside Animation Nodes:

bpy.context.selected_objects

Same for this:

bpy.context.selected_pose_bones

But they both work in the Python Console. Grrrrrrr, what a bugger; does anyone know a workaround for getting selected bones?

Workaround for objects is this:

[i for i in bpy.data.objects if i.select_get()]

But I cannot find a workaround for getting selected bones…

Can anyone help here?

Cheers, Clock. :cocktail:

Hello Clock you sad old git! :stuck_out_tongue_closed_eyes:

Fortunately I know how to do this, did you try:

[r for r in [i.data.bones for i in bpy.data.objects if i.select_get()][0] if r.select]

Tortuous, I know, and you should probably break it down into four lines of code:

selObjs = [i for i in bpy.data.objects if i.select_get()]
armList = [i for i in selObjs if i.type == "ARMATURE"]
armObj = armList[0]
boneList = [r for r in armObj.data.bones if r.select]

That stops you looking at selected objects that are not Armatures and screwing things up by doing that… :stuck_out_tongue_winking_eye: Then, when it comes to actually moving the bones, you should try this:

                for ob in boneList:
                    b = armObj.pose.bones[ob.name]
                    b.location = b.location+self.locVec
                    if b.rotation_mode == 'QUATERNION':
                        a = self.quaVec
                        i = b.rotation_quaternion
                        b.rotation_quaternion = i+a
                    else:
                        a = self.rotVec
                        i = b.rotation_euler
                        b.rotation_euler = Euler((i[0]+a[0],i[1]+a[1],i[2]+a[2]),i.order)
                    b.scale = self.sclVec

Because, of course, you cannot alter the location of a bone from data.bones, only from pose.bones, and you cannot see selected bones in pose.bones, only in data.bones; I hope that is all “crystal” clear. Jeez, old people find this stuff so hard, get a grip mate! Note how to do the Euler maths, so different from Quaternion maths. :thinking: Thinks: who designed a system like this, and why?

Cheers, Erki :robot: :oil_drum::toilet:

Oh thanks, Erki, you cheeky tin-dicked “bar-steward”; armed with this new-found knowledge I was able to do this:

This is my MIDAS node running in 2.8; I just have to get PyGame working now. I also altered the node so it can take input from my keyboard’s Mod Wheel as a variable to move meshes, or bones; this is fed into the Parameter Input and displayed on the node.

Cheers, Clock. :cocktail: :roll_of_toilet_paper:

Now all is tested and working in 2.79 with my MIDI keyboard. I had to make a few changes to the node, but got there in the end:

It works with objects and bones. One issue I had to resolve was to modify the Single MIDI Event node that feeds the MIDAS node so that it didn’t preserve the last input note; otherwise the system went haywire and wouldn’t stop flinging stuff all over the place!

The setup:

Cheers, Clock. :beers:

PS. I really must get the PyGame module in 2.8 now. :stuck_out_tongue_winking_eye:

PPS. Maybe I should make a quick video from my ’phone of the node in use? :video_camera: