Everything nodes

I have just read this entire thread - :crazy_face:

And what did I learn? That we are all about as close together on this as electrons are to a nucleus, in relative terms.

One issue I had with Animation Nodes is the relative simplicity of the individual nodes. It was often hard to remember how I had built a node tree to do a (seemingly) simple job. So I tried experimenting with Script Nodes and Expression Nodes. Put this in an Expression Node:

((x + pi) * sin(y-z)) + (sqrt(a**2 + b**2 + c**2) * ((e-d)**3))

A bit extreme, I know (8 inputs), but how many normal AN nodes do you need to do this? Answers on a postcard, please. Then consider that you have to remember the formula, so where do you go from here? Answer: write a node to do it. I know that very few of us are writing our own AN nodes; I personally have written over 150 so far. Example tree below; only the grey ones are native AN:
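For comparison, the whole formula is a one-liner in plain Python. This is only a sketch of what that single Expression Node evaluates, not actual AN node code:

```python
from math import pi, sin, sqrt

def expression_node(x, y, z, a, b, c, d, e):
    """Plain-Python sketch of the Expression Node above:
    eight float inputs in, one float out."""
    return ((x + pi) * sin(y - z)) + (sqrt(a**2 + b**2 + c**2) * ((e - d)**3))
```

Rebuilding that out of primitive add/multiply/trig nodes would take a dozen or more nodes and a lot of wiring, which is exactly the point.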

This complete node tree controls a Digital Audio Workstation function for three channels in Blender, outputs the result to a .FLAC file, and runs in real time. This cannot be done with native AN. The answer to making things more “Artist Friendly” is to go big on the nodes themselves, so they perform a complex, often-repeated function in a simple fashion. Detail of the Setup section below:

There are some large nodes in there, but they are easy to use and make the tree manageable; note the execution time, for those who think complex nodes are bad. This could be done with a modified GUI, but I would then need to build and register a full add-on rather than write some supplementary nodes, and I know which I would prefer.

Here is another example of a single node that operates in two modes “Full Fat Cream” and “Skinny Latte”:

This single node allows me to control and keyframe a complex animation from my MIDI keyboard, very quickly and very efficiently.

At least a year ago I found that there was no simple way to control Armatures from within AN, so I wrote a “Bone” socket file and several nodes to control bones from within AN. It works very well and very reliably.

I also built a system that reads MIDI files and animates objects from them. My first music video took me 4 weeks to animate, for a single instrument on a 2-minute song. One node I wrote takes 0.5 seconds to animate 72,700 note events in a MIDI file for a 4.8-minute song playing 55 different notes on a piano. I rest my case, M’Lud. Here is that one node:

There are 286 lines of code in this node BTW.
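The core idea behind such a node can be sketched in a few lines: convert timed note events into per-note keyframes in one pass, which is why tens of thousands of events animate in well under a second. The function name and event format below are purely illustrative, not the actual node's API:

```python
def notes_to_keyframes(note_events, fps=24, tempo_bpm=120):
    """Hypothetical sketch: turn (time_in_beats, note_number, velocity)
    events into per-note keyframe lists of (frame, value) pairs."""
    seconds_per_beat = 60.0 / tempo_bpm
    keys = {}  # note number -> list of (frame, normalised velocity)
    for time_beats, note, velocity in note_events:
        frame = round(time_beats * seconds_per_beat * fps)
        keys.setdefault(note, []).append((frame, velocity / 127.0))
    return keys
```

A single linear pass like this scales effortlessly; the real node of course also has to create F-Curves, handle note-off events, channels, and so on, hence the 286 lines.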

So we should not pre-judge Everything Nodes until we see it. Very complex tasks can be written into very easy-to-use nodes, if someone is prepared to do the donkey work, as I am. For the DAW suite I have written a shed-load of nodes, including Sound Generators, Synthesisers, many Sound Effects and so on, using the AudaSpace API. It is possible to make a node-based system easy to use, keep it from looking like a mess of spaghetti, and make it very fast and efficient.

Just my two cents worth and I am sure this will kick the hornet’s nest nicely.

Cheers, Clock. :beers:

Dude, what is this wizardry? :sweat_smile::grin:

Oh wow. I can’t wait to see what midi/DAW-related stuff you’ll be able to come up with when the Everything Nodes is ready to use :grin:

Be sure to share that stuff here, because some of us are very interested in that sort of thing, and might lack some of the skill :roll_eyes:

The General and MIDI stuff is all on my GitHub Here, and some of the documentation is in the “Master” branch readme, like where to put various files (menu, bone socket, etc.). My website is not up to date with the instructions for newer nodes and the 2.8 install, so there is much to do here still!

These nodes are designed to supplement standard Animation Nodes, whether they make it into Everything Nodes is really up to @Jacques_Lucke and beyond my control. I do intend to make the DAW bits accessible shortly, but I don’t think they will get into EN unless there is sufficient demand and a lot of checking of conformity to Jacques’ methods and standards.

The Audio nodes need the sounddevice Python library installed; with 2.8 this is much easier. I believe python3.7m -m pip install sounddevice does the trick, but I may need to check that. The MIDI stuff needs the pygame Python library installed; again, I think python3.7m -m pip install pygame does this.

The DAW stuff is not loaded to my GitHub yet; I still need to do some more work to get it to a stage where I can release it. There are currently 58 nodes and one functions file to check and debug. I will also need to upload a .dae file to hold all the objects for a DAW project and write instructions on my website, so there is some way to go here as well.

If anyone wants to start looking at the DAW nodes, let me know and I will see what can be done; maybe an upload, with many caveats as to the state of the nodes, could be done shortly. Getting a fresh perspective on this work would be advantageous to development.

A thread in the WIP section details progress with the MIDI, Audio and DAW development; it is reasonably up to date and provides some insight into the thinking, and the difficulties with loading Python modules in 2.79, etc.

Thanks for all the likes!

Cheers, Clock. :beers: :grin:

Update: DAW nodes are now available Here (in the nodes/daw directory). :flushed:

Caveat Emptor - they are very experimental still.

Cheers, Clock. :beers:

EDIT:

Here is a screenshot of the system:

That looks amazing! I’ve been looking for something like this to do visualization animation.

Hey @clockmender, now that you remind me of something :grin:

I made this proposal on Right-Click Select; can you take a look at it?
It would be wonderful to have a music tracker built into Blender, hehehe.

OK, I have read all that, and I believe I have already achieved everything that has been asked for over on your link. I have sounds generated from the locations of objects (a piano roll, in effect), sounds generated from animated objects, sounds generated from keyframed events, sounds generated from MIDI file events, etc., and so many “effects” (Delay, Echo, LFO, HFO, Flanger, Phaser, etc.) it’s almost unreal!

I have to thank @neXyon for all his help and the changes he made to the AudaSpace API, at my request, so we can have a modulator function, amongst others.

Maybe we should discuss with @Jacques_Lucke the possibility of having a Music, or Sounds, section in Everything Nodes that deals with MIDI animation, both from file and live, and with sound generation and manipulation from animation. Maybe others here can help; I don’t have much influence over Jacques regarding node development, and have never been invited to participate in the AN development team, or had any of my developments included.

I think the first thing to establish is a requirement for this type of development from more than just myself and @Ivano_Da_Milano, from whom I have been getting lots of good ideas and encouragement. There is nothing like sufficient demand to fuel inclusion and further development. I would also welcome someone with greater Python knowledge than mine looking at my nodes and suggesting improvements to the code; I am, after all, not a long-term Python expert, just an enthusiastic amateur. :brain:

Let me know if you want more info here. I have to say that I developed all this DAW stuff very recently, and there is absolutely no help file or manual available yet. :upside_down_face:

Cheers, Clock. :beers:

Would it be an idea for you to link this thread on your “Right-Click Select” proposal? Not that I class myself as a genius by any stretch of the imagination :rofl:

Cheers, Clock. :beers:

EDIT:

The comment david2 made on the link about ray-tracing sound is possible with the AudaSpace API, as there are functions to set the position of the microphone, the speed of sound, etc. within the API using aud.Sequence(); see this. I have not implemented this yet, but I have made a “doppler” filter using the aud.Sound() functions.
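The physics behind such a doppler filter is standard and worth writing down. The sketch below computes the classic Doppler shift factor; feeding a factor like this into Audaspace's pitch control is one plausible way to build the filter, but the function name and sign convention here are my own illustration, not the actual node's code:

```python
def doppler_pitch_factor(listener_v, source_v, speed_of_sound=343.0):
    """Classic Doppler shift: f_observed = factor * f_emitted.
    Velocities are signed components along the source-to-listener axis,
    positive when moving toward the other party."""
    return (speed_of_sound + listener_v) / (speed_of_sound - source_v)
```

With both parties stationary the factor is 1.0 (no shift), and it rises above 1.0 (higher pitch) as source and listener approach each other, as expected.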

EDIT:

If anyone wants to review the Python code, I should be very grateful! I am not a serious Python developer.


Jacques wrote some documents about topics related to the Everything Nodes project; in the simulation one he is asking technical artists for input:

https://wiki.blender.org/wiki/User:JacquesLucke/Reports/2019#Week_35:_May_27_-_31

I think he has taken that out:

Cheers, Clock. :cocktail:

No, it’s from a link found a few lines above your screenshot: First steps towards making simulations “first-class” in Blender.

I have some reservations about the way he suggests handling the “simulation world”. Making it an ID could be interesting, to be able to share simulation settings between files or even within the same file, but if input objects are part of the settings, then you cannot reuse the simulation with different objects.

For input objects, each simulation type would need to have its own specific input nodes to gather objects and their specific simulation settings.


Then, I don’t think the current function nodes are fit for simulation purposes, unless every function carries a lot of data (e.g. simulation states). Precisely (unless things have drastically changed in the past 2 months), function nodes act as lookup functions, like the Cycles node graph, but this type of graph doesn’t allow things like blurs, because you would require access to the neighbourhood of the lookup point, which is required for simulations (advection of fluid fields, constraints of cloth springs, etc.). In such a system simulation could be doable, but with a messy code base, including lots of synchronisation primitives if multithreading is to be considered.

Feeding the last frame back into the right “function” instruction seems tricky, or quite impossible, in this system. I’d like to see how it is (or would be) implemented.

Really, I don’t think the concept of a function fits here at all. It’s too low-level.
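The neighbourhood argument above can be made concrete with a tiny example. One diffusion (blur) step needs the whole previous field, because each cell reads its neighbours from the previous frame's state, and each frame's output is fed back in as the next frame's input. A minimal 1-D sketch, not tied to any Blender code:

```python
def diffuse_step(prev, rate=0.25):
    """One explicit diffusion step on a 1-D periodic field: each cell
    reads its two neighbours in the PREVIOUS state, which a pointwise
    lookup graph cannot express."""
    n = len(prev)
    return [
        prev[i] + rate * (prev[(i - 1) % n] + prev[(i + 1) % n] - 2 * prev[i])
        for i in range(n)
    ]

def simulate(initial, steps, rate=0.25):
    """The feedback loop: each frame's state becomes the next input."""
    state = initial
    for _ in range(steps):
        state = diffuse_step(state, rate)
    return state
```

A pure lookup function sees only one point at a time and carries no state between evaluations, so neither the neighbour reads nor the frame-to-frame feedback fits that model.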


@KWD please, please share those thoughts with Jacques Lucke (for example on blender.chat or on devtalk). Feedback is very important, especially for such architectural details, because obviously the system will be hard to change once it’s completed.

I think Jacques would welcome your input on how to make Everything Nodes more robust than it is now. Ensuring it lives up to the idea of bringing chunks of Houdini’s power to Blender would be wonderful.

Jacques Lucke committed an initial empty particle nodes modifier; maybe we will be able to start doing some particle nodes soon? This is going to be interesting :slight_smile:

I’m a motion designer and I have plenty of ideas for particle nodes, which could be a game changer in the motion and VFX field, nowadays dominated by Cinema 4D (with X-Particles).
I will try to share some ideas; can someone tell me if there is a thread specifically focused on this?

Try to PM him on devtalk, or maybe start a topic there.

Just contact me in blender.chat.

I noticed quite a few commits regarding experiments on particle systems and events. I tried the branch, but I guess it’s still heavily WIP, and there is nothing really usable or user-testable yet. (There is a crash if the input node does not have the required signature, but that will have to go eventually.) Oh well, let me share the results of my latest experiments and insights.

I finally got around to refactoring the software to allow for a more flexible scene/object model, with a simple dependency graph, so now I can better test ideas for how to handle simulations:

Simulations are a bit tricky, since not only do you have to plug the previous frame back into the computation, but each simulation also has its own state and inner workings. At first I tried to keep a global “state cache” at the simulation-graph level, but this would prevent having multiple simulations of the same type (like two smoke simulations) in the same graph. Therefore I decided to have each base simulation node define its own state and pass it to parent nodes, where data is gathered for the simulation. To explain visually, here’s the node setup for the video:

So basically, there is some sort of back and forth between nodes to determine what is to be done, and the simulation state is passed around in a local branch of the graph. This also means that a given simulation input node can be reused with different simulations in the same graph.

Also, this node setup lives inside a Simulation node which abstracts the common features of simulations (caching, stepping, feeding in the last frame’s data, etc.).
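The per-node state idea described above can be sketched in a few lines: each simulation node owns its own state, so two simulations of the same type coexist in one graph without clashing. Class and method names here are illustrative only, not the actual implementation:

```python
class SimulationNode:
    """Sketch: each base simulation node owns its OWN state, instead of
    a single global state cache shared by the whole graph."""

    def __init__(self, step_fn, initial_state):
        self.state = initial_state
        self.step_fn = step_fn

    def evaluate(self, inputs):
        # The previous frame's state is fed back in, then replaced.
        self.state = self.step_fn(self.state, inputs)
        return self.state

# Two independent instances of the same simulation type in one graph;
# the toy step function just accumulates its input:
smoke_a = SimulationNode(lambda state, inp: state + inp, initial_state=0.0)
smoke_b = SimulationNode(lambda state, inp: state + inp, initial_state=100.0)
```

Because the state lives on the instance rather than the graph, `smoke_a` and `smoke_b` evolve independently, which is exactly what the global cache approach could not do.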

Overall I believe that this is very similar to (if not an accidental clone of) Houdini’s DOP network.

Are you in contact with Jacques? I’m sure your experience and knowledge are like gold to him.
And have you thought about joining him on the Everything Nodes project? :wink: