Music/Sound as modifier or node

There’s a thing I can’t get my head around. I’ve started doing Processing, just really baby steps, playing around with the i/o of the camera and displaying it differently… here’s my first demo.

Anywho, this concerns my main reason for looking into Processing: to create particle systems and modify 3D meshes based on the sound input.

Which leads me to my fav. 3D suite, Blender ^-^ . When 2.5 is all done, is there someone willing to teach me PyNodes, or someone interested in and with the skills to program a modifier for Blender where you can get the different Hz amplitudes, realtime or recorded, to affect a mesh/3D body?
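Just to make the idea concrete, here’s a rough sketch of what the analysis half of such a modifier might do: pull per-frequency-band amplitudes out of a recorded wav with plain Python and numpy. The band edges, the frame rate and the 16-bit assumption are all just placeholders, not a real modifier.

```python
# Sketch only: read a wav and get per-frequency-band amplitudes per video frame.
import wave
import numpy as np

def band_amplitudes(path, fps=25, bands=((20, 200), (200, 2000), (2000, 8000))):
    wav = wave.open(path, "rb")
    rate = wav.getframerate()
    nchan = wav.getnchannels()
    # assumes 16-bit PCM samples
    data = np.frombuffer(wav.readframes(wav.getnframes()), dtype=np.int16)
    wav.close()
    mono = data.reshape(-1, nchan).mean(axis=1)          # mix down to mono

    samples_per_frame = rate // fps
    result = []                                          # one row per video frame
    for start in range(0, len(mono) - samples_per_frame, samples_per_frame):
        chunk = mono[start:start + samples_per_frame]
        spectrum = np.abs(np.fft.rfft(chunk))            # amplitude spectrum
        freqs = np.fft.rfftfreq(len(chunk), d=1.0 / rate)
        row = [spectrum[(freqs >= lo) & (freqs < hi)].mean() for lo, hi in bands]
        result.append(row)
    return np.array(result)                              # shape: (frames, bands)
```

A modifier (or node) would then only need to map those numbers onto vertex positions.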

My approach is the dev-dreaded “nodification of everything”: a node that can read an XML with values, or an mp3/wav, plot the amplitudes, and then let you scale them to fit whatever you want.
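The “scale it to fit” step could be as simple as a linear remap like this (again just a sketch, the function name is made up):

```python
def scale_to_range(values, lo=0.0, hi=1.0):
    # Map raw amplitude values linearly onto a user-chosen range, so the same
    # curve can drive a displacement, a color, a particle count, etc.
    vmin, vmax = min(values), max(values)
    if vmax == vmin:
        return [lo for _ in values]
    return [lo + (v - vmin) * (hi - lo) / (vmax - vmin) for v in values]
```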

With a modifier you get it straight onto the mesh, which is good, but if there were nodification of everything, a node could also do this.

You could have a node that reads a wav file and affects a mesh like a modifier does, and also changes the color/settings of the materials. Just an example.
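For what it’s worth, even without nodes the same idea can be faked today by baking one amplitude list onto both a shape key and a material color with the 2.5 Python API (bpy). This is only a sketch; the object, shape key and material names are assumptions, the amplitudes are assumed to already be scaled to 0..1, and diffuse_color is the old pre-2.8 material color.

```python
import bpy

def bake_amplitudes(amps, obj_name="Cube", key_name="SoundShape"):
    obj = bpy.data.objects[obj_name]
    key = obj.data.shape_keys.key_blocks[key_name]   # pre-made shape key, 0..1
    mat = obj.active_material

    for frame, a in enumerate(amps, start=1):        # amps assumed scaled to 0..1
        # same value drives the mesh...
        key.value = a
        key.keyframe_insert(data_path="value", frame=frame)

        # ...and the material color (fades blue -> red with loudness)
        mat.diffuse_color = (a, 0.0, 1.0 - a)
        mat.keyframe_insert(data_path="diffuse_color", frame=frame)
```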

This sounds weird, but the more you think about it, Blender is IMO leaning more and more towards a huuuuuuuge node system, where you combine…

  • particle systems (smoke, boids, everything)
  • materials
  • textures
  • modifiers / mesh
  • render tree

And please, no “yeah, you do it, code it”… Blender has never been a one-man show. I’m willing to spend time working on this in my spare time if there’s someone else who thinks it’s necessary and can help out.

If there were a true node system connecting really everything in Blender, there would be infinite creativity at the artist’s hands, and a really easy GUI of just… making nodes and huge node trees.

I think there is a script that will do most of this.

3point is correct, there is a script that converts a wav file into an IPO, but not knowing how to then connect the IPO to the mesh, I can’t do what I want it to do. The script can be found here.
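I’m not sure exactly which channel that script bakes the amplitudes onto, but assuming it ends up on an object channel (say the Z location of an empty), then in 2.5 terms the “connect it to the mesh” step could be a driver feeding that channel into a shape key. All the names and the channel choice here are guesses:

```python
import bpy

# The baked amplitude is assumed to live on the Z location of an object
# named "SoundCurve"; the mesh is assumed to have a shape key "SoundShape".
obj = bpy.data.objects["Cube"]
key = obj.data.shape_keys.key_blocks["SoundShape"]

fcurve = key.driver_add("value")            # add a driver on the shape key value
drv = fcurve.driver
drv.type = 'AVERAGE'                        # just pass the single variable through

var = drv.variables.new()
var.name = "amp"
var.type = 'TRANSFORMS'
var.targets[0].id = bpy.data.objects["SoundCurve"]
var.targets[0].transform_type = 'LOC_Z'    # read the baked Z location as the amplitude
```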