Sounds triggered by collision

Hi everyone

As the title says, I'm looking for a way to trigger sound files by a collision.
I recently made a test animation of a few glowing balls falling down stairs. Now I want to put some sound on it, and I came up with the thought: "why match it all up manually when maybe Blender (or a script, but I'm new to Python) could do the work much faster and more precisely than I could ever do?"
So my question is: does such a function/script already exist, or where do I find decent tutorials and docs to maybe write my own script (if possible at all)?

thanks in advance
BK

http://blendit.xaa.pl/?p=middrv&l=eng
It's a script for Blender 2.49, but it might give you an idea.

So how are you going to render your sound?
I don't think Blender does that; even Cinema 4D cannot do this.

Usually it is the other way around, as in that MIDI script: you animate to sound. Just like when doing lip sync, the sound comes first.

If it were me in your case, I'd just edit the sound to the animation in an NLE, assuming the animation is already done. Can't be that hard or tedious.

Can’t be that hard or tedious.
It can be hard and tedious if lots of objects are involved and you want different sounds for different surfaces. Say the objects make a "bloop" sound when they hit the red objects and a "blip" sound when they hit the blue objects.
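A minimal sketch of that surface-to-sound idea; the object names and file paths here are made up:

# Hypothetical mapping from surface object names to sound files.
SURFACE_SOUNDS = {
    "RedStair": "//sounds/bloop.wav",
    "BlueStair": "//sounds/blip.wav",
}

def sound_for(surface_name, default="//sounds/thud.wav"):
    """Pick the sound file for a collision with the named surface."""
    return SURFACE_SOUNDS.get(surface_name, default)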

That is why I asked "How are you going to render it?" Even if you make a Python script that calculates when the sounds are to be played, I am not aware of any mechanism that can actually play a sound to a rendered file. Perhaps the game engine can render a WAVE file output?

@Atom
Admittedly, I don't know of it being done already, but I don't think it's impossible. If you write a script to drive the animation, you can use a corresponding script to drive a synth.
Possibly you could use the animation curve to set the timing, then use the peaks and troughs to determine pitch. Basically, if you can use MIDI to make curves, you can use curves to make MIDI.
Hey, if you really really wanted to, you could use histograms to generate curves too.
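A rough sketch of that curve idea in Blender Python, assuming the physics sim has been baked to location keyframes on an object named "Ball" (a made-up name): scan the Z-location F-Curve for local minima and treat those frames as impacts.

# Sketch only: assumes an object "Ball" with baked location keyframes.
import bpy

def impact_frames(obj_name="Ball"):
    obj = bpy.data.objects[obj_name]
    # Z-location channel of the baked animation
    fcu = obj.animation_data.action.fcurves.find("location", index=2)
    pts = [kp.co for kp in fcu.keyframe_points]  # (frame, value) pairs
    frames = []
    for prev, cur, nxt in zip(pts, pts[1:], pts[2:]):
        if cur.y < prev.y and cur.y < nxt.y:  # local minimum = a bounce
            frames.append(int(cur.x))
    return frames

print(impact_frames())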

I am a complete beginner to Blender, so I'm still discovering what is possible and what is not.

Even if you make a Python script that calculates when the sounds are to be played, I am not aware of any mechanism that can actually play a sound to a rendered file.
Isn't Blender capable of rendering sound with an animation (I thought I saw an audio option in the render menu)? Or do you mean multi-channel sound compositing, so that the different sounds can actually overlap? In the second case, maybe there is a way to composite this like compositing videos with nodes?

@blenderKiddy
Audio in Blender is mainly for syncing animation to sound, the opposite of the video player in a DAW, which is mainly there to let you sync audio with video.

I don't know Python at all, but...
If a collision triggers an event that can be caught by Python, then it should be possible with something like this (pseudocode):

on collision detected:
    retrieve timeline data
    collision_sound = instrument(collision_type)
    write MIDI output: collision_sound, timeline data
done

Obviously the above depends on various bits of data being accessible and translatable into a meaningful MIDI stream, and also on knowing the format of a MIDI file.
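For what it's worth, here is a hedged sketch of that pseudocode in Python, using the third-party mido library (pip install mido) so you don't have to hand-write the MIDI file format yourself. The frame rate, tempo, and the collision-type-to-note mapping are made-up assumptions.

import mido

FPS = 24                 # assumed scene frame rate
TICKS_PER_BEAT = 480
TEMPO = mido.bpm2tempo(120)
NOTE_FOR = {"red": 60, "blue": 64}   # hypothetical collision type -> pitch

def write_midi(events, path="collisions.mid"):
    """events: list of (frame, collision_type) tuples, sorted by frame."""
    mid = mido.MidiFile(ticks_per_beat=TICKS_PER_BEAT)
    track = mido.MidiTrack()
    mid.tracks.append(track)
    track.append(mido.MetaMessage('set_tempo', tempo=TEMPO))
    last_tick = 0
    for frame, ctype in events:
        tick = int(mido.second2tick(frame / FPS, TICKS_PER_BEAT, TEMPO))
        note = NOTE_FOR.get(ctype, 60)
        # MIDI message times are deltas from the previous message
        track.append(mido.Message('note_on', note=note, velocity=100, time=tick - last_tick))
        track.append(mido.Message('note_off', note=note, velocity=0, time=60))
        last_tick = tick + 60
    mid.save(path)

write_midi([(12, "red"), (30, "blue"), (55, "red")])

The resulting .mid file could then be dropped onto a sampler or synth in a DAW.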

Another alternative might be…

If Blender can handle multiple soundtracks, then for each collision create a soundtrack object from a set of sound files at the relevant time point.
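If Blender's sequencer counts as "multiple soundtracks", a minimal sketch of that route could look like this (the frame numbers and the file path are placeholders, and exact API names can vary between Blender versions):

import bpy

def add_collision_sounds(frames, filepath="//sounds/bloop.wav"):
    scene = bpy.context.scene
    if scene.sequence_editor is None:
        scene.sequence_editor_create()
    for i, frame in enumerate(frames):
        # Spread strips over a few channels so overlapping hits don't collide
        scene.sequence_editor.sequences.new_sound(
            name="hit_%d" % i,
            filepath=filepath,
            channel=1 + (i % 4),
            frame_start=frame,
        )

add_collision_sounds([12, 30, 55])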

For anyone who is still interested: I recently came across the same problem, so I made an add-on to do it.
