BGE and Ableton Live?

Is it possible to somehow sync the BGE and Ableton Live?

It would be awesome!

Do you know Live?

I think if you made the BGE appear as a ReWire component to the OS, it might work.

I wish somebody would try to connect them; I think it has so much potential.

How can this be done? Python? Or must it be C++?

I don’t think that anybody here really knows what Ableton is. I found out by looking it up on Wikipedia, but you need to give more information in your posts…

Are you actually just wanting to use music made in Ableton IN Blender? That’s easy if you just export the loop to a .wav file (or another common audio format) and use Python to play it.
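For that simple case, here is a rough, untested sketch of playing an exported loop inside the BGE with Blender’s bundled aud module. The file name music.wav and the controller hookup are made up for illustration; it assumes the file sits next to the .blend:

```python
# Rough sketch: play an exported loop inside the BGE with Blender's 'aud' module.
# Hook this up as a Python "Module" controller fired once (e.g. an Always sensor
# with pulse mode off), so the sound doesn't restart every frame.
import aud
from bge import logic

def play_music(cont):
    device = aud.device()                                   # the global audio device
    factory = aud.Factory(logic.expandPath('//music.wav'))  # load the exported loop
    # keep the handle around so you can stop or restart the sound later
    logic.music_handle = device.play(factory)
```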

I don’t know exactly what/why/how you want to interface between Blender and Ableton, but any of this work would probably have to be done by you, and you could probably do it in Python. Blender does support real-time script editing, and can also read in input from a file, but you’d have to figure out how to export anything from each of the programs. If you’re not a programmer, you probably won’t be able to do this easily, and if you are, it still won’t be easy.

I’d learn Python as a first step, and then figure out what you actually want to do with the two interfaced. Can it be done other ways with JUST Blender or JUST Ableton? This is going to be something that you probably will have to figure out.

-Sam

Hey, I am really interested in doing a realtime audio-visual performance, and I am planning to use Live with Blender.
The answer is YES, there is a way to connect them. There are many Python scripts out there that can receive OSC in the BGE and use it to change properties, positions, whatever you like (see the sketch below).
From Live to OSC, you can either try a custom patch using Max for Live, or you can just use an extra layer such as GlovePIE or Pure Data to transform your MIDI into OSC.
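To give an idea of the BGE end, here is a rough, untested sketch of an OSC receiver you can hook up as a Python "Module" controller running every frame. The port 9001, the address /live/cc/1, and driving the object scale are all made up; match them to whatever your sender uses. It assumes Blender 2.5+ so the bge module is available:

```python
# Minimal OSC receiver for the BGE: a non-blocking UDP socket plus a tiny parser
# that handles one message with a single float ('f') argument.
import socket
import struct

from bge import logic


def _read_padded_string(data, offset):
    # OSC strings are null-terminated and padded to a multiple of 4 bytes.
    end = data.index(b'\x00', offset)
    text = data[offset:end].decode('ascii')
    offset = end + 1
    offset += (-offset) % 4
    return text, offset


def _parse_osc_float(data):
    # Simplest case only: one address, type tag ',f', one big-endian float.
    address, offset = _read_padded_string(data, 0)
    typetags, offset = _read_padded_string(data, offset)
    if typetags.startswith(',f'):
        return address, struct.unpack('>f', data[offset:offset + 4])[0]
    return address, None


def _get_socket():
    # Create the socket once and stash it on the logic module between frames.
    sock = getattr(logic, 'osc_socket', None)
    if sock is None:
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.bind(('0.0.0.0', 9001))   # made-up port; match the sender
        sock.setblocking(False)
        logic.osc_socket = sock
    return sock


def update(cont):
    own = cont.owner
    sock = _get_socket()
    try:
        while True:
            data, _sender = sock.recvfrom(1024)
            address, value = _parse_osc_float(data)
            if address == '/live/cc/1' and value is not None:
                # e.g. pulse the object's scale with the incoming value
                own.localScale = [1.0 + value] * 3
    except socket.error:
        pass   # no more packets waiting this frame
```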

It is probably not the cleanest or most efficient way, but it works!

To summarise: I take the MIDI output from Ableton Live, transform it into OSC in Pure Data, and then receive the OSC in a Python script in the BGE. You will need an internal MIDI routing driver such as MIDI Yoke on Windows.
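If you would rather not run Pure Data, the same MIDI-to-OSC translation could in principle be done by a small standalone Python script. This is only a sketch under the assumption that the python-rtmidi and python-osc packages are installed, that port 0 is your MIDI Yoke / IAC port, and that the addresses are made up:

```python
# Hypothetical stand-in for the Pure Data layer: read MIDI coming out of Live and
# forward each Control Change as an OSC float.
import time

import rtmidi                                      # pip install python-rtmidi
from pythonosc.udp_client import SimpleUDPClient   # pip install python-osc

OSC_HOST = '127.0.0.1'   # machine running the BGE
OSC_PORT = 9001          # must match the port the BGE script listens on

client = SimpleUDPClient(OSC_HOST, OSC_PORT)

midi_in = rtmidi.MidiIn()
print('MIDI ports:', midi_in.get_ports())
midi_in.open_port(0)     # index of the MIDI Yoke / IAC port Live sends to

while True:
    event = midi_in.get_message()      # (message, delta_time) or None
    if event is None:
        time.sleep(0.001)
        continue
    message, _delta = event
    if (message[0] & 0xF0) == 0xB0 and len(message) == 3:   # Control Change
        _status, cc_number, cc_value = message
        # scale 0-127 to 0.0-1.0 and send under a per-CC address
        client.send_message('/live/cc/%d' % cc_number, cc_value / 127.0)
```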

The good thing about using OSC is that you can use two computers connected over a network, such as a Mac running Ableton controlling a Linux PC running Blender.

Cheers

Please let me know of your progress, thanx!

soyouth, sounds like you are a few steps ahead of me… my plan, though, is to use interaction within Blender to trigger events in Live. I’m familiar with the Controller (like TouchOSC) ==> OSC ==> PD ==> MIDI Yoke ==> Live architecture, but haven’t found any docs on how to send OSC from Blender out to PD; there seems to be plenty for getting OSC in, but not out. Have you found anything? And please do let us know how your setup/performance is progressing!
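My rough guess (untested) is that the sending side is just the reverse: hand-pack an OSC message in a BGE Python controller and push it out of a UDP socket. The address /blender/hit and port 9000 below are made up, and the PD end would need whatever object you use to receive UDP/OSC on that port:

```python
# Sketch of sending a single-float OSC message out of a BGE script over UDP.
import socket
import struct

PD_HOST = '127.0.0.1'   # machine running Pure Data
PD_PORT = 9000          # made-up port; match the PD patch

_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)


def _pad(chunk):
    # OSC strings are padded with nulls to a multiple of 4 bytes.
    return chunk + b'\x00' * ((-len(chunk)) % 4)


def send_osc_float(address, value):
    packet = _pad(address.encode('ascii') + b'\x00')   # address pattern
    packet += _pad(b',f\x00')                           # type tag: one float
    packet += struct.pack('>f', value)                  # big-endian argument
    _sock.sendto(packet, (PD_HOST, PD_PORT))


# e.g. call from a collision sensor's Python controller:
# send_osc_float('/blender/hit', 1.0)
```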

Thanks in advance!

m e m e t i c

Hi Soyouth, yeah, I’m sure it’s possible to do this. I have Ableton 8.26 and it has Max for Live, but I’m not sure how to get it all working. Your suggestions are great, but I’m not sure where to begin, and I don’t really understand all of what you say. Can you provide a template with some instructions that runs out of the box, just to illustrate the basics?

Thanks,