Blender Game Engine interface to animatronics

I'm working on an animatronic project for Halloween. I have a simple mechanical armature with a few servos and some lights, driven by a Lynxmotion SSC-32 servo controller. I've tried all the animatronic software I can find, but it all seems very limited: a single timeline, only linear interpolation between keys, no notion of inputs, etc.
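For reference, the output side is simple: the SSC-32 takes short ASCII commands over a serial port. A minimal example with pyserial (the port name and baud rate are just my setup; the SSC-32's baud is set by jumpers):

```python
import serial  # pyserial

# Port and baud rate depend on wiring and the SSC-32's baud jumpers.
ssc32 = serial.Serial("/dev/ttyUSB0", 115200)

def move_servo(channel, pulse_us, time_ms=500):
    """Move one servo: pulse width 500-2500 us, 1500 us is centered."""
    ssc32.write(("#%dP%dT%d\r" % (channel, pulse_us, time_ms)).encode())

move_servo(0, 1500)  # center servo channel 0 over half a second
```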
Blender's IPO window is orders of magnitude more advanced, so I started to wonder whether I could use the Game Engine to drive output to the servo controller. I have no experience with the game engine, so I'm looking for feedback anywhere from "that's crazy" to "it should work". Is this a practical idea, or am I trying to twist Blender into a use it's not suited for?

I see that the BGE has a concept of sensors, and that I can attach a Python script to an Always sensor so that it runs every frame. Great, so I should be able to find the bones I care about, read their rotations, and translate those into commands for my servo controller. Any gotchas with that idea?
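To make the question concrete, here's roughly what I'm imagining on the Always sensor (untested; it assumes a BGE version whose armature objects expose channels, and the bone names and angle-to-pulse mapping are invented for my rig):

```python
import bge
import serial  # pyserial

# Hypothetical bone-name -> SSC-32 channel mapping for my armature.
SERVO_MAP = {"jaw": 0, "neck_pan": 1}

# Keep one open serial port across frames in the BGE's global dictionary.
if "ssc32" not in bge.logic.globalDict:
    bge.logic.globalDict["ssc32"] = serial.Serial("/dev/ttyUSB0", 115200)

def angle_to_pulse(radians):
    # Map roughly -90..+90 degrees onto the 500-2500 us servo range.
    radians = max(-1.5708, min(1.5708, radians))
    return int(1500 + radians / 1.5708 * 1000)

def update(cont):
    # Module-mode entry point, fired every frame by a true-pulse Always sensor.
    armature = cont.owner  # the script lives on the armature object
    cmd = ""
    for channel in armature.channels:  # BL_ArmatureChannel objects
        if channel.name in SERVO_MAP:
            # joint_rotation is an euler-style vector; my joints each bend
            # on a single axis (I may need pose_matrix instead; to be tested).
            pulse = angle_to_pulse(channel.joint_rotation.x)
            cmd += "#%dP%d" % (SERVO_MAP[channel.name], pulse)
    if cmd:
        bge.logic.globalDict["ssc32"].write((cmd + "T50\r").encode())
```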

Is it possible to code new sensor types easily?

I see that the BGE tries to make it easy to graphically connect sensors, controllers, and actuators, which looks useful, but I'm a programmer: is there a way to do the same thing more simply and flexibly from Python? It seems like my per-frame Python script could just poll the keyboard and so on, and start actions without having to wire up all the blocks.
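For instance, something like this is what I'm hoping for (a sketch against the bge.logic.keyboard API in 2.5x-era Blender; the action name and frame range are placeholders):

```python
import bge

def update(cont):
    # Run in Module mode so the controller is passed in each frame.
    kb = bge.logic.keyboard
    # Poll a key directly instead of wiring up a Keyboard sensor brick.
    if kb.events[bge.events.SPACEKEY] == bge.logic.KX_INPUT_JUST_ACTIVATED:
        cont.owner.playAction("Wave", 1, 60)  # placeholder action/frames
```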

I see how you can fire an action and start a sound from one controller connected to two actuators. Can I also do that from the Python API?

Is there any way to associate an action with a sound other than triggering two actuators at the same time?
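If both of those calls work from a script, I'd guess the association could just live in my own data rather than in logic bricks. A sketch of what I mean (untested; playAction and the aud module are what I'd try, and all the names and paths are placeholders):

```python
import aud
import bge

# My own pairing of actions with sounds, instead of two wired actuators.
ROUTINES = {
    "joke_01": ("Joke01Action", "//sounds/joke_01.wav"),
}

def play_routine(obj, name):
    action, sound_path = ROUTINES[name]
    obj.playAction(action, 1, 120)  # placeholder frame range
    # aud is Blender's audio module; Factory loads a sound file in 2.5x/2.6x.
    aud.device().play(aud.Factory(bge.logic.expandPath(sound_path)))
```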

My goal is to have a set of routines (jokes, for example) that run at random, but if a sensor is triggered, have the character switch to a specific routine.
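Roughly this, in the per-frame script (untested; the sensor, routine names, and frame ranges are all invented):

```python
import random
import bge

IDLE_ROUTINES = ["joke_01", "joke_02", "joke_03"]  # placeholder names
SCARE_ROUTINE = "lunge"

def update(cont):
    owner = cont.owner
    trigger = cont.sensors["Motion"]  # e.g. a Near sensor on the character

    if trigger.positive:
        # An input preempts whatever routine is currently playing.
        owner.playAction(SCARE_ROUTINE, 1, 90)
    elif not owner.isPlayingAction():
        # Otherwise pick the next joke at random as each one finishes.
        owner.playAction(random.choice(IDLE_ROUTINES), 1, 120)
```

Does that look sane, or is there a better pattern for this in the BGE?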

Thanks for any input or suggestions.


Chris