Real-time mocap + softbody stuff; game engine or not?

I’m planning to embark on a crazy project to use a Kinect to control armatures. The mocap side of it is more or less irrelevant though, because once I grab the data via Python, the rest of the pipeline won’t care where it came from. What I’m wondering about is this:

I want at least low-quality soft-body simulation that I can render in realtime, but also the ability to go back and re-render at much higher quality later, without the realtime constraint. Mostly things like hair, where the exact behaviour doesn’t have to match between the low-quality realtime version and the later high-quality render, but also some things that actually impede movement (avatar has horns = hands should not clip through them as long as you don’t push too far). I also want to take the resulting poses / rendered images and export them in realtime.
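Roughly what I picture on the Python side, regardless of where the tracking data actually comes from (just a sketch of the idea, not working code — the armature name and get_kinect_pose() are placeholders, and this is the plain bpy route rather than the game engine):

```python
import bpy

def apply_pose(arm_obj, pose):
    """Apply {bone_name: (w, x, y, z)} quaternions to an armature's pose bones."""
    for name, quat in pose.items():
        pbone = arm_obj.pose.bones.get(name)
        if pbone is None:
            continue  # skip bones the tracker doesn't know about
        pbone.rotation_mode = 'QUATERNION'
        pbone.rotation_quaternion = quat

# e.g. called every tick from a timer / modal operator with whatever
# the Kinect reader produces:
# apply_pose(bpy.data.objects["Armature"], get_kinect_pose())
```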

Is the game engine the right tool for this, given its inherently realtime nature, or should I avoid it for other reasons? Either way, I understand that the code involved will be massive.

Thanks!

The BGE does support soft-body physics, as well as the option to record game engine physics to an IPO for later rendering / use, so perhaps it will meet your goals. It might be worth running a small portion of the data through first to test the process.
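If you do go the BGE route, driving the armature from your Kinect data and dumping the per-frame pose for later high-quality rendering could look something like this. Untested sketch: it assumes the bge armature channel API (2.5+; older builds use GameLogic instead), get_kinect_pose() is a placeholder for your tracker code, and the script would be attached as a Python controller on the armature with an Always sensor in true pulse mode:

```python
import json
from bge import logic

def get_kinect_pose():
    """Placeholder for whatever the tracker feeds in: {bone_name: (w, x, y, z)}."""
    return {}

def drive_and_record(cont):
    arm = cont.owner                      # the armature this controller sits on
    pose = get_kinect_pose()

    for channel in arm.channels:          # one BL_ArmatureChannel per bone
        quat = pose.get(channel.name)
        if quat is not None:
            channel.rotation_quaternion = quat
    arm.update()                          # re-evaluate the pose on the next frame

    # keep a log of the applied pose so the same motion can be re-rendered offline
    logic.globalDict.setdefault("pose_log", []).append(
        {ch.name: list(ch.rotation_quaternion) for ch in arm.channels})

def save_log(cont):
    # hook this to e.g. a keyboard sensor to flush the log when you're done
    path = logic.expandPath("//pose_log.json")   # saved next to the .blend file
    with open(path, "w") as f:
        json.dump(logic.globalDict.get("pose_log", []), f)
```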