Hello Blender Artists!
For the past year I’ve been developing an application called NextStage that turns the Kinect into a real-time virtual production camera.
By tracking retroreflective markers, NextStage can instantly find the Kinect’s position and rotation. NextStage can also separate actors and objects from the background without the need for a greenscreen, record uncompressed RGBA video, export data to sync an external camera, and stream the tracking data out to other applications.
There are two versions. NextStage Lite is available for free, and NextStage Pro can export tracking data as a Collada .dae file. NextStage Pro also comes with a built-in preset so that the tracking data can be easily imported into Blender.
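Since Collada files are plain XML, the exported tracking data can also be inspected or post-processed outside Blender. Here's a minimal sketch (my own illustration, not NextStage's actual export layout) that reads the per-node 4x4 transform matrices from a Collada 1.4 scene; animated camera tracks would live in `<library_animations>`, which is not shown:

```python
# Sketch: read node transform matrices from a Collada .dae file.
# Assumes the Collada 1.4 schema with per-node <matrix> elements;
# the real NextStage export may structure its scene differently.
import xml.etree.ElementTree as ET

NS = {"c": "http://www.collada.org/2005/11/COLLADASchema"}

def camera_matrices(dae_path):
    """Return a list of 4x4 row-major matrices, one for each scene
    <node> that carries a <matrix> transform."""
    root = ET.parse(dae_path).getroot()
    matrices = []
    for node in root.iterfind(".//c:node", NS):
        m = node.find("c:matrix", NS)
        if m is not None:
            vals = [float(v) for v in m.text.split()]
            # Collada stores matrices as 16 floats in row-major order
            matrices.append([vals[i:i + 4] for i in range(0, 16, 4)])
    return matrices
```

Inside Blender, the same data would normally arrive through the stock Collada importer, with the NextStage preset mapping it onto a camera object.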
For the beta release, NextStage supports only .obj files as virtual sets, and lighting effects need to be pre-baked. All of the demo virtual sets that ship with NextStage were pre-baked using Cycles (Suzanne the monkey makes a cameo in the intro video).
I’ve been developing this application pretty much in a vacuum, but I’m very excited to finally get it out into the world. Please let me know if you have any questions, comments or concerns.