Blender as a virtual studio application (Brainstorm eStudio example)

Hi all!
I'm new to this forum, but I have been playing around with Blender (mostly modelling and set design) for a while now. I recently started working on a project to mimic the functionality of a piece of software called eStudio by Brainstorm.

I'm a film student and I have done some projects at my school where we filmed live footage and combined a virtual set with actors using eStudio. In my own project I want to do the same thing with open source tools. Since Blender already has a lot of the functionality I need for this, it has become the primary tool for it.

Here's the basic plan: I have a camera which is filming an actor in front of a green screen, and I have the background for the scene open in Blender (here I can add all kinds of things like animations or static backgrounds rendered with the game engine). I then combine these two images by bringing them into OBS Studio, and voilà! I have an actor moving in a virtual environment in real time.

This all works fine with a static camera, but I have had some issues with a moving camera.

First of all there is the motion tracking of the real camera to the virtual camera. My school uses infrared cameras to track motion, but I can't afford them, so I've been trying to use an MPU6050 with an Arduino to capture movement. This all works OK on objects, but for some reason the camera always stays static, even when parented to an object or armature.
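
To make the idea concrete, this is roughly what I would expect to work as an alternative to parenting: write the incoming rotation straight onto the camera each logic tick. Only an untested sketch; "imu_quat" is a placeholder for wherever the serial-reading script stores the latest reading:

```python
import bge
from mathutils import Quaternion

def orient_camera(cont):
    # attach this controller to the camera object itself
    cam = cont.owner
    # "imu_quat" = (w, x, y, z), left in globalDict by whatever reads the serial port
    quat = bge.logic.globalDict.get("imu_quat")
    if quat is None:
        return
    cam.worldOrientation = Quaternion(quat).to_matrix()
```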

Capturing the translational movement is also something I haven't solved yet; the MPU only captures rotational movement.

I've been floating around the idea of using something similar to the mouse-look logic, but instead of rotation it would control movement.
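
What I picture is something along these lines, borrowing the shape of the usual BGE mouse-look scripts but feeding the mouse delta into applyMovement() instead of applyRotation(). Just a sketch, and the sensitivity value is arbitrary:

```python
import bge

def mouse_move(cont):
    obj = cont.owner
    mouse = bge.logic.mouse

    width = bge.render.getWindowWidth()
    height = bge.render.getWindowHeight()

    # mouse.position is normalised 0.0-1.0, so measure the offset from centre
    dx = (mouse.position[0] - 0.5) * width
    dy = (mouse.position[1] - 0.5) * height

    sensitivity = 0.01
    # x delta -> sideways, y delta -> forward/back, in the object's local space
    obj.applyMovement((dx * sensitivity, 0.0, -dy * sensitivity), True)

    # re-centre the cursor so the next reading is a fresh delta
    bge.render.setMousePosition(width // 2, height // 2)
```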

A Kinect might also be something I will explore.

Have any of you tried something similar or played around with inertial motion capture?

I have a GitHub repo for the MPU test, but I can't post links yet, so if you want to look at it just search for MPU6050Arduino-blender by user Bjaza.

Any thoughts, ideas, advice or suggestions are very welcome.

Thanks!

We really need a design for a cheap 3D printed exoskeleton that measures joint angles.
If you had this with a dead-reckoning sensor in the core, you could use the position/angle of the core plus the angles of the joints to drive an actor armature in game.

Then one could act the way they did when James Cameron made Avatar.

Another issue is force feedback :3

Most of these kinds of projects that I've heard of are using the Unreal Engine, but as a Linux user myself I'd rather use Blender. I've done some reading about the BlenderVR project; maybe this could also provide some useful things. Did some more googling and found a company called Yost Labs. They have some 3-Space sensors and a plugin for Blender.

The dead-reckoning sensor also sounds interesting. I'll have to read up on this… :slight_smile:

I managed to compile OpenCV against the Python and NumPy versions used by Blender 2.76b (Python 3.4, NumPy 1.9.1 I believe). Then you get access to the whole OpenCV library from Python scripting inside the BGE. It probably also has features for motion tracking.
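
As a rough sketch of what that looks like from a Python controller (the device index and the use of globalDict are just one way to do it):

```python
import bge
import cv2

def init_camera(cont):
    # open the capture once and keep it in globalDict for later ticks
    if "cap" not in bge.logic.globalDict:
        cap = cv2.VideoCapture(0)      # device index 0, e.g. the PS Eye
        bge.logic.globalDict["cap"] = cap
        print("OpenCV", cv2.__version__, "camera opened:", cap.isOpened())

def grab_frame(cont):
    cap = bge.logic.globalDict.get("cap")
    if cap is None:
        return
    ok, frame = cap.read()             # frame is a BGR numpy array when ok is True
    if ok:
        print("frame shape:", frame.shape)
```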

I only got as far as tracking colored objects. Fast movement especially tends to be a blur on most cameras, and then color is the only robust and simple method I found for fast movement without any strict requirements on the background. With a green screen you might place some boxes in view that you can use for position tracking…
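
The tracking itself is nothing fancy, basically an HSV threshold and the centroid of the mask, roughly like this (the HSV bounds are placeholders you would tune for your own markers):

```python
import cv2
import numpy as np

# example bounds for a blue-ish marker; tune these for whatever boxes are in view
LOWER = np.array([100, 120, 70])
UPPER = np.array([130, 255, 255])

def track_marker(frame):
    """Return the (x, y) pixel centroid of the colored marker, or None."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER, UPPER)
    m = cv2.moments(mask)
    if m["m00"] == 0:
        return None                    # marker not found in this frame
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])
```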

Another thing: a fixed frame rate is a must, as requesting a frame before it is ready will really slow down the logic. And if the frame rate depends on lighting conditions, this will have to be adapted in real time. I just use a PS Eye camera, with the camera logic running at 29.5 fps.
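
For the frame-rate side, something along these lines keeps the reads from stalling; whether CAP_PROP_FPS is actually honoured depends on the camera driver, and this assumes an OpenCV 3.x build:

```python
import time
import cv2

cap = cv2.VideoCapture(0)
cap.set(cv2.CAP_PROP_FPS, 30)          # ask the driver for a fixed rate

FRAME_TIME = 1.0 / 29.5                # the rate the game logic is tuned for
last_grab = 0.0

def maybe_grab():
    """Return a new frame, or None if it is too early to ask for one."""
    global last_grab
    now = time.monotonic()
    if now - last_grab < FRAME_TIME:
        return None                    # a read here would stall the logic
    last_grab = now
    ok, frame = cap.read()
    return frame if ok else None
```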

Also, you mentioned the camera is not rotating as expected inside the BGE, if I understood correctly. What input are you giving the BGE for that?

Right now I have the MPU6050 controlling a bone, and I have tried:
parenting the camera to the bone,
parenting the camera to a cube that is parented to the bone,
and using a cube parented to the bone as a camera.
Every time, the camera does not move.

I'm currently using a modified script from the MotioSuit project by Alvaro Ferran to import the MPU6050 data as a quaternion. I'm still learning Python but getting there :slight_smile:
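
The core of that kind of script looks roughly like this; to be clear this is only a sketch, and the port, baud rate and bone name below are placeholders rather than what my file actually uses:

```python
import bge
import serial                      # pyserial, installed for Blender's Python
from mathutils import Quaternion

PORT = "/dev/ttyUSB0"              # placeholder, adjust to your Arduino port

def get_serial():
    # open the port once and keep it for the rest of the game session
    if "ser" not in bge.logic.globalDict:
        bge.logic.globalDict["ser"] = serial.Serial(PORT, 115200, timeout=0.01)
    return bge.logic.globalDict["ser"]

def drive_bone(cont):
    armature = cont.owner          # this controller sits on the armature object
    line = get_serial().readline().decode("ascii", errors="ignore").strip()
    if not line:
        return
    try:
        w, x, y, z = (float(v) for v in line.split(","))
    except ValueError:
        return                     # incomplete line, just skip this tick
    channel = armature.channels["CameraBone"]   # placeholder bone name
    channel.rotation_quaternion = [w, x, y, z]
    armature.update()              # push the new pose to the viewport this frame
```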

I've managed to get around this problem: instead of moving the camera, I'm moving the world around it :slight_smile:
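
In practice the workaround just means writing the inverse of the tracked rotation onto a parent object that the whole set is attached to, roughly like this (the "SetRoot" name and the globalDict key are placeholders):

```python
import bge
from mathutils import Quaternion

def rotate_world(cont):
    scene = bge.logic.getCurrentScene()
    set_root = scene.objects["SetRoot"]          # parent of the whole virtual set
    quat = bge.logic.globalDict.get("imu_quat")  # filled in by the serial reader
    if quat is None:
        return
    # rotate the set by the inverse so the fixed camera appears to turn
    set_root.worldOrientation = Quaternion(quat).inverted().to_matrix()
```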

I would post the GitHub link, but I'm not up to ten posts yet…

Oh, and the Arduino is loaded with an edited version of the MPU6050 DMP example code originally made by Jeff Rowberg, so that instead of sending /0.0/0.0/0.0/0.0/ it sends 0.0, 0.0, 0.0, 0.0.
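
Parsing either format on the Blender side is just a string split, something like:

```python
def parse_quaternion(line):
    """Turn one serial line into a (w, x, y, z) tuple.

    Handles both the slash-separated form and the comma-separated one.
    """
    line = line.strip()
    if line.startswith("/"):
        parts = [p for p in line.split("/") if p]
    else:
        parts = line.split(",")
    return tuple(float(p) for p in parts)

# parse_quaternion("/1.0/0.0/0.0/0.0/")  -> (1.0, 0.0, 0.0, 0.0)
# parse_quaternion("1.0, 0.0, 0.0, 0.0") -> (1.0, 0.0, 0.0, 0.0)
```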

You possibly need to add the Run Armature actuator to the armature. Maybe even a single animation to wake it up.
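
If the actuator ends up wired to a Python controller, firing it each tick is a one-liner; the actuator name below is just whatever it gets called in the logic bricks:

```python
def update_pose(cont):
    # fire the Run Armature actuator connected to this Python controller
    cont.activate(cont.actuators["RunArmature"])
```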

I added the Run Armature actuator; still no movement from the camera. Maybe I'm doing something wrong (very possible), but I'll keep trying. The workaround still works OK for now :slight_smile: