Virtual Camera experiment, Kinect and NI mate Headtracking

I've been doing Kinect tests recently and had this idea of using hand movements to control a camera in Blender.
It's meant to imitate the process used to shoot scenes in the movie Avatar.

I've had my eye on NI mate for a while and decided to use it for the experiment.

NI mate was used to control a rig in Blender that drives
the camera’s position, angle, and focal length.
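NI mate streams skeleton joints over the network as OSC messages. As a rough illustration of the data involved, here's a simplified, text-based stand-in for that joint stream (hypothetical format; the real add-on parses binary OSC, and the joint name and coordinates below are made up):

```python
def parse_joint_message(msg):
    """Split a simplified '/joint x y z' message into a joint name
    and an (x, y, z) position tuple."""
    parts = msg.split()
    name = parts[0].lstrip("/")
    position = tuple(float(v) for v in parts[1:4])
    return name, position

# Example: a head-joint update arriving from the tracker.
name, pos = parse_joint_message("/head 0.12 -1.85 1.63")
print(name, pos)  # head (0.12, -1.85, 1.63)
```

Each incoming update like this would move the corresponding object in the scene, which is what lets the rig follow the performer in real time.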

Here's the rig setup I came up with:


The rig is constrained to empties that act as trackpoints;
these trackpoints are then driven by the user's tracked head and arms.
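The trackpoint idea can be sketched in plain Python (a hypothetical simplification, not the actual rig logic; in Blender the empties would be targeted by constraints such as Copy Location and Track To):

```python
def joints_to_trackpoints(head, left_hand, right_hand):
    """Map tracked joints (x, y, z tuples) to two trackpoint positions.

    In this sketch, the camera empty follows the head, and the aim
    empty sits at the midpoint between the hands, so moving the arms
    steers where the camera looks.
    """
    camera_point = head
    aim_point = tuple((l + r) / 2.0 for l, r in zip(left_hand, right_hand))
    return camera_point, aim_point

cam, aim = joints_to_trackpoints(
    head=(0.0, -2.0, 1.7),
    left_hand=(-0.4, -1.5, 1.2),
    right_hand=(0.4, -1.5, 1.2),
)
print(cam)  # (0.0, -2.0, 1.7)
print(aim)  # (0.0, -1.5, 1.2)
```

The nice part of routing everything through empties is that the camera constraints never need to know about the tracker; any source of positions can drive them.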

Here are the possible gestures for controlling the camera:


The test went smoothly, and now we're planning to use it on our animated short.
Finally, here's a short video showing it in action.

You can visit my blog to see the other tests I've made;
here's a link to my original blog post.

There's a feature test there showing Johnny Lee's amazing Wii head-tracking hack, but done with a Kinect sensor.
This concept was actually inspired by it.

I hope you guys enjoyed the video. :yes:

This looks really sweet, man; you should pitch it for an ad installation.

This is sweet, but I've always wanted something close to Avatar's technology:



That could be done with a Wiimote Plus and an Android tablet remoting into a PC or something; that would be awesome.

I think I already saw someone do that with a tablet, but I couldn't find the link. He used real-time camera tracking to achieve it.

@fernan Yeah, we were actually thinking about that, like one of those augmented-reality installations with a projector.

OK, I made another test. This one uses an Android phone as a display, running through a VNC client.

This was actually accidental, one of life’s unspoken mysteries.

While I was browsing for apps in the app store, I stumbled upon RDP and VNC clients that let you control your PC over the internet or a wireless network. I've seen them on PCs, but I never really thought they'd exist on a smartphone. It works great with the Kinect setup.

So there you have it, James Cameron’s Virtual Camera on a phone. :wink:

So could you record the room that you’re in with the phone’s camera, while at the same time recording the virtual camera movement in Blender, thus negating the need for the normal camera tracking? Is that the goal of this type of thing? Forgive my ignorance of the subject.

Steve S

Hi Steve, the initial goal of these tests was to create methods for rapid animation.
I'm not using the phone's camera; I'm just using the phone as a display. The Kinect sensor does all the work of tracking my orientation.

I just made a camera for this; check it out:


You can find more info about it on my blog: http://marwinportugal.wordpress.com/?p=217&preview=true

Video, please!! It looks awesome :slight_smile:

I'll be testing it out today and I'll post you guys some vids; hope it turns out well. The camera is kinda huge, and I'm thinking I might have some problems with occlusion: the sensor might not recognize my body parts with it in front of me.

Here's the camera UI:


Sorry, no video yet; I haven't tested it with the Kinect
because I wasn't able to go to the office.

Anyway, the controls work :smiley:

This is actually version 1.2. I added a mouse wheel to the unit that controls the
camera's focal length; really cool. I'll post some pics soon of the changes I've made.

The camera's focal length ranges from 11 to 150 mm; I searched Google for standard lenses used for video recording, and this is what it gave me.
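The wheel-to-lens mapping could look something like this (a hypothetical sketch; the `step_mm` sensitivity is an assumed value, not the one actually used, and only the 11-150 mm clamp comes from the post above):

```python
def wheel_to_focal_length(steps, lens_min=11.0, lens_max=150.0, step_mm=2.0):
    """Map accumulated mouse-wheel steps to a focal length in mm,
    clamped to the 11-150 mm range."""
    lens = lens_min + steps * step_mm
    return max(lens_min, min(lens_max, lens))

print(wheel_to_focal_length(0))    # 11.0 (widest)
print(wheel_to_focal_length(25))   # 61.0
print(wheel_to_focal_length(999))  # 150.0 (clamped at the long end)
```

In Blender the resulting value would just be written to the camera's lens setting each time the wheel moves.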