Importing textures in real time from a camera

Hello all, I'm currently working on a school-related project in which my collaborators and I are trying to create a game demo with the Microsoft Kinect. NI mate has proven to be exactly what I need, since it does skeleton tracking and such.

For our project, we are trying to get an image of the environment and map it to a texture (we want to present the user's environment on screen with 3D objects added, and the user replaced with a game model).

Is there a simple way to read in a texture in real time, or even simply use it as a background?

This realtime image could come from the Kinect camera (currently using PrimeSense's drivers) or possibly from OpenCV.

Any help, examples or advice would be useful!

I realize this is kind of a difficult question, so thanks in advance!

-Andrew Miller

You could try to do that with VideoTexture.
Search for "Mirror" or similar; there are example files here.
This does exactly what you want: it renders an image and wraps it onto a Plane or whatever.

Okay, that may help. I'm really just trying to figure out where to start looking…

The texture needs to be rendered from an actual physical camera, or come as input from some other software, but I think that may help me get started.

I will have to see if there is a way to use an externally rendered image rather than one rendered within Blender.

Thanks!

I might be wrong due to inexperience with cameras, but:
I believe the camera writes an image to the hard drive, which gets updated all the time, so VideoTexture could load that image. Then there would be no real difference between a Blender-internal render and a "real" camera setup!

Isn't there a thread about realtime cameras in the Blender Game Engine? Something about head tracking, I think.
It might be helpful.

I have actually solved this problem! The camera doesn't render to the hard drive; writing a frame to disk every time would be far too slow for realtime video. Instead, the data is gathered directly by the program using it. I don't know exactly how it works internally, but it never touches the disk.

In order to set up a link to a webcam, you load the bge.texture module in Python (previously the VideoTexture module) and set the texture's source to VideoFFmpeg('cam', 0, 20, 640, 480), I believe. The API is documented at this link: http://www.blender.org/documentation/blender_python_api_2_62_0/bge.texture.html
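For anyone following along, here is a minimal sketch of that setup, assuming an object with a material named "VideoMat" (bge.texture.materialID expects the name with the 'MA' prefix) and a script run every frame from an Always sensor in module mode. The material name and the ('cam', 0, 20, 640, 480) parameters are just the values from my setup; adjust them for yours. This only runs inside the Blender Game Engine, so treat it as a starting point rather than a drop-in script:

```python
# Runs inside the BGE, e.g. wired to an Always sensor (true pulse) in module mode.
import bge
from bge import texture, logic

def update():
    obj = logic.getCurrentController().owner
    # Create the texture link once and keep it alive on the logic module,
    # otherwise it is garbage-collected and the texture reverts.
    if not hasattr(logic, "video"):
        mat_id = texture.materialID(obj, "MAVideoMat")   # 'MA' + material name
        logic.video = texture.Texture(obj, mat_id)
        # ('cam', 0, 20, 640, 480): capture device 0, 20 fps, 640x480
        logic.video.source = texture.VideoFFmpeg("cam", 0, 20, 640, 480)
        logic.video.source.play()
    # Pull the next camera frame into the texture every logic tick.
    logic.video.refresh(True)
```

The key detail is storing the Texture object somewhere persistent (here, on bge.logic); that tripped me up when porting the 2.49 demos.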

I found several demos written for 2.49, so I had to change some code to match the updated 2.6x API.

Now I'm going to attempt to install OpenCV, retrieve an image from the Microsoft Kinect, do some processing on it, and then display it as a texture.
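The rough plan, sketched below, is to grab a frame with OpenCV and hand the pixels to the BGE through bge.texture.ImageBuff. This is untested on my end: it assumes OpenCV was built with OpenNI support (cv2.CAP_OPENNI) so the Kinect shows up as a capture device, and the "MAVideoMat" material name is again just a placeholder from my scene:

```python
# Sketch: Kinect frame via OpenCV's OpenNI backend -> BGE texture.
# Assumes OpenCV built with OpenNI support; runs inside the BGE.
import cv2
from bge import texture, logic

capture = cv2.VideoCapture(cv2.CAP_OPENNI)   # open the Kinect through OpenNI

def update():
    obj = logic.getCurrentController().owner
    if not hasattr(logic, "tex"):
        mat_id = texture.materialID(obj, "MAVideoMat")  # placeholder material name
        logic.tex = texture.Texture(obj, mat_id)
        logic.tex.source = texture.ImageBuff()          # image fed from memory
    if capture.grab():
        # Retrieve the color stream (not the default depth map).
        ok, frame = capture.retrieve(flag=cv2.CAP_OPENNI_BGR_IMAGE)
        if ok:
            frame = cv2.flip(frame, 0)                  # BGE fills bottom row first
            rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
            h, w = rgb.shape[:2]
            # ...any OpenCV processing would go here...
            logic.tex.source.load(rgb.tobytes(), w, h)
            logic.tex.refresh(False)
```

If the OpenNI backend doesn't work out, a plain cv2.VideoCapture(0) on the Kinect's RGB stream (where the drivers expose it as a webcam) should follow the same ImageBuff path.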

I've spent all day trying to install it and get it working with Python, but I've been having some issues… I'm going to try again tomorrow. Thanks for the help! I'll keep you all posted.