Currently it uses an ugly RAM-drive hack, which I hope can be replaced by posix_ipc and setting the image buffer directly (as soon as the bpy API supports setting a buffer directly). Source code is in the blog post.
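For readers wondering what the posix_ipc route would look like: the idea is just a pixel buffer shared between the OpenCV process and Blender instead of files on a RAM drive. A minimal stand-alone sketch, using the stdlib's `mmap` over a `/dev/shm` file as a stand-in (posix_ipc isn't in the standard library; path and frame size here are illustrative, not from the actual script):

```python
# Sketch of the shared-memory transport (an illustration, not the blog's
# code). A file on /dev/shm (Linux tmpfs) mapped with mmap behaves like a
# POSIX shared-memory segment: both processes see the same bytes, no disk.
import mmap
import os
import tempfile

WIDTH, HEIGHT, CHANNELS = 640, 480, 3       # assumed frame geometry
FRAME_SIZE = WIDTH * HEIGHT * CHANNELS

SHM_DIR = "/dev/shm" if os.path.isdir("/dev/shm") else tempfile.gettempdir()
SHM_PATH = os.path.join(SHM_DIR, "blender_frame")  # hypothetical name

def open_shared_frame(path=SHM_PATH, size=FRAME_SIZE):
    """Create (or reuse) the shared buffer; both processes call this."""
    fd = os.open(path, os.O_CREAT | os.O_RDWR)
    os.ftruncate(fd, size)
    buf = mmap.mmap(fd, size)   # MAP_SHARED by default
    os.close(fd)                # the mapping keeps its own reference
    return buf

# Producer (the OpenCV process) writes a frame -- here a dummy pattern:
producer = open_shared_frame()
producer[:] = b"\x7f" * FRAME_SIZE

# Consumer (the Blender process) maps the same file and reads the pixels:
consumer = open_shared_frame()
frame = consumer[:FRAME_SIZE]
```

The missing piece, as the post says, is on Blender's side: bpy would still need a way to accept that raw buffer as image pixels.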
I tried to do something very similar to that (getting OpenCV data into Blender 2.5's game engine). Let us know if you get the buffer approach working; it would be far better than the RAM-disk approach. How are you animating the texture in the viewport? Force reload on each frame?
mpan3, a buffer should work for the BGE; it already has support as far as I know, but I'm not working with the BGE so I haven't tried it. Yes, forced reload.
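The forced-reload trick boils down to watching the image file and reloading the texture whenever the other process rewrites it. A small sketch of that mechanic (names are illustrative; inside Blender the callback would be something like `bpy.data.images["webcam"].reload()` hooked into a frame-change handler):

```python
# Poll an image file's mtime once per frame and fire a callback when the
# producer process has rewritten it. Plain stdlib; the Blender-specific
# part is only what you put in on_change.
import os

class ReloadWatcher:
    def __init__(self, path, on_change):
        self.path = path
        self.on_change = on_change
        self._last_mtime = None

    def poll(self):
        """Call once per frame; runs on_change when the file changed."""
        try:
            mtime = os.path.getmtime(self.path)
        except OSError:
            return False          # file not written yet
        if mtime != self._last_mtime:
            self._last_mtime = mtime
            self.on_change()
            return True
        return False

# Demo with a temporary file standing in for the RAM-drive image:
import tempfile

events = []
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"frame-0")
    path = f.name

watcher = ReloadWatcher(path, lambda: events.append("reload"))
watcher.poll()                       # first sight of the file -> reload
watcher.poll()                       # unchanged -> nothing happens
os.utime(path, (1, 1_000_000_000))   # simulate a new frame being written
watcher.poll()                       # mtime changed -> reload again
os.unlink(path)
```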
I have code already in place using posix_ipc that will remove the need for the RAM drive, so it's just a matter of time, I think, but that's on Blender's end: someone needs to modify the Blender Python API to accept a string buffer. I tried a workaround using OpenGL and the image's bindcode, which appears to almost work: it does clear the current texture from the object, but it seems like it won't update. See the code below.
goathead, I did use ARToolKit in Blender, which was quite similar.
My approach was to use the Blender game engine's VideoTexture capabilities to grab images, then pass the buffer to ARToolKit with the help of a PIL conversion.
I also used python-opencv a lot, but I didn't display the video in Blender; I just used it for tracking, from inside a game engine script.
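The PIL step pildanovak mentions is just wrapping the raw pixel bytes a VideoTexture source hands back into a PIL image so a tracker gets something it understands. A rough sketch under assumed sizes and pixel format (this is an illustration, not his actual code):

```python
# Wrap raw RGBA bytes (as a bge.texture source would provide them) in a
# PIL image, then convert to whatever the tracking library expects.
from PIL import Image

WIDTH, HEIGHT = 320, 240                       # assumed capture size

# Stand-in for the buffer grabbed from the game engine:
raw_rgba = b"\x10\x20\x30\xff" * (WIDTH * HEIGHT)

img = Image.frombytes("RGBA", (WIDTH, HEIGHT), raw_rgba)
rgb = img.convert("RGB")    # many trackers want 3-channel input
gray = img.convert("L")     # or grayscale for feature detection
```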
pildanovak, I've also used ARToolKit; too bad it's not well maintained anymore. The benefit of doing it like this, outside of the main Blender process, is that OpenCV can run unblocked on its own core, and then as many effects as one wants can be used, even Haar face detection. I haven't really touched the BGE, but as far as I know my script should be easy to modify for the BGE if somebody wants to try hacking it.