Hi,
I would like to show our recent projects using multitouch technologies — some interactive works, but the whole moon 3D world is running on the Blender engine.
Looks impressive!
What exactly is the interaction in the moon 3D world, and what kind of BGE sensors are you using for this multi-touch screen technology?
Great project you have there.
Just to confirm though: you haven't configured Blender itself to be multi-touch, but another multi-touch app / screen is sending signals to Blender to control it?
Either way, it is a very impressive graphical demo of Blender in use within a commercial / research department - nice work!
Mal
Yes, we use the great CCV app (Community Core Vision) to get the touch data.
The Java gateway receives the OSC data from CCV and resends it as interpreted camera controls (with camera smoothing and multi-user support) via UDP messages to Blender.
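To illustrate the idea of such a gateway, here is a minimal Python sketch: it listens for touch updates on one UDP port, applies exponential smoothing to the touch position, and forwards the smoothed camera control to Blender on another UDP port. The ports, the message format, and the smoothing factor are all hypothetical — the real setup uses a Java gateway, and CCV actually speaks the TUIO/OSC protocol, which needs a proper OSC parser; plain "x y" text is used here only to keep the sketch self-contained.

```python
import socket

ALPHA = 0.3  # smoothing factor: lower = smoother but laggier camera

def smooth(prev, new, alpha=ALPHA):
    """Exponentially smooth an (x, y) touch position."""
    return tuple(alpha * n + (1 - alpha) * p for p, n in zip(prev, new))

def run_gateway(listen_port=3333, blender_addr=("127.0.0.1", 9001)):
    """Hypothetical gateway loop: receive touches, smooth, forward to Blender."""
    rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    rx.bind(("0.0.0.0", listen_port))
    tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    pos = (0.0, 0.0)
    while True:
        data, _ = rx.recvfrom(1024)
        x, y = (float(v) for v in data.split())  # "x y" text, not real TUIO
        pos = smooth(pos, (x, y))
        tx.sendto(("cam %f %f" % pos).encode(), blender_addr)
```

Multi-user support would mean keeping one smoothed position per touch ID instead of a single `pos`; the one-finger case above just shows the smoothing-and-forwarding idea.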
Here are some interesting examples from another team: