Multi-touch large screen with the Blender Game Engine

Hi,
I would like to show our recent projects using multi-touch technologies; there are several interactive works, but the whole moon 3D world is running on the Blender Game Engine.

Here is the video:

http://www.youtube.com/watch?v=zBWOEq70oms&hd=1

I would like to thank the community for the use of some open-source apps:

  • Community Core Vision

  • Blender (3D creations)

  • ModestMap app developer

  • MMA Pro (Multigesture.net)

The 2m x 1m interactive wall we are using is still a prototype; I will post new videos as soon as we have something finished!

Thanks!
Rudy Morin - Virtual-IT

Here are some screenshots:

http://www.virtual-it.fr/lunar/appolo.jpg

http://www.virtual-it.fr/lunar/extracteur.jpg

Looks impressive!
What exactly is the interaction in the moon 3D world, and what kind of BGE sensors are you using for this multi-touch screen technology?
Great project you have there. :wink:

Damn, this looks awesome!

Hi, thanks! :slight_smile:

We have split the screen into three parts (3 viewports), each of which can show a different camera.
For each camera, there is an interaction (see the sketch below):

  • Zoom in, zoom out with 2 hands
  • Pan left, pan right with one hand
  • Pan up and down
  • Multiple users can each touch their own viewport at the same time

It’s a demo and we could imagine other gestures, but we wanted to keep it simple and intuitive!
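For anyone curious how a split like this can be set up, here is a minimal sketch using the BGE Python API; the camera names, the zoom scale, and the gesture mapping are illustrative assumptions, not our exact setup:

```python
# Minimal sketch: split the BGE window into three side-by-side viewports,
# one camera per region. Camera names are hypothetical.
import math
import bge

def setup_viewports(cont):
    # Run once from a Python controller at startup.
    scene = bge.logic.getCurrentScene()
    height = bge.render.getWindowHeight()
    third = bge.render.getWindowWidth() // 3

    for i, name in enumerate(["Cam.Left", "Cam.Center", "Cam.Right"]):
        cam = scene.objects[name]
        cam.useViewport = True
        # setViewport(left, bottom, right, top) in window pixel coordinates
        cam.setViewport(i * third, 0, (i + 1) * third, height)

def two_hand_zoom(cam, touch_a, touch_b, prev_dist):
    # Dolly the camera along its local view axis as the distance
    # between the two touch points changes (the 5.0 scale is a guess).
    dist = math.hypot(touch_b[0] - touch_a[0], touch_b[1] - touch_a[1])
    cam.applyMovement((0.0, 0.0, (prev_dist - dist) * 5.0), True)
    return dist
```

Since the tracker reports touch positions in normalized coordinates, routing a touch to the right user's viewport is just a matter of checking which third of the screen its x coordinate falls in.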

The main vehicle can be controlled by:

  • An iPod / iPhone
  • A wireless computer with touchscreen (the map system)
  • A Wiimote, and more

For the sensors, we use infrared techniques (diffuse illumination in this case), as commonly used in this kind of multi-touch technology.

Best Regards,
Rudy Morin

Man, that looks cool, and that it’s touch controlled is awesome.

You have the Blender Game Engine working with the iPod touch and the iPhone?

Yes,
the Blender Game Engine with our UDP server can be controlled by any iPhone or iPod touch after installing the appropriate app!
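
For anyone wanting to try something similar, here is a minimal sketch of a non-blocking UDP listener inside the BGE; the port, the message format, and the movement scaling are illustrative assumptions, not our actual protocol:

```python
# Minimal sketch: poll a UDP socket each frame and drive the vehicle.
# Port, message format, and scaling are assumptions, not the real protocol.
import socket
import bge

PORT = 9000  # hypothetical port the phone app would send to

def get_socket():
    # Create the socket once and cache it across frames.
    if not hasattr(bge.logic, "udp_sock"):
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.bind(("0.0.0.0", PORT))
        sock.setblocking(False)
        bge.logic.udp_sock = sock
    return bge.logic.udp_sock

def poll(cont):
    vehicle = cont.owner
    try:
        data, _addr = get_socket().recvfrom(1024)
    except socket.error:
        return  # no packet this frame
    # Hypothetical message format: "throttle,steer" as two floats.
    throttle, steer = (float(v) for v in data.decode().split(","))
    vehicle.applyMovement((0.0, throttle * 0.1, 0.0), True)
    vehicle.applyRotation((0.0, 0.0, steer * 0.05), True)
```

The poll function would be hooked to an Always sensor in true-pulse mode on the vehicle object.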

I hope you saw our video; you can see it in action!

++
Rudy Morin

Looks great!

Just to confirm though, you haven’t configured Blender to be multi-touch, but another multi-touch app / screen is sending signals to the Blender screen to control it?

Either way, it is a very impressive graphical demo of Blender in use within a commercial / research department - nice work!
Mal

Hi !

Yes, we use the great CCV app (Community Core Vision) to get the touch data.
The Java gateway receives the OSC data from CCV and resends it as interpreted camera controls (with camera smoothing and multi-user support) via UDP messages to Blender.
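
To give an idea of what the gateway does, here is a rough sketch in Python with the python-osc package rather than our actual Java code; the ports, the smoothing factor, and the outgoing message format are assumptions:

```python
# Rough sketch of the gateway idea: receive TUIO/OSC cursor events from
# CCV, smooth them, and forward simple camera-control messages to Blender
# over UDP. Ports and message format are assumptions.
import socket
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

BLENDER_ADDR = ("127.0.0.1", 9000)  # hypothetical Blender-side UDP port
out_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

smoothed = {}  # cursor session id -> smoothed (x, y)
ALPHA = 0.3    # exponential smoothing factor (a guess)

def on_cursor(address, *args):
    # TUIO "set" messages from CCV: set, session_id, x, y, vx, vy, accel
    if not args or args[0] != "set":
        return
    sid, x, y = args[1], args[2], args[3]
    px, py = smoothed.get(sid, (x, y))
    sx, sy = px + ALPHA * (x - px), py + ALPHA * (y - py)
    smoothed[sid] = (sx, sy)
    # Forward a simple text message; the real wire format is our own.
    msg = "cursor,{},{:.4f},{:.4f}".format(sid, sx, sy)
    out_sock.sendto(msg.encode(), BLENDER_ADDR)

disp = Dispatcher()
disp.map("/tuio/2Dcur", on_cursor)  # TUIO 2D cursor profile, CCV default
BlockingOSCUDPServer(("0.0.0.0", 3333), disp).serve_forever()
```

CCV sends TUIO over OSC on port 3333 by default, hence the /tuio/2Dcur address above.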

Here are some interesting examples from another team:

http://forge.lifl.fr/PIRVI/wiki/MTUtils/blenderTUIO

I’m glad you enjoyed it! :eyebrowlift:
Rudy Morin