How does Blender handle mouse clicks? (multi-touch interfaces)

Sorry if this is a bad question, but since I don’t know as much as I’d like about programming, I thought I’d ask and see if anybody knew. The reason I’m asking is that I was reading over at blender.org and saw that for 2.5 they are going to recode how Blender handles such basic operations, and I was wondering whether it is, or could be, done in a way that is friendly to multi-point touch-screen interfaces?

(A friend of mine was telling me how you could use a digital, web, or “special” camera to make a fairly high-resolution infrared sensor on a rear-projected display, making software the only hangup to having a cheap multi-point touch screen. Hope that made sense. What do you knowledgeable guys (and gals) think about the Blender side of using that?)

And on a side note, what about the head tracking that’s starting to show up? I would think that is much harder to do programming-wise. Just think of it: a head-tracked, touch-driven interface… :eek: That would turn a few heads.

With the event rewrite, I think it should be trivial to add support for multi-touch interfaces, in the same way you could then add support for e.g. ManyMouse.
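To give a feel for the idea (made-up names here, not the actual 2.5 API): once every event carries a device identifier, a second finger or a second mouse is just another source feeding the same dispatcher.

```c
#include <stdio.h>

/* Hypothetical event model (illustrative names, not Blender's actual
 * API): every event carries a device id, so a second finger or a
 * second mouse is just another source going through one code path. */
typedef enum { EVT_MOVE, EVT_PRESS, EVT_RELEASE } EventType;

typedef struct {
    int device_id;   /* finger 0, finger 1, mouse 0, ... */
    EventType type;
    int x, y;        /* position in window coordinates */
} InputEvent;

static void dispatch(const InputEvent *ev)
{
    /* A real system would look the event up in a keymap and run the
     * bound operator; here we just report what arrived. */
    printf("device %d: event %d at (%d, %d)\n",
           ev->device_id, ev->type, ev->x, ev->y);
}

int main(void)
{
    /* Two simultaneous touch points: same code path, two device ids. */
    InputEvent finger0 = { 0, EVT_MOVE, 100, 200 };
    InputEvent finger1 = { 1, EVT_MOVE, 400, 250 };
    dispatch(&finger0);
    dispatch(&finger1);
    return 0;
}
```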

You’re talking about projects like http://multitouch.fieryferret.com/. And yes, I’ve thought about how this could be done in Blender.

You would hook the input from the head-tracking device (a cheap one to make is IR LEDs mounted on your head, interpreted by an IR-capable webcam, which I understand are easy to come by) into GHOST, much in the same way NDOF devices (the Space Navigator) were hooked in not too long ago.
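The geometry on the webcam side is simple: the camera sees the two IR LEDs as bright blobs; their pixel separation gives your distance (pinhole camera model) and their midpoint gives your direction. A rough sketch with made-up, uncalibrated constants:

```c
#include <math.h>
#include <stdio.h>

/* Hypothetical webcam side of cheap head tracking: two IR blobs (the
 * LEDs on your head) come in as pixel coordinates each frame. Their
 * separation gives distance, their midpoint gives direction. The
 * constants below are placeholders, not calibrated values. */
#define LED_SPACING_MM 150.0f  /* real distance between the two LEDs */
#define CAM_FOCAL_PX   600.0f  /* assumed focal length in pixels */
#define CAM_W          640
#define CAM_H          480

typedef struct { float x, y; } Blob;        /* pixel coordinates */
typedef struct { float x, y, z; } HeadPos;  /* millimetres, camera space */

static HeadPos estimate_head(Blob a, Blob b)
{
    float sep = hypotf(b.x - a.x, b.y - a.y);         /* blob separation, px */
    float z   = LED_SPACING_MM * CAM_FOCAL_PX / sep;  /* pinhole model */
    float mx  = (a.x + b.x) * 0.5f - CAM_W * 0.5f;    /* midpoint off-centre */
    float my  = (a.y + b.y) * 0.5f - CAM_H * 0.5f;
    HeadPos p = { mx * z / CAM_FOCAL_PX, my * z / CAM_FOCAL_PX, z };
    return p;
}

int main(void)
{
    Blob left = { 280, 240 }, right = { 360, 240 };
    HeadPos p = estimate_head(left, right);
    printf("head at (%.0f, %.0f, %.0f) mm\n", p.x, p.y, p.z);
    return 0;
}
```

Each estimate like that would then be pushed into the event queue the way NDOF motion data is.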

With the event rewrite you could then assign the events generated by any of these external devices to their respective operators within the new event system, so it does what you want (e.g. navigation by head tracking in the 3D View).
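The “assign events to operators” part is essentially a lookup table. Something in this spirit (invented names, not the real 2.5 keymap code):

```c
#include <stdio.h>
#include <string.h>

/* Hypothetical keymap sketch: map an incoming (device, event) pair to
 * the name of an operator to run. Names are invented for illustration. */
typedef struct {
    const char *device;        /* "headtrack", "touch", "ndof", ... */
    const char *event;         /* "move", "press", ... */
    const char *operator_name; /* operator bound to this event */
} KeymapItem;

static const KeymapItem keymap[] = {
    { "headtrack", "move",  "view3d.rotate" },
    { "touch",     "press", "view3d.select" },
};

static const char *lookup(const char *device, const char *event)
{
    for (size_t i = 0; i < sizeof(keymap) / sizeof(keymap[0]); i++)
        if (!strcmp(keymap[i].device, device) &&
            !strcmp(keymap[i].event, event))
            return keymap[i].operator_name;
    return NULL;
}

int main(void)
{
    /* Head movement drives viewport rotation via the table above. */
    printf("%s\n", lookup("headtrack", "move"));
    return 0;
}
```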

/Nathan

So, it sounds like it’s not only possible, but could be done fairly easily? Who is going to be in charge of the event rewrite?

I also came up with one more question: are there any plans for how the UI is going to be developed visually? Is it going to be just kind of the same old Blender, or are there plans to make it much different? Flexibility in the UI would be very important to a multi-point touch interface; actually, I think it would be the whole point of it.

As long as I’m on a roll asking noob questions, where else can I learn this type of stuff? I was checking out Blenderstorm thanks to JesterKing’s signature; is there another place that would be good to check?
Edit: never mind the last question, I just found all the stuff on blender.org.

I’m not in charge, but I’ll certainly be working on it (see http://www.youtube.com/nletwory for some old Blender 2.5 WIP videos).

Anyway, I’ll be buying a Wiimote soon and building some IR-LED pens to be able to test this stuff with the new event system. For around 50–60 euros I should have myself a cool system to experiment with :smiley:
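The software side of the pen trick is mostly calibration: the Wiimote camera reports IR points in its own 1024×768 space, and after sampling the screen corners you map pen readings to screen coordinates. A minimal sketch (a real setup would get the raw points from a library such as cwiid on Linux, and would use a proper four-point homography rather than this axis-aligned map):

```c
#include <stdio.h>

/* Hypothetical IR-pen mapping: calibrate by holding the pen at two
 * opposite corners of the projection, then interpolate linearly. */
typedef struct { float x, y; } Point;

static Point cal_min, cal_max;  /* raw IR readings at screen corners */

static Point ir_to_screen(Point ir, int screen_w, int screen_h)
{
    Point s;
    s.x = (ir.x - cal_min.x) / (cal_max.x - cal_min.x) * screen_w;
    s.y = (ir.y - cal_min.y) / (cal_max.y - cal_min.y) * screen_h;
    return s;
}

int main(void)
{
    cal_min = (Point){ 100, 80 };   /* pen at top-left corner */
    cal_max = (Point){ 900, 700 };  /* pen at bottom-right corner */

    Point pen = { 500, 390 };       /* raw reading somewhere mid-screen */
    Point s = ir_to_screen(pen, 1920, 1080);
    printf("pen at (%.0f, %.0f)\n", s.x, s.y);
    return 0;
}
```

Each mapped position would then become an ordinary pointer event in the new event system.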

Visually it will change hardly at all. This rewrite is under the hood, and as such will not affect the GUI visuals a whole lot.