Left-click sensor issue on touchscreens

I’ve been having trouble getting my Blender game engine simulations to receive left-click pulses when initially touching the touchscreen. To test it, I set up a simple scene with a cube that rotates on left-click. When I put my finger on the screen, the cursor moves to the spot under my finger but the cube doesn’t move. Then I move my finger slightly and the cube starts spinning.

I thought it was just my Dell multi-touch screen being the odd one out, but I’ve just experienced the same issue using the Wiimote Interactive Whiteboard. It really hampers the touchscreen experience when you have to drag over anything that’s interactive.

Is there a Python solution to this that I can’t see yet? Maybe a way to access the raw input data from the touchscreen? If so, how can it be done?

Another solution would be a logic brick sensor designed for touchscreens, but I’ve had no luck finding a custom build of Blender that comes with one. I guess I could learn how to program one in C++, but I’m hoping there is a simpler solution that doesn’t involve learning another programming language. Does anyone know of a custom build that can do this?

Any help would be greatly appreciated!

Cheers,
Dave

So you want to be able to touch the screen, and when you do, the cube moves to where the touch was recorded, then dragging your finger/stylus would make the cube follow?

Nothing that complicated, no. I just want this setup to work:

Mouse-over+Mouse-left > AND > Send message: “Pressed!”

What I’m saying is that just pressing the screen doesn’t cause a mouse-left pulse. You have to press and drag before Blender recognises the left-click.

Ah, after reading your last post and then re-reading your first post, I’m sure I understand your problem now: when you touch the screen (over an interactive object) you expect that object to do something, but you have to click and then drag your finger/stylus a small amount before the mouse-over sensor sends a pulse, which doesn’t happen with a traditional mouse?

I don’t have a touch screen to test anything, but I’ll see if I can come up with a way to get around your problem and upload a blend shortly. I’m thinking it may be possible to cast a ray in the direction of the click/touch. Removing the mouse-over sensor, you could then use a small amount of logic to see which object was hit by the ray when the mouse-click sensor is activated; if it’s an interactive object, you would go ahead and activate the actuators for the result. It may work.
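
Something along these lines might do it. It’s a completely untested sketch assuming Blender 2.49’s Python API, and the sensor name (‘click’) and the ‘interactive’ property are just placeholders I’ve made up:

import GameLogic
import Rasterizer

cont = GameLogic.getCurrentController()
click = cont.sensors['click']  # Mouse > Left button sensor
cam = GameLogic.getCurrentScene().active_camera

if click.positive:
    # Normalise the touch position to the 0.0-1.0 range getScreenRay expects
    x = click.position[0] / float(Rasterizer.getWindowWidth())
    y = click.position[1] / float(Rasterizer.getWindowHeight())
    # Cast a ray from the camera through the touch point; only objects
    # carrying an 'interactive' game property count as hits
    hit = cam.getScreenRay(x, y, 100.0, 'interactive')
    if hit is not None:
        hit.sendMessage('Pressed!')

Each interactive object would then just need a message sensor listening for ‘Pressed!’.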

EDIT: As a side note to this thread, I think a touch screen sensor with options such as clicked, dragged and custom gestures could be something worth implementing.

Thanks for your help Hendore!

The mouse-over is working fine; it’s the left-click (touch on screen) that isn’t getting registered until dragging. Blender can tell when I’ve put the cursor over the object.

I’d love a touch screen sensor like that, but there are so few people with the skills/time to do it. I’ve seen a project called ‘TUIO’ where the creators have made their own multi-touch sensor. It allowed them to deal with multiple fingers at a time. Sadly, I couldn’t get any of their demos/tutorials to work, and they aren’t willing to upload the plugin.

I actually remember watching some videos of TUIO on YouTube a few weeks ago; very impressive. From what you’ve described, I’m guessing that when you touch the screen, all that’s really happening is the cursor/mouse is being moved to that location, but no click event is being handled/sent, which is why Blender doesn’t pick up on it. In that case, to avoid the need for dragging, couldn’t you just use a mouse-over sensor on its own, without a click event?

Mouse-Over -> Controller -> Actuator

not

Mouse-Over + Mouse-Click -> Controller -> Actuator

That way, when you touch the screen (over an object), the Mouse-Over sensor will send its positive pulse to the controller, which isn’t relying on other sensors being true. The only problem I see with this, however, is that when you lift off the touch screen the mouse position won’t have changed, so the Mouse-Over sensor will still be firing away. I wish I had access to a touch screen to test this out.
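
If the constant firing ever becomes a problem, a small Python edge detector should tame it. Again just an untested sketch: ‘mouseOver’ is a Mouse > Mouse over sensor, ‘wasOver’ is a boolean game property I’ve invented for this, and the controller needs true-pulse mode on so the script runs every frame:

import GameLogic

cont = GameLogic.getCurrentController()
own = cont.owner
mouseOver = cont.sensors['mouseOver']  # Mouse > Mouse over sensor

# Only fire on the rising edge: the frame the touch first lands on the
# object, not every frame the cursor keeps sitting there afterwards
if mouseOver.positive and not own['wasOver']:
    own.sendMessage('Pressed!')
own['wasOver'] = mouseOver.positive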

Hendore, you’re a genius! That solution works perfectly for the buttons in my simulation. Currently my program uses the standard keyboard/mouse interface, and I’m adapting it to use a touchscreen. I naturally assumed the need for a left-click, but of course the cursor isn’t getting dragged across the screen like it would be with a traditional mouse. Any objects that can be dragged could be activated using a mouse-over and deactivated using a negative left-click.

Thanks very much for your help!

I wonder if this problem persists with a Wacom tablet. :confused: hmmm… I think I’ll test it.

Edit:

The only problem I see with this, however, is that when you lift off the touch screen the mouse position won’t have changed, so the Mouse-Over sensor will still be firing away. I wish I had access to a touch screen to test this out.
This is true with a Wacom tablet.

Edit1:

Mouse-Over + Mouse-Click -> Controller -> Actuator
This, however, works perfectly fine, again with a Wacom tablet.

Edit2:

Any objects that can be dragged could be activated using a mouse-over and deactivated using a negative left-click.

But what if you have a grid of items? Let’s say (3x3), and you want the item that’s at (2,2): you’d have to carefully navigate in between? XD bah, never mind :stuck_out_tongue:

Edit1:
Quote:
Mouse-Over + Mouse-Click -> Controller -> Actuator
This, however, works perfectly fine, again with a Wacom tablet.

Thanks for trying it out! It’s handy to know which touchscreens do and don’t need the fix, although Hendore’s solution should work neatly for all of them. I’ve found the mouse-movement sensor can also provide useful pulses similar to the left-click: it pulses once on the initial press and once when the finger is removed. It also pulses constantly while the finger is moving.
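
In case it helps anyone, something like this treats an over pulse plus a movement pulse as a tap (the sensor names are just placeholders):

import GameLogic

cont = GameLogic.getCurrentController()
own = cont.owner
mouseOver = cont.sensors['mouseOver']  # Mouse > Mouse over sensor
movement = cont.sensors['movement']    # Mouse > Movement sensor

# On a touchscreen the cursor jumps straight to the finger, so the
# movement sensor pulses on the initial press; paired with mouse-over
# it stands in for the left-click pulse that never arrives
if mouseOver.positive and movement.positive:
    own.sendMessage('Pressed!')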

But what if you have a grid of items? Let’s say (3x3), and you want the item that’s at (2,2): you’d have to carefully navigate in between? XD bah, never mind :stuck_out_tongue:

Easy (: Have a GameLogic variable to track which one is active.


import GameLogic

cont = GameLogic.getCurrentController()
own = cont.owner
mouseOver = cont.sensors['mouseOver']  # Mouse > Mouse over sensor
mouseLeft = cont.sensors['mouseLeft']  # Mouse > Left button sensor
GameLogic.activeObj = getattr(GameLogic, 'activeObj', -1)  # -1 = none active

if mouseOver.positive and GameLogic.activeObj == -1:
    GameLogic.activeObj = own['index']  # 'index' is a unique int game property

if not mouseLeft.positive and GameLogic.activeObj == own['index']:
    GameLogic.activeObj = -1

if GameLogic.activeObj == own['index']:
    # Insert code to drag'n'drop here. I like to use a GameLogic variable
    # set from an object following the mouse; I can post it if anyone wants.
    pass