I’ve been having trouble getting my Blender game engine simulations to receive left-click pulses when I first touch the touchscreen. To test it, I set up a simple scene with a cube that rotates on left-click. When I put my finger on the screen, the cursor jumps to the spot under my finger, but the cube doesn’t move. Only when I move my finger slightly does the cube start spinning.
I thought the problem was specific to my Dell multi-touch screen, but I just ran into the same issue using a Wiimote Interactive Whiteboard. It really hampers the touchscreen experience when you have to drag across anything that’s interactive.
Is there a Python solution to this that I’m not seeing? Maybe a way to access the raw input data from the touchscreen? If so, how can it be done?
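One idea I’ve been toying with (just a sketch, not a tested fix): since the mouse sensor only fires after the cursor moves, a Python controller could watch the cursor position itself each frame and treat a sudden large jump (a finger landing somewhere new) as if it were a click. The jump-detection part is plain Python; the BGE hookup in the comments is an assumption based on the `bge.logic.mouse` API, and the threshold would need tuning per device:

```python
# Sketch: treat a large per-frame cursor jump as a "touch tap".
JUMP_THRESHOLD = 0.05  # normalized screen units; a guess, tune per device

def is_touch_tap(prev_pos, cur_pos, threshold=JUMP_THRESHOLD):
    """Return True if the cursor jumped farther than `threshold`
    between two frames -- on a touchscreen this usually means a
    finger just landed on a new spot."""
    dx = cur_pos[0] - prev_pos[0]
    dy = cur_pos[1] - prev_pos[1]
    return (dx * dx + dy * dy) ** 0.5 > threshold

# Inside the game engine this might be wired up per-frame like so
# (untested; names taken from the bge Python API, `own` being the
# Python controller's owner object):
#
#   import bge
#   own = bge.logic.getCurrentController().owner
#   pos = bge.logic.mouse.position
#   last = own.get("last_mouse_pos", pos)
#   if is_touch_tap(last, pos):
#       pass  # react here as if a left-click pulse arrived
#   own["last_mouse_pos"] = pos
```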
Another solution would be a logic brick sensor designed for touchscreens, but I have had no luck finding a custom build of Blender that includes one. I suppose I could learn to program one in C++, but I’m hoping for a simpler solution that doesn’t involve learning another programming language. Does anyone know of a custom build of Blender that can do this?
Any help would be greatly appreciated!