W.I.P. multi-touch addon (Intuos/Windows only)

I’m working on a Python-only addon to support multi-touch on the Wacom Intuos PTH-651, for Blender 2.9 on Windows 10.

It uses the WacomMT API. The addon uses ctypes to recreate the API’s data structures and registers a callback function to read finger data.

This data is not fed into Blender’s event loop. Instead, I’m testing with a custom modal operator that is triggered at 100 Hz by a timer from Blender’s own window manager; 100 Hz is the frequency the Intuos reports at.

The finger-reading callback is a Python function called from a separate thread spawned by the tablet driver. From the callback I push the data into a thread-safe queue, and the modal operator pulls it out every 10 ms.
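The handoff between the two threads can be sketched like this. Note this is a minimal illustration with made-up names (`on_finger_data`, `drain_queue`, the string “packets”), not the addon’s actual code; the real callback receives WacomMT structures:

```python
import queue
import threading

# One thread-safe queue shared between the driver's thread (producer)
# and the modal operator (consumer).
finger_queue = queue.Queue()

def on_finger_data(packet):
    # Runs on the tablet driver's thread; only enqueue here,
    # never touch Blender data from this thread.
    finger_queue.put(packet)

def drain_queue():
    # Runs from the modal operator's TIMER event (every 10 ms):
    # pull everything that accumulated since the last tick.
    packets = []
    while True:
        try:
            packets.append(finger_queue.get_nowait())
        except queue.Empty:
            return packets

# Simulate the driver thread pushing two packets.
t = threading.Thread(target=lambda: [on_finger_data(p) for p in ("p1", "p2")])
t.start()
t.join()
print(drain_queue())  # -> ['p1', 'p2']
```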

At this point, the operator can do whatever it pleases with the scene. Here is a video of a very short test (YouTube). In it, I’m modifying the pose rotation of the last bone, and only when two fingers are detected.


The test operator can now move and rotate the current pose bone. I’ve posted another short video on YouTube. The IK constraint is on the middle bone.

In this video, the operator timer is set to 100 Hz; in the previous one it was left at 15 Hz (or 30 Hz, I don’t remember).

The tablet sits on a desk book stand right in front of the monitor, so both are in focus. The video has also been 2D-stabilized in Blender.

The Python callback now copies finger data out of the memory allocated by the tablet driver with a memmove. The destination is a ctypes array, allocated according to the variable number of fingers the tablet detects. Some memory leaks and corruption are now fixed.
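The copy-out step can be sketched as follows. The `Finger` struct here is a placeholder (the real WacomMT finger structure has more fields), and `copy_fingers` is a hypothetical name:

```python
import ctypes

# Placeholder finger struct; the real WacomMT layout is richer.
class Finger(ctypes.Structure):
    _fields_ = [("x", ctypes.c_float),
                ("y", ctypes.c_float),
                ("id", ctypes.c_int)]

def copy_fingers(driver_ptr, finger_count):
    # Allocate a fresh ctypes array sized to the number of fingers
    # reported, then memmove the driver-owned memory into it.
    # The copy must happen before the callback returns, because the
    # driver may reuse or free its buffer afterwards.
    dst = (Finger * finger_count)()
    ctypes.memmove(dst, driver_ptr, ctypes.sizeof(dst))
    return dst

# Simulate a driver-owned buffer holding two fingers.
src = (Finger * 2)(Finger(0.1, 0.2, 1), Finger(0.3, 0.4, 2))
copied = copy_fingers(src, 2)
print(copied[1].id)  # -> 2
```

Owning the copy on the Python side is what removes the use-after-free style corruption: the queue then holds only arrays that Python’s garbage collector manages.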

Overall, it now feels faster. The instance of Blender you see is compiled in debug mode and running with some debugging flags, so it should be slower than usual.

The finger numbers you see are simple text drawn in the POST_PIXEL draw phase. “1” is tracking my thumb, “2” my index finger. The numbers are the order in which the fingers are detected.


With pressure sensitivity one could move an object up and down,
while X/Y are driven by the touch center point.

Very nice attempt; it looks simple and effective.

Though I don’t know exactly how the performance is, because the YouTube video plays back at only 24 fps (1 Hz = 1 fps). I assume you get a clean 100 fps, right?

The Wacom driver spawns a thread inside the Blender process, and 100 Hz is how fast the Wacom driver can poll data from the tablet. My Blender modal operator is also woken up at 100 Hz. The two are not, and cannot be, synchronized, because Blender doesn’t support multithreading in operators. Because of this, there is a Python deque collecting finger data at one end, in the driver’s thread, and consuming finger data at the other end, in the modal operator.
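A minimal sketch of that two-ended deque (with illustrative names, not the addon’s real identifiers). In CPython, `deque.append()` and `deque.popleft()` are atomic, so the two unsynchronized threads can share it without an explicit lock:

```python
from collections import deque
import threading

# Shared buffer: the driver's thread appends, the modal operator pops.
fingers = deque()

def driver_thread_callback(packet):
    # Producer end, called at ~100 Hz on the driver's thread.
    fingers.append(packet)

def modal_timer_tick():
    # Consumer end, called at ~100 Hz from the operator's TIMER event.
    # The two rates drift freely; we just take whatever is there.
    consumed = []
    while fingers:
        consumed.append(fingers.popleft())
    return consumed

t = threading.Thread(
    target=lambda: [driver_thread_callback(i) for i in range(3)])
t.start()
t.join()
print(modal_timer_tick())  # -> [0, 1, 2]
```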

On a side note, after the last video I went on to redesign everything, because I would like to support multi-touch for different Blender areas/regions at the same time: for instance, moving the timeline with one finger while recording changes in the 3D View or Grease Pencil with other fingers. The one big modal operator now implements an asyncio event loop, so modal functions can be written as coroutines instead of being registered as Blender modal handlers. All of this should make writing modal code easier, because multi-touch is in a way a “paradigm shift” from how Blender operators work, whose design is all based on keyboard, mouse, and one object at a time.
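The idea of driving an asyncio loop from a modal operator can be sketched like this: instead of calling `loop.run_forever()` (which would block Blender), each TIMER event “steps” the loop once, so every coroutine advances between redraws. The names `track_gesture` and `timer_step` are hypothetical, and the `while` loop at the bottom stands in for Blender’s timer:

```python
import asyncio

async def track_gesture(name, ticks):
    # A modal function written as a coroutine: each await yields
    # control back to the loop until the next timer step.
    for _ in range(ticks):
        await asyncio.sleep(0)
    return f"{name} done after {ticks} ticks"

loop = asyncio.new_event_loop()
task = loop.create_task(track_gesture("rotate", 3))

def timer_step():
    # Called on every TIMER event: run the callbacks that are ready,
    # then immediately return control to Blender.
    loop.call_soon(loop.stop)
    loop.run_forever()

# In the addon this driving loop is Blender's 100 Hz timer.
while not task.done():
    timer_step()
print(task.result())  # -> 'rotate done after 3 ticks'
loop.close()
```

Several such coroutines can be scheduled on the same loop, which is what would let one finger drive the timeline while others work in the 3D View.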

Looks like a good approach, though I don’t exactly have a picture of how useful it is to work on multiple regions at the same time. At least moving the viewport and objects with gestures is a huge plus for me.