Is writing an add-on for interfacing with a 3D stylus possible?

[Image: 3D Systems Touch promo]

Does the current Python API allow this?

Sculpting programs with VR controllers, for example, shoot a virtual laser beam from the controller onto the 3D model surface, so the controller acts like a laser pointer (example: https://www.youtube.com/watch?v=jnqFdSa5p7w ). Devices like the one in the image above, however, actually let you “feel” the current shape and sculpt on it rather than working like a laser pointer, because they provide real force feedback and haptics.

Blender lets you model and sculpt a 3D model with a 2D monitor and a mouse/stylus by “shooting a virtual laser beam” through the cursor position on the monitor (casting a ray, to be technical), and the mouse/stylus is assumed to always be “on” the surface of the 3D model. So I’m wondering if the current API allows bypassing this behavior, or designing around it somehow, so that using a true haptic 3D stylus makes sense.
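
For reference, my understanding is that this screen-to-surface mapping boils down to turning the 2D cursor position into a world-space ray, roughly like this (a sketch using `bpy_extras.view3d_utils`, meant to run in a 3D Viewport context such as a modal operator):

```python
import bpy
from bpy_extras import view3d_utils

def cursor_ray(context, event):
    """Return the (origin, direction) of the view ray under the mouse."""
    region = context.region      # the 3D Viewport region
    rv3d = context.region_data   # its RegionView3D
    coord = (event.mouse_region_x, event.mouse_region_y)

    # World-space ray derived from the 2D cursor position.
    origin = view3d_utils.region_2d_to_origin_3d(region, rv3d, coord)
    direction = view3d_utils.region_2d_to_vector_3d(region, rv3d, coord)
    return origin, direction
```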

yes, it’s possible. and for the record, this is the python forum, so simplifying concepts like raycasting to “laser beams” for us isn’t really necessary 🙂

technically a haptic device like the one in your picture is just another NDOF device, which blender already supports out of the box. raycasting from arbitrary points in space is also already supported. sending haptic feedback to the device depends on whether or not the device has some way for the python API to communicate with the driver.
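
for example, a minimal sketch of a viewport-independent raycast (assuming blender 2.91+, where `Scene.ray_cast` takes a depsgraph; the stylus-tip coordinates are made up):

```python
import bpy
from mathutils import Vector

def cast_from_point(origin, direction):
    """Cast a ray into the scene from any world-space point."""
    depsgraph = bpy.context.evaluated_depsgraph_get()
    hit, location, normal, face_index, obj, matrix = bpy.context.scene.ray_cast(
        depsgraph, Vector(origin), Vector(direction))
    return (location, normal, obj) if hit else None

# e.g. a stylus tip at (0, -2, 1) pointing along +Y:
print(cast_from_point((0.0, -2.0, 1.0), (0.0, 1.0, 0.0)))
```

the returned location and normal are exactly the contact data a haptic add-on would care about.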

Hi, and thanks.
Yes, the driver can stream positional data to Python scripts and receive haptics (collision, button-press) data back from them.
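
To make that concrete, this is the kind of polling loop I have in mind on the Blender side. The UDP port and packet format here are made-up stand-ins for whatever the real driver exposes, and driving the 3D cursor is just a placeholder action:

```python
import bpy
import socket
import struct

# Hypothetical: the vendor driver is assumed to broadcast the tip position
# as three float64 values on this local port.
_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
_sock.bind(("127.0.0.1", 50007))
_sock.setblocking(False)

def read_stylus_tip():
    """Return the latest (x, y, z) tip position, or None if no packet arrived."""
    try:
        data, _ = _sock.recvfrom(24)
        return struct.unpack("ddd", data)
    except (BlockingIOError, struct.error):
        return None

class StylusPoll(bpy.types.Operator):
    """Poll the haptic stylus on a timer and feed its position into Blender."""
    bl_idname = "wm.stylus_poll"
    bl_label = "Poll Haptic Stylus"

    _timer = None

    def modal(self, context, event):
        if event.type == 'ESC':
            context.window_manager.event_timer_remove(self._timer)
            return {'CANCELLED'}
        if event.type == 'TIMER':
            tip = read_stylus_tip()
            if tip is not None:
                # Placeholder: move the 3D cursor; a real add-on would
                # drive a brush stroke or tool here instead.
                context.scene.cursor.location = tip
        return {'PASS_THROUGH'}

    def execute(self, context):
        wm = context.window_manager
        self._timer = wm.event_timer_add(0.01, window=context.window)  # ~100 Hz
        wm.modal_handler_add(self)
        return {'RUNNING_MODAL'}

bpy.utils.register_class(StylusPoll)  # run with bpy.ops.wm.stylus_poll()
```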

But here’s the issue: I couldn’t find which classes in the bpy library handle NDOF devices, or what their features are. The raycasting you mentioned isn’t enough as long as there’s no way to change how the Blender cursor works in Object, Edit, and Sculpt modes with a 2D input device like a mouse or stylus. What I mean is, the mouse/stylus is assumed to always be “on” the surface of the 3D model in Edit or Sculpt mode. In Sculpt mode the orientation of the cursor is also fixed, I believe using the normals of the mesh the cursor ray is hitting.
To rephrase with a Sculpt-mode-specific example: a 3D haptic device assumes the tip of the device in the 3D world is “on” or “inside” the mesh for the mesh to deform and for haptics data to be sent back. Such a device does not raycast from its tip the way VR controllers do; something like the sketch below is closer to what I imagine is needed.
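
A rough sketch, assuming a closed mesh: test the tip against the surface with `Object.closest_point_on_mesh` and use the penetration depth to drive the force feedback (`tip_contact` is just a name I made up):

```python
import bpy
from mathutils import Vector

def tip_contact(obj, tip_world):
    """Return (inside, surface_point, normal, depth) for a stylus tip position."""
    # closest_point_on_mesh works in the object's local space.
    tip_local = obj.matrix_world.inverted() @ Vector(tip_world)
    hit, location, normal, _ = obj.closest_point_on_mesh(tip_local)
    if not hit:
        return False, None, None, 0.0

    offset = tip_local - location
    # For a closed mesh: if the vector from the surface to the tip points
    # against the outward normal, the tip has penetrated the surface.
    inside = offset.dot(normal) < 0.0
    # The penetration depth could scale the force sent back to the device.
    return inside, obj.matrix_world @ location, normal, offset.length
```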
Does this make sense? Is there a way around this?

Thanks.

Is there any more information on whether this is possible for Blender sculpting?