Idea = use BGE game technology for a 3D UI
abstract = The mouse cursor can be aimed at UI elements. When mousing over a UI element, a property within that element determines how the element receives keypresses, and can potentially freeze the cursor on the element, which then accepts mouse input to manipulate a tool.
the integration of Bullet or another physics system into the UI would mean one system could do all of Blender (UI, tools, viewport, etc.) and merge the code bases both ways: edit models in the BGE, use raycasts for the UI, and stop locking the UI to a 2D plane.
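As a minimal sketch of that raycast mouse-over, assuming the BGE Python API; the game properties "ui_element" (tags interface objects) and "focused" (the hover flag) are placeholders of mine:

```python
import bge

def hover_ui(cont):
    """Cast a ray from the camera through the mouse cursor and mark
    whichever UI element it hits as focused."""
    scene = bge.logic.getCurrentScene()
    cam = scene.active_camera
    x, y = bge.logic.mouse.position  # normalized screen coordinates

    # Filter on the 'ui_element' property so the ray ignores
    # ordinary scene geometry.
    hit = cam.getScreenRay(x, y, 100.0, "ui_element")

    for obj in scene.objects:
        if "ui_element" in obj:
            obj["focused"] = (obj is hit)
```

Because getScreenRay only returns the first object carrying the given property, regular geometry never steals hover focus, and any newly imported object tagged "ui_element" joins the UI automatically.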
An individual could import an object that is a new UI test element, without any fanfare.
I got to thinking about this because I do something similar in the game engine to swap actors or to drive vehicles or drones:
controller (converts keypresses to a list, plus mouse x,y, and sends them to the target actor)
actor (receives the list and performs locomotion / animations / actions)
weapon (a container with code that is executed on setting a property in the object)
these could all represent a UI input pipeline (sketched below):
controller -> tool -> tip
tips could be a material, or a brush, and be shared easily
(just like my systems share weapons between actors in games)
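To make the controller -> tool -> tip handoff concrete, here is a rough sketch in the BGE's Python module style; the property names ("focused", "keys", "mouse_xy", "tip") and the apply_tip hook are hypothetical, not an existing API:

```python
import bge

def controller_main(cont):
    """Controller: convert active keypresses to a list, grab mouse x,y,
    and send both to whichever tool object is focused."""
    keys = [key for key, state in bge.logic.keyboard.events.items()
            if state == bge.logic.KX_INPUT_ACTIVE]
    mouse_xy = bge.logic.mouse.position

    for obj in bge.logic.getCurrentScene().objects:
        if obj.get("focused"):
            obj["keys"] = keys        # Python-set properties can hold lists
            obj["mouse_xy"] = mouse_xy

def tool_main(cont):
    """Tool: read the input list and drive its tip."""
    own = cont.owner
    if bge.events.WKEY in own.get("keys", []):   # example key binding
        apply_tip(own, own.get("mouse_xy"))

def apply_tip(tool_obj, mouse_xy):
    """Hypothetical tip hook: a shared brush or material would act here."""
    print("tip", tool_obj.get("tip"), "at", mouse_xy)
```

Because the tip is just a property on the tool, swapping or sharing tips works the same way weapons are swapped between actors.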
if you are not focusing on a tool, or it is not actively doing anything, its logic state is off…
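In BGE terms, "logic state off" can be literal: park idle tools in a state with no logic wired to it. A sketch, assuming a state layout of my own (state 1 empty, state 2 holding the tool's logic bricks):

```python
import bge

IDLE = bge.logic.KX_STATE1     # assumption: no logic bricks on this state
WORKING = bge.logic.KX_STATE2  # assumption: the tool's logic lives here

def update_tool_state(cont):
    """Run the tool's logic only while it is focused or mid-action."""
    own = cont.owner
    own.state = WORKING if (own.get("focused") or own.get("active")) else IDLE
```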
this would also uncap Blender for future augmented reality technology…
for example: touchscreen events, 3D mice, a 3D stylus, or Oculus view focus.