UI idea: OOP pipeline

Idea = use BGE game technology for a 3D UI

Abstract = the mouse cursor can be aimed at UI elements. When mousing over a UI element, a property within each element determines how that element receives keypresses, and can potentially freeze the cursor on the element so it then accepts mouse input to manipulate a tool.

Integrating Bullet or another physics system into the UI would mean one system could handle all of Blender … UI, tools, viewport, etc., and merge the code bases both ways. Edit models in the BGE, use raycasts for the UI, and don't lock the UI to a 2D plane.
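Roughly, the picking side could look like this in BGE Python (just a sketch against the 2.7x API; the 'ui_element' property name is made up for illustration):

```python
# Sketch of raycast UI picking (BGE 2.7x API).
# 'ui_element' is a made-up property that marks objects as UI widgets.
import bge

def pick_ui_element():
    scene = bge.logic.getCurrentScene()
    cam = scene.active_camera
    x, y = bge.logic.mouse.position            # normalized screen coords

    # Ray from the camera through the cursor; only objects carrying
    # the 'ui_element' property count as hits.
    return cam.getScreenRay(x, y, 100.0, 'ui_element')

def main(cont):
    owner = cont.owner
    element = pick_ui_element()
    # Remember which element has focus so it can decide how to
    # receive keypresses (or freeze the cursor on itself).
    owner['focus'] = element.name if element is not None else ''
```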

An individual could import an object that is a new UI test element, without any fanfare.

I got to thinking about this because I do something similar in the game engine
to swap actors or drive vehicles or drones:

Controller (converts keypresses to a list plus mouse x,y and sends them to the target actor)

Actor (receives the list and handles locomotion / animations / actions)

Weapon = a container with code that is executed when a property is set on the object.
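In BGE Python, the controller half of that could be sketched like this (2.7x API; the 'target' and 'input' property names are just how I would wire it, nothing official):

```python
# Rough sketch of the Controller -> Actor part of the pipeline (BGE 2.7x).
# Object and property names ('target', 'input') are illustrative only.
import bge

# Keys the controller cares about, mapped to readable names.
KEYMAP = {
    bge.events.WKEY: 'forward',
    bge.events.SKEY: 'back',
    bge.events.AKEY: 'left',
    bge.events.DKEY: 'right',
}

def controller(cont):
    owner = cont.owner
    scene = bge.logic.getCurrentScene()

    # Build a list of active keypresses plus the mouse position.
    keys = [name for code, name in KEYMAP.items()
            if bge.logic.keyboard.events[code] == bge.logic.KX_INPUT_ACTIVE]
    mouse = bge.logic.mouse.position

    # Hand the packet to whatever object the 'target' property points at.
    target = scene.objects.get(owner['target'])
    if target is not None:
        target['input'] = {'keys': keys, 'mouse': mouse}

def actor(cont):
    owner = cont.owner
    packet = owner.get('input')
    if packet:
        # The actor turns the packet into locomotion / animation here.
        if 'forward' in packet['keys']:
            owner.applyMovement((0.0, 0.1, 0.0), True)
```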

These could all represent a UI input pipeline:

Controller -> Tool -> Tip

Tips could be a material or a brush, and could be shared easily

(just like my systems work in games to share weapons).
If you are not focusing on a tool, or it is not actively doing anything, its logic state is off…
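A sketch of that gating, assuming the controller stores its focus in a 'focus' property (object and property names are made up):

```python
# Sketch of the "logic state is off" idea (BGE 2.7x, names assumed):
# a tool bails out unless the controller's focus points at it.
import bge

def tool(cont):
    owner = cont.owner
    scene = bge.logic.getCurrentScene()
    controller = scene.objects.get('MouseController')

    # Not focused, or no controller found: do nothing this frame.
    if controller is None or controller.get('focus') != owner.name:
        return

    packet = owner.get('input')
    if packet:
        # Apply the tool's tip (brush, material, etc.) using the packet here.
        pass
```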

This would also open Blender up to future augmented reality technology…
for example, touchscreen events, 3D mice, 3D styluses, or Oculus view focus.

Blender has a lot of weaknesses that need to be dealt with before we go into showy concepts like a 3D-rendered UI.

Now, modeling, UV mapping, texturing etc. using augmented reality goggles might be cool to see, but I would prefer the devs first make sure that every part of Blender has been completely modernized.

Ehh, these are just ideas. I don't particularly see the need for this response when it's not like a Blender dev is suddenly going to decide to drop whatever they're doing and implement this, and even then, unless they're being employed by the BF for specific things, they answer to no one.

Must admit though, it sounds interesting, but I'm not really sure where you're coming from with the idea or what the benefits are, BluePrintRandom :3

One system that controls all of Blender.

Imagine each tool is a stand-alone piece of code: when you target it and apply keypresses and mouse data, it acts. Someone makes a new tool? Just import it.

This makes it so all of the UI and the viewport use the same code (a unified code base).
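As a sketch, a stand-alone tool could be nothing more than a Python module with one entry point (everything here is illustrative, not an existing Blender API):

```python
# Sketch of a stand-alone tool as a plain Python module.
# A new tool is just another module exposing the same entry point.

class GrabTool:
    """Example tool: acts on whatever key list / mouse data it is handed."""

    def __init__(self):
        self.strength = 1.0          # something a slider widget could drive

    def handle_input(self, keys, mouse):
        # The dispatcher only calls this while the tool is targeted,
        # so the tool never has to know about the rest of the UI.
        if 'forward' in keys:
            self.strength = min(self.strength + 0.05, 10.0)
        return self.strength

# UI and viewport would both feed the active tool through the same call:
#   active_tool.handle_input(keys, mouse)
```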

Here is the demo of using a Controller -> Actor -> Weapon setup:

WASD = move

Mouse left click = fire weapon if you're holding it

CTRL = bring up the mouse cursor

CTRL + Right click = pick up grab_able objects

CTRL + Right click while holding = place the object or place it in a slot

Right now, the property ['target'] in MouseController and Keyboard Controller is set to Player, but it can be anything. Car? Tool?
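Retargeting is just rewriting that property (sketch only; the controller object names are the ones from the demo):

```python
# Sketch: point both controllers at something else (BGE 2.7x).
import bge

def retarget(cont, new_target='Car'):
    scene = bge.logic.getCurrentScene()
    for name in ('MouseController', 'Keyboard Controller'):
        controller = scene.objects.get(name)
        if controller is not None:
            controller['target'] = new_target   # Player -> Car, Tool, ...
```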

I’ll make a demo soon of operating tools instead of actors…

Idea part 2 = UI widget objects

  1. Slider = by parenting a slider to an object and setting the property it manipulates and a range, one can control a tool variable by moving the slider

  2. GUI output text box = displays a property as a string (coupled with a slider, it shows the value you are manipulating)

  3. Image box = displays an image or movie from a property (will need a new property type: img)

  4. Drop-down = displays a list of strings; selecting a value sets another property (index)

So to make a tool, you would grab a tool blank, drop code into it, configure its GUI, and it would be ready to go.
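For the slider and text box, the widget logic could be as small as this (sketch; the 'prop', 'min', and 'max' property names are placeholders):

```python
# Sketch of slider / text box widgets (BGE 2.7x, property names assumed).

def slider(cont):
    """Map the slider's local X position to the property it controls."""
    owner = cont.owner
    tool = owner.parent                              # slider parented to the tool
    t = max(0.0, min(1.0, owner.localPosition.x))    # 0..1 along its track
    tool[owner['prop']] = owner['min'] + t * (owner['max'] - owner['min'])

def text_box(cont):
    """Show a property of the parent tool as a string (BGE text object)."""
    owner = cont.owner
    tool = owner.parent
    owner['Text'] = str(round(tool.get(owner['prop'], 0.0), 3))
```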

Attachments

WrectifiedCandy.blend (541 KB)