Logic brick request: draw a line, poly, n-gon, or triangle from any X-vert set

So, I would like to be able to…

set up 1 to X control faces on one object,

set up 1 to X on another,

label them somehow,

and then have it “fill in the shape” with a cube, or lines, or ??
no matter how the two pieces move.

is this even possible to code?

This would make some situations very easy (like blowing a bubble by scaling the centers of empties),
or much niftier things anyone can think of,

and can such a wire get physics, like a real wire?
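A rough sketch of what such a logic brick might compute each frame, in plain Python (the helper name and tuple-based points are my own; in the BGE the points would come from each control object's `worldPosition`):

```python
def bridge_segments(points_a, points_b):
    """Pair up two equal-length lists of control points and return
    the line segments that 'fill in the shape' between them."""
    if len(points_a) != len(points_b):
        raise ValueError("control point sets must match in size")
    return list(zip(points_a, points_b))

# Recomputed every frame from the current object positions, the fill
# follows the two pieces no matter how they move.
side_a = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
side_b = [(0.0, 0.0, 2.0), (1.0, 0.0, 2.0)]
segments = bridge_segments(side_a, side_b)
```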

If you want to create and display arbitrary geometry, you can look into using PyOpenGL in a Python script. You could also look into shape key animations, though those could get a little tricky, as the number of verts has to stay the same.
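To illustrate why the vert count has to stay the same: a shape key blend is just a per-vertex interpolation between two shapes of identical size (a hedged sketch, my own function name, with plain tuples standing in for mesh verts):

```python
def blend_shapes(base, key, factor):
    """Interpolate every vertex of `base` toward `key` by `factor`.
    Works only when both shapes have the same number of verts."""
    if len(base) != len(key):
        raise ValueError("shape keys require the same number of verts")
    return [tuple(b[i] + (k[i] - b[i]) * factor for i in range(3))
            for b, k in zip(base, key)]

# Halfway between the two shapes:
mid = blend_shapes([(0.0, 0.0, 0.0)], [(2.0, 0.0, 4.0)], 0.5)
```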

I would love to, but this could also be very usable for newbs (like me) if we set it up as a logic brick.

I really do think in pictures…

Think: my first laser gun,

my first rope-casting grappling hook,

my first lightning bolt,

etc.

I just came across a thread on the Unity forums where someone is coding a modeler and UV-unwrapper inside that engine. It would be pretty neat if the BGE came with its own set of geometry-building functions whose output could be assigned UV coordinates and physics properties/logic bricks (which would be a dream for some, because it would allow free-form level editors and full-blown procedural generation and logic).

All in all, something to consider, though there are various items for the BGE that are clearly of higher priority (like multi-threading support).

@Kupoman,

Did I understand correctly that you can generate geometry on the fly using PyOpenGL in the BGE? Is there any example that I could look into?

Thanks

render.drawLine()

I know about it already, and it is neat, but I was thinking that for total newbs, a “point A” to “point B”
line/poly/vert pattern would be cool.
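A hedged sketch of the “point A to point B” pattern (my own helper name, plain tuples): interpolate some points along the segment, then in the BGE you could feed consecutive pairs to `bge.render.drawLine()` every frame.

```python
def segment_points(a, b, n):
    """Return n + 1 points evenly spaced from a to b, inclusive.
    In the BGE, a and b would be the two objects' worldPosition."""
    return [tuple(a[i] + (b[i] - a[i]) * t / n for i in range(3))
            for t in range(n + 1)]

# Five segments from the origin out along X:
pts = segment_points((0.0, 0.0, 0.0), (10.0, 0.0, 0.0), 5)
```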

Why are vertex groups not supported?

PyOpenGL only recently gained some py3k support, so this has only just become possible, which means there are very few examples floating around at the moment. But basically, if you start issuing draw commands in a Python script with PyOpenGL, it will draw into the BGE’s context. Conveniently, the depth buffer is still valid at that point, so you can even mix your drawing in with the rendered scene.
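A minimal sketch of that idea (helper names are my own). The GL calls need a live context, so inside the game you would register the draw function on the scene's `post_draw` list, which fires after the scene renders, while the depth buffer is still valid; the geometry helper is plain Python so it runs anywhere.

```python
def make_line_loop(points):
    """Close a list of points into consecutive (start, end) segments."""
    return [(points[i], points[(i + 1) % len(points)])
            for i in range(len(points))]

def draw_segments(segments):
    # Imported lazily: PyOpenGL and a live GL context only exist
    # inside the running game, not when this module is imported.
    from OpenGL.GL import GL_LINES, glBegin, glEnd, glVertex3f
    glBegin(GL_LINES)
    for start, end in segments:
        glVertex3f(*start)
        glVertex3f(*end)
    glEnd()

# Inside the BGE you would hook it up roughly like this:
# import bge
# scene = bge.logic.getCurrentScene()
# scene.post_draw.append(lambda: draw_segments(make_line_loop(pts)))
```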