use the BGE + a few helper commands to get programmable Python tools

this highlights that iterating over vertices to paint in Python is slow,

querying with a KDTree is fast, but to paint or sculpt in the BGE (and I think in bpy as well) we need to iterate over the data returned by the KDTree in Python and set the values, which is slow,
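The bottleneck can be seen in plain Python. Below is a minimal sketch of the pattern described above; `find_range` here is a naive stand-in for `mathutils.kdtree.KDTree.find_range` (which returns `(co, index, distance)` tuples), and the vertex grid and color list are made up for the example:

```python
import math

def find_range(points, co, radius):
    # Naive stand-in for mathutils.kdtree.KDTree.find_range(), which
    # returns (co, index, distance) tuples for points within the radius.
    hits = []
    for i, p in enumerate(points):
        d = math.dist(p, co)
        if d <= radius:
            hits.append((p, i, d))
    return hits

# A made-up flat grid of "vertices" and one color value per vertex.
verts = [(x * 0.1, y * 0.1, 0.0) for x in range(100) for y in range(100)]
colors = [0.0] * len(verts)

# The slow part: even when the spatial query itself is fast C code,
# every returned vertex is still written one by one in interpreted Python.
brush_co, radius = (5.0, 5.0, 0.0), 1.0
for co, index, dist in find_range(verts, brush_co, radius):
    colors[index] = max(colors[index], 1.0 - dist / radius)  # linear falloff
```

Even if the spatial query were free, the per-vertex assignment loop still runs in the interpreter, and that is the part the proposal wants moved to C++.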

I propose someone add

KX_meshProxy.brushStroke('TargetProp', Equation, KDTree.find_range(co, radius))

where we could iterate over the vertices in C++ instead, making it ~25x faster

The rough part is that there almost needs to be an equation API that only operates on data the brush stroke has:
vertex.XYZ, vertex.color, vertex.normal, radius, co, etc.
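To make the proposal concrete, here is a pure-Python mock of what the suggested call could behave like. Note that `KX_meshProxy.brushStroke` does not exist; `MockVertex`, `brush_stroke`, and `paint_red` are invented stand-ins, and the point is only that the "equation" ever sees the whitelisted brush data (vertex, distance, radius):

```python
class MockVertex:
    """Invented stand-in for a BGE vertex; only the whitelisted brush data."""
    def __init__(self, xyz):
        self.XYZ = xyz
        self.color = [0.0, 0.0, 0.0, 1.0]
        self.normal = (0.0, 0.0, 1.0)

def brush_stroke(verts, equation, hits, radius):
    # Mock of the proposed brushStroke(); the real thing would run
    # this loop in C++ instead of the interpreter.
    for co, index, dist in hits:
        equation(verts[index], dist, radius)

# The 'equation' only operates on data the brush stroke has.
def paint_red(vertex, dist, radius):
    vertex.color[0] = max(vertex.color[0], 1.0 - dist / radius)

verts = [MockVertex((i * 0.1, 0.0, 0.0)) for i in range(50)]
# stand-in for KDTree.find_range(co, radius): (co, index, dist) tuples
brush_co, radius = (1.0, 0.0, 0.0), 0.5
hits = [(v.XYZ, i, abs(v.XYZ[0] - brush_co[0]))
        for i, v in enumerate(verts)
        if abs(v.XYZ[0] - brush_co[0]) <= radius]
brush_stroke(verts, paint_red, hits, radius)
```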

with this we could potentially
paint vertex colors
paint pixels on a UV-mapped image

and potentially we could also mix these operations

we would also need a method to save a KX_meshProxy back as a mesh in a .blend file*


it would be great if it allowed people to cut holes too.

cutting holes is simple as far as removing faces goes
(set alpha to zero, or move vertices and set alpha, etc.)

splitting and adding faces is not possible in the BGE at the moment, but Tristan was talking about a mesh editing/creation API

vertex shader


uniform sampler2D IMG;
void main()
{
        // displace the vertex along Z using the image's red channel
        vec4 pos = gl_Vertex;
        pos.z += texture2D(IMG, gl_MultiTexCoord0.xy).r;
        gl_Position = gl_ModelViewProjectionMatrix * pos;
}


mesh vertex Z coordinate driven by the image.

my vertex shader.


vsh.blend (9.56 MB)

using a vertex shader you would have to handle terrain physics all by yourself.

How about sculpt draw/subtract? And dyntopo functionality, because you can add more vertices with that.

@BPR’s initial post: I think it would still be too slow. It may be fine for small maps and vertex counts, but I feel Python is just too slow to do this at any higher complexity… there are better tools for the job, integrated via C… I’m not really sure why you bring this up, TBH. That does not mean it is not important… I just do not see it, I guess.


calling a compiled func in py is fast,
it’s interpreting the lines that is slow,
1 line triggering a threaded C++ or C command can be blazing fast.

calling 1 command and dumping a bucket of data into it relies on C/C++ once you have handed over the data / pointers.

what do you think Blender uses to paint vertices or select faces under the hood?

KDTree, BVHTree, and soon PBVH

I did not mean that Python was slow… I think that is a matter of what you are doing, but I also did not read through well enough to get that you wanted it as a precompiled func… so my bad, I apologize for not fully reading… I merely made an ass-umption.

Moved from “General Forums > Blender and CG Discussions” to “Game Engine > Game Engine Support and Discussion”

the idea here was to add a helper command to write game objects back to disk as .blend

kdtree = kdtree.fromGameMesh(own.meshes[0])
math = (how to produce math C can read with py?)
KX_meshProxy.brushStroke(math, kdtree.find_range(brushloc, radius))

we just need to figure out how to write the ‘math’ API so we can keep adding small features to it over time
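One possible answer to "how to produce math C can read with py?" is to hand the C++ side a small expression string plus a fixed variable set. The sketch below is an assumption, not an existing API; a sandboxed Python `eval()` stands in for the C-side evaluator, and `apply_equation` and `ALLOWED` are invented names:

```python
import math

# Whitelisted functions the 'equation' strings may call.
ALLOWED = {"cos": math.cos, "sin": math.sin, "min": min, "max": max}

def apply_equation(expr, hits, radius, values):
    # Compile once, evaluate per vertex with only the brush variables
    # in scope; no builtins, so the expression can't reach anything else.
    code = compile(expr, "<brush>", "eval")
    for co, index, dist in hits:
        env = {"dist": dist, "radius": radius, "value": values[index]}
        values[index] = eval(code, {"__builtins__": {}}, {**ALLOWED, **env})

values = [0.0, 0.0, 0.0]
# (co, index, dist) tuples as a KDTree.find_range() would return them
hits = [((0, 0, 0), 0, 0.0), ((1, 0, 0), 1, 0.5), ((2, 0, 0), 2, 1.0)]
apply_equation("max(value, 1.0 - dist / radius)", hits, 1.0, values)
```

A real implementation would parse the expression into C-side bytecode once per stroke, so the per-vertex loop never touches the interpreter.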

initially, lerping a value using the distance from the brush center and the radius can do a ton (like the painting in my example, and sculpt)

pos.z = cos(worldPosition.x+time) = real wave function
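That wave function is easy to evaluate per vertex on the CPU side too. A small sketch; `wave_height` and its `amplitude`/`frequency`/`speed` parameters are invented for the example:

```python
import math

def wave_height(x, time, amplitude=0.5, frequency=1.0, speed=1.0):
    # pos.z = cos(worldPosition.x + time), with tweakable parameters
    return amplitude * math.cos(frequency * x + speed * time)

# Evaluated per vertex each frame (or inside the proposed brush equation):
zs = [wave_height(x * 0.5, time=0.0) for x in range(8)]
```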

Still heavily related to the BGE.

bpy already allows creating and modifying your meshes.

  1. try and sculpt in py with a KDTree in bpy
    (same issues)

the return is fast; it’s operating on the 1000s of vertices returned by the KDTree that is slow
(removing that loop removes half the bottleneck)

the final bottleneck is the application of SIMD
(single instruction multiple data)
using whatever means is fastest.

  1. if we had these in the GE we could do amazing things in game and save them out of the game. (removing this iteration leaves more room for reinstancing the physics mesh and recalculating normals)

sculpt, paint, dynamic destruction, spray painting in game, tire tracks…

persistent destruction,

btw, I told you Panzergame was working on a mesh editing API in UPBGE
(for creating new meshes and adding and removing faces in game, etc.)

There is a difference between a mesh creation/editing API and making your very own sculpt tools in the BGE. It can be cool, but I’m not hyped about continuously modifying complex meshes in realtime.

I would rather see a game where you generate a physics mesh or the like using the said API, but I’m not really fond of the idea of constantly re-updating a mesh… Not that you shouldn’t, but it seems a complex use case that one should only rarely need.

waves in water?

wind effects on trees with collision?

accelerated pythonic animation using empties as bones?

skinning virtual particles?

‘tron trails’?

all sorts of things

Currently they are made with shaders, and by the nature of water, you don’t use regular physics to move objects around. So most of the time you have to apply forces yourself, using the same formula as in the shader, to sync your game entities with the GPU render.

Taking BF4 as an example, they don’t update the physics because, hell, there are usually a lot of trees in a scene, and updating everything is insane. So instead they have neat vertex shaders again, and the animation is cool but fake.


I don’t get it. Edit: why not just use bones? Do you want your own deform algorithm?

I don’t get it.

Indeed, that would be interesting. Currently we can use planes tho.

I agree that a full mesh editing API allows us to do a lot of things, but you can get away with a great game without rebatching the physics of everything.

The idea with your own bones is for objects made of constraints, like rigid body ragdolls using joints.

the ragdoll bone physics objects could be the armature bones basically

vert.pos = bone.XYZ.lerp(ToLocal(bone.worldTransform * offsetFromGroupOrigin * scale), weight)

done in order up the ik chain?

one could basically add bones on the fly to bend a rod, or?
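A translation-only sketch of that formula in plain Python; `skin_vertex`, its arguments, and the `lerp` helper are invented names, and the rotation/scale part of `bone.worldTransform` is left out to keep it short:

```python
def lerp(a, b, t):
    # component-wise linear interpolation between two 3-tuples
    return tuple(ai + (bi - ai) * t for ai, bi in zip(a, b))

def skin_vertex(rest_pos, bone_pos, bone_offset, weight):
    # Blend a vertex from its rest position toward where the ragdoll
    # "bone" (a physics object) carries it, by the vertex group weight.
    # Rotation and scale from the bone's transform are deliberately omitted.
    target = tuple(b + o for b, o in zip(bone_pos, bone_offset))
    return lerp(rest_pos, target, weight)

# A vertex fully bound (weight = 1.0) to a bone that moved +1 on Z:
p = skin_vertex((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), (0.0, 0.0, 0.0), 1.0)
```

Run in order up the IK chain, with weights per vertex group, this is the core of a pythonic bone deform; the speed question is the same as for the brush: the per-vertex loop wants to live in C++.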