Need help with vertex smoothing.

I am tasked with creating an effect which resembles something being “pushed through sand”. I used a dot product to create the bump in front, like the sand is piling up, and a trough behind it. My issue is that when the mouse moves too fast, ridges are left in the trough, because the displacement is only calculated around each sampled mouse position and fast movement leaves gaps between samples. How could I smooth out those ridges? I was thinking of either capping the mouse speed or somehow smoothing out the vertices in the trough. How could this be done?

Gif to demonstrate:

Nice poly-pushing…

Store where the mouse has been in a texture (image.plot, similar to here), and then you can use the mouse-position dot product for the bump and a texture sample for the trough. Because the spot you plot can be circular, you can tolerate much higher movement speeds. If you can rotate the image before you plot it (e.g. store several pre-rotated ones, because rotating in real time would be a pain), then you could have a relatively long trail, and even do the mound as part of the texture sample, relying on it being overwritten by the next plot along.
It also means you can feed it into a vertex shader, so you aren’t running into CPU limits like you are now.
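
Something along these lines as a rough sketch (not from your scene: IMAGE_SIZE, STAMP_SIZE and make_circle_stamp are placeholder names I’ve made up, and the exact buffer layout/blend mode that ImageBuff.plot() wants may need tweaking):

# Rough sketch: plot a circular stamp into a dynamic texture at the mouse
# position. IMAGE_SIZE, STAMP_SIZE and make_circle_stamp() are placeholders,
# and the buffer format/blend mode for ImageBuff.plot() may need adjusting.
from bge import texture

IMAGE_SIZE = 256   # assumed square displacement texture
STAMP_SIZE = 16    # assumed stamp diameter in pixels


def make_circle_stamp(size):
    """Filled circle, white inside / transparent outside, as flat RGBA bytes."""
    pixels = bytearray()
    r = size / 2.0
    for y in range(size):
        for x in range(size):
            inside = (x - r) ** 2 + (y - r) ** 2 <= r * r
            pixels += bytes((255, 255, 255, 255) if inside else (0, 0, 0, 0))
    return pixels


def init_canvas(cont):
    own = cont.owner
    # Replace the first texture of material 0 with a writable buffer.
    tex = texture.Texture(own, 0)
    tex.source = texture.ImageBuff(IMAGE_SIZE, IMAGE_SIZE, 0)
    own["canvas"] = tex
    own["stamp"] = make_circle_stamp(STAMP_SIZE)


def paint(cont, u, v):
    """Plot the stamp at normalised (u, v) and push the texture to the GPU."""
    own = cont.owner
    tex = own["canvas"]
    x = int(u * IMAGE_SIZE) - STAMP_SIZE // 2
    y = int(v * IMAGE_SIZE) - STAMP_SIZE // 2
    tex.source.plot(own["stamp"], STAMP_SIZE, STAMP_SIZE, x, y,
                    texture.IMB_BLEND_LIGHTEN)
    tex.refresh(False)

Call init_canvas once, then call paint each frame with the UV under the mouse, and the painted red channel becomes your trough depth.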

To be of more help, I’d need to know how the effect is currently being achieved.

I’ll give that texture thing a go. Here is my script as well, if you’re interested:


import math


def drag(cont):
    own = cont.owner
    mesh = own.meshes[0]

    click = cont.sensors["rClick"]   # right mouse button sensor
    over = cont.sensors["Over"]      # mouse-over sensor
    hitPos = over.hitPosition

    # One-time setup of the stored previous mouse position.
    if "init" not in own:
        own["init"] = True
        own["prevPos"] = None

    prevPos = own["prevPos"]

    if click.positive and hitPos is not None:
        if prevPos is not None:
            # Movement since the last frame: direction and magnitude.
            vec = prevPos - hitPos
            mag = vec.magnitude
            m = hitPos

            for i in range(mesh.getVertexArrayLength(0)):
                v = mesh.getVertex(0, i)
                dist = math.sqrt((v.x - m.x) ** 2 + (v.y - m.y) ** 2)

                # Only displace vertices near the cursor.
                if dist < 1.25:
                    # Dot of the vertex-to-cursor direction with the movement
                    # vector: positive in front (bump), negative behind (trough).
                    heading = (hitPos - v.XYZ).normalized()
                    dot = heading.dot(vec)

                    # Clamp the per-frame displacement to avoid spikes.
                    offset = min(0.15, max(-0.25, (2 / dist) * dot * (mag * 10) ** 4))
                    v.XYZ = [v.x, v.y, v.z + offset]

        own["prevPos"] = hitPos

    # On release (status 3 = just deactivated), rebuild physics and normals.
    if click.status == 3:
        own.reinstancePhysicsMesh(own, mesh)
        recalc_normals(mesh)  # helper defined elsewhere in the script

Wouldn’t it be much smoother to focus on the path since last frame rather than the current location?

I mean, even if you move your mouse cursor across the whole screen in a single frame, you would still get a gap-free line.

It is more complex to deal with a line rather than a point, but it would be much more precise.
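
Roughly like this, as a sketch (apply_displacement here stands in for whatever you do per position, e.g. the vertex loop or a texture plot; the step size is arbitrary):

import math


def displace_along_path(prevPos, hitPos, apply_displacement, step=0.25):
    # Walk the segment prevPos -> hitPos in small steps so neighbouring
    # stamps overlap no matter how far the mouse moved since last frame.
    seg = hitPos - prevPos
    length = seg.magnitude
    if length == 0.0:
        apply_displacement(hitPos, seg)
        return
    samples = max(1, int(math.ceil(length / step)))
    for i in range(1, samples + 1):
        point = prevPos + seg * (i / samples)
        apply_displacement(point, seg / samples)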

Ok, so I got the texture painting thing set up in my scene, but I’m not sure how to map the vertices to the texture so I can sample it and displace them.

From Python:

import math


def get_closest_pixel(img, x, y):
    '''x and y should be floats from 0 to 1; returns the RGBA values of the
    nearest pixel. Assumes a square RGBA image.'''
    size = img.size[0]

    # Index into the flat RGBA buffer (4 values per pixel, row-major).
    id_num = math.floor(x * size) * 4 + math.floor(y * size) * 4 * size

    rgba = img.image[id_num:id_num + 4]
    return rgba

where img is something like a bge.texture.ImageFFmpeg(IMAGE_PATH) or some other texture source.
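
For example, you could sample at each vertex’s own UV (rough sketch; VERTICAL_SCALE is an arbitrary value):

# Rough sketch: sample the painted texture at each vertex's UV and write the
# red channel into its height. VERTICAL_SCALE is an arbitrary value.
VERTICAL_SCALE = 0.5


def displace_from_texture(own, img):
    mesh = own.meshes[0]
    for i in range(mesh.getVertexArrayLength(0)):
        vert = mesh.getVertex(0, i)
        # vert.u / vert.v are the vertex's coordinates in the first UV layer.
        r = get_closest_pixel(img, vert.u, vert.v)[0]
        vert.z = (r / 255.0) * VERTICAL_SCALE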

Or do it from a vertex shader with something like:

uniform sampler2D displacementMap;
uniform float verticalScale;


void main(){
    // Pass the UVs through and sample the displacement map at this vertex.
    gl_TexCoord[0] = gl_MultiTexCoord0;
    vec4 v = gl_Vertex;
    vec4 tex_at_loc = texture2D(displacementMap, gl_TexCoord[0].xy);
    v.z = tex_at_loc.r * verticalScale;  // use the red channel for height

    gl_Position = gl_ModelViewProjectionMatrix * v;
}
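
If you go that route, the shader has to be bound from Python; a minimal sketch (the pass-through fragment shader is just a placeholder, and VERTEX_SHADER is assumed to hold the GLSL above):

FRAGMENT_SHADER = """
uniform sampler2D displacementMap;
void main() {
    gl_FragColor = texture2D(displacementMap, gl_TexCoord[0].xy);
}
"""


def apply_shader(cont):
    own = cont.owner
    for mesh in own.meshes:
        for mat in mesh.materials:
            shader = mat.getShader()
            # Only upload the sources once.
            if shader is not None and not shader.isValid():
                shader.setSource(VERTEX_SHADER, FRAGMENT_SHADER, True)
                shader.setSampler("displacementMap", 0)  # first texture slot
                shader.setUniform1f("verticalScale", 0.5)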

I also forgot to mention that I’m getting a strange issue when painting in the x direction but not in the y.


	x = int(pos.x * IMAGE_SIZE/16 + IMAGE_SIZE/2 - TRACK_SIZE/2)
	y = int(pos.y * IMAGE_SIZE/16 + IMAGE_SIZE/2 - TRACK_SIZE/2)

Both formulas are the same and the plane has its scale applied, but the result is offset in x for some reason. Is there a clean way to fix it? I tweaked some of the numbers and got it working almost perfectly, but it felt like a hack.

https://image.ibb.co/kKZHnF/gif.gif

game.blend (1.91 MB)

Cast a ray with polyproxy enabled and use the hitUV. Make sure the mapping of the texture is set to UV. That way you don’t have to convert from plane coordinates to texture coordinates, and you know everything lines up.
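
Something like this, assuming the mouse-over sensor from your script (the 5-value return needs poly=2):

def uv_under_mouse(cont):
    own = cont.owner
    over = cont.sensors["Over"]
    # poly=2 makes rayCast also return the polygon and the UV at the hit point.
    hit_obj, point, normal, poly, uv = own.rayCast(
        over.rayTarget, over.raySource, 100.0, "", 1, 0, 2)
    if hit_obj is not None:
        return uv  # 0..1 UV coordinates, ready to plot into the texture
    return None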

mathutils also has a number of functions to help compute UVs, etc.:

https://docs.blender.org/api/blender_python_api_current/mathutils.geometry.html#mathutils.geometry.barycentric_transform
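
For instance, to get a UV from a world-space hit point, given the three corners of the hit triangle and their UVs (rough sketch, helper name is made up):

from mathutils import Vector
from mathutils.geometry import barycentric_transform


def hit_point_to_uv(hit_point, tri_world, tri_uv):
    """tri_world: three world-space Vectors; tri_uv: the matching (u, v) pairs."""
    # Lift the 2D UVs into 3D so barycentric_transform can map between triangles.
    uv3 = [Vector((u, v, 0.0)) for u, v in tri_uv]
    result = barycentric_transform(hit_point,
                                   tri_world[0], tri_world[1], tri_world[2],
                                   uv3[0], uv3[1], uv3[2])
    return result.xy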