If it could be implemented like the new SDF editor shown in the OP, it would be a great modeling tool, offering real-time blending and cutting of 3D shapes along with automatic material adoption. I can also imagine it integrating very well with Geometry Nodes to generate deforming models for animation / motion graphics.
Right now you can already combine the Remesh modifier in Voxel mode with a Boolean modifier to achieve similar dynamically updating results, but that approach offers no automatic material / vertex color adoption and no fine control over the shape blending, to name a few things.
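To illustrate the kind of blending control SDFs offer that a mesh Boolean can't: with distance fields, a union is just `min(d1, d2)`, and swapping in a "smooth minimum" (the polynomial variant popularized by Inigo Quilez) gives a tunable melt between shapes. This is a minimal sketch, not how any particular editor implements it; the blend parameter `k` is illustrative.

```python
import math

def sphere_sdf(p, center, radius):
    # Signed distance from point p to a sphere: negative inside, positive outside.
    return math.dist(p, center) - radius

def smooth_union(d1, d2, k):
    # Polynomial smooth minimum: k = 0 degenerates to a hard Boolean union,
    # larger k melts the shapes together, metaball-style.
    h = max(k - abs(d1 - d2), 0.0) / k if k > 0.0 else 0.0
    return min(d1, d2) - h * h * k * 0.25

# Two overlapping unit spheres, sampled at a point between their centers.
p = (0.5, 0.0, 0.0)
d_a = sphere_sdf(p, (0.0, 0.0, 0.0), 1.0)
d_b = sphere_sdf(p, (1.0, 0.0, 0.0), 1.0)

hard = min(d_a, d_b)          # ordinary Boolean union
soft = smooth_union(d_a, d_b, 0.5)  # blended union, bulges in the overlap
```

The smooth union is always at least as "inside" as the hard one, which is exactly the bulging weld you see when SDF shapes blend.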
I can totally see it becoming another modeling mode alongside Edit and Sculpt Mode, though it would likely need to be destructive. Geometry Nodes intuitively seem like a very good fit indeed!
When I watched the video, it somehow felt like precise Metaballs. It might offer another kind of flexibility that doesn’t exist in Blender yet.
Yes, right, I totally forgot about that. I never really used MS Paint 3D; I disliked it far too much. But yeah, I see it the same way you do: a little rough-out doodler like teddy could have been really nice in a more evolved state, so it's cool that it's in ZBrush now.
Just read a very interesting 2017 article from an Oculus Medium dev, and it turns out that SDFs have been an integral part of Medium since back then. Medium uses a very clever 'Transvoxel' system that dynamically converts back and forth between signed distance fields and mesh triangles.
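The core step any SDF-to-mesh converter performs (Transvoxel included) is sampling the field on a grid and locating where it changes sign along grid edges; the surface vertex on an edge is found by linear interpolation between the two samples. A hedged sketch of just that edge step, using a unit sphere as the field:

```python
import math

def sdf(x, y, z):
    # Unit sphere centered at the origin: negative inside, positive outside.
    return math.sqrt(x * x + y * y + z * z) - 1.0

def edge_crossing(p0, p1):
    # If the SDF changes sign between grid points p0 and p1, return the
    # linearly interpolated surface point on that edge, else None.
    d0, d1 = sdf(*p0), sdf(*p1)
    if d0 * d1 > 0.0:
        return None  # both samples on the same side, no surface here
    t = d0 / (d0 - d1)
    return tuple(a + t * (b - a) for a, b in zip(p0, p1))

# An edge running from inside the sphere to outside along +x
# crosses the surface exactly at (1, 0, 0).
v = edge_crossing((0.0, 0.0, 0.0), (2.0, 0.0, 0.0))
```

A full mesher then connects these edge vertices into triangles per cell (marching cubes tables, or Transvoxel's extended tables for seams between resolution levels), but the sign-change test above is the part that makes the field-to-mesh round trip cheap.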
They first developed the technique while working on the Kelvinlets-based Move tool (the Elastic Deform tech in Blender). The Move tool in Medium is amazing, because it offers completely unrestricted 6DOF movement in true 3D space (VR). Being able to move behind/over/around/back/forth while using your wrist to twist and bend in 3D space really has to be experienced to be believed.