Would these work? - Cloth System, General Ideas

No script, just an idea for those far more experienced than me.
What I was thinking was this - now that we have mesh deflections and whatnot, how about using particles as a base for cloth simulation? It would run like this:

  1. User selects Mesh
  2. Script records position of vertices of mesh at start of script
  3. Script converts vertices into particles (basically free-floating vertices)
  4. Script converts another object into a deflector
  5. Script moves forward one frame and calculates the new position of each particle relative to the Deflector mesh (plus some user-defined external force, like gravity at X.XX m/s in a given direction relative to the mesh)
  6. Script turns the particles back into vertices and rejoins them into the mesh.
  7. Loop, continue.
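The loop above can be sketched in pure Python, with no Blender API at all (particle access being the open question). This is a minimal sketch under assumptions of my own: vertices are plain [x, y, z] lists, the ‘deflector’ is simplified to a ground plane at z = 0, and every name (`step`, `GRAVITY`, `pinned`) is illustrative rather than a real Blender call:

```python
GRAVITY = (0.0, 0.0, -9.8)   # user-defined external force, metres/s^2
DT = 1.0 / 25.0              # one frame at 25 fps

def step(positions, velocities, pinned=frozenset()):
    """Advance every free particle one frame; clamp at the ground plane."""
    for i, (p, v) in enumerate(zip(positions, velocities)):
        if i in pinned:
            continue                      # 'pinned' vertices stay attached
        for axis in range(3):
            v[axis] += GRAVITY[axis] * DT
            p[axis] += v[axis] * DT
        if p[2] < 0.0:                    # stand-in deflector: the ground plane
            p[2] = 0.0
            v[2] = 0.0
    return positions, velocities
```

Calling `step` once per frame in a loop gives steps 5–7; real deflection against an arbitrary mesh would replace the ground-plane clamp.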

Of course, some work would have to go into making sure the particles don’t move too far apart (shear and stretch forces, so the mesh doesn’t spill all over the place), and some work on the deflection, but I was thinking about it and it made sense to me.
Basically, the particles would ‘fall’ around the deflector and be converted back into a mesh. Some additions would be the ability to ‘pin’ certain vertices (particles) to the deflector so that the cloth can be fixed to something, and some sort of ‘Wind’ direction or something.

Anyone think this could be done? Anyone want to attempt it, or give me some pointers as to how to do it? (Obviously, my natural assumption would be: reference the object, collect the vertex positions, whack them in an array, loop through converting each one to a particle with a new position, back to a vertex, end loop, blah blah blah.) I’ve programmed before, but mainly PHP and JavaScript (gotta love being a web designer %| ), so I don’t know much Python, but I am willing to learn.

Ideas? Comments? Tomatoes?

Python doesn’t have access to where particles are.

Besides, you could write the collision stuff yourself and use the math described for the ragdolls and cloth in Hitman [1].

Ah, a definite hurdle indeed :-? %|
Oh well. I suspected as much - browsing the Python reference didn’t really make it clear whether or not you could reference an individual particle’s location. But is there any way to access the deflector mesh via Python? If I can’t do it with particles, I could try to do it by manually assigning particle-like properties to vertices (which I assume can be referenced individually in Python, yes?) and then maybe using the native properties of a deflector mesh as a basis - or simply defining a bunch of deflector-like properties on a normal mesh (on a face rather than vertex or edge basis). How about this, then, as a new script method:

  1. User selects a mesh that is to become cloth.
  2. Script is run, and the coordinates of the selected mesh are arranged into an array. User-defined variables are collected (native gravity, tensile strength, etc.)
  3. Using another mesh as a base, the calculations are made. Edges are removed from vertices (to keep calculation times down, although I’m not really sure if that’s necessary), and each vertex is moved in the X direction (native down) at Y (the entered value for gravity) in one frame. Vertices closest to the opposing mesh are ‘attracted’ to it, so they don’t spread much, helping obtain a drape-like effect. Natural noise is accounted for.
  4. Vertices are then re-joined by edges. Each edge length is stored in another array, and length is preserved between vertices. Extra displacement may occur at this point, taking into consideration the user-defined ‘stretch’ of the cloth.
  5. Collision detection! Yay %| Edges that intersect another are moved outward, away from the deflector mesh, and down in the direction of the user-defined ‘gravity’. Depending on the user-defined ‘Cling’ of the material, the cloth may end up far from the deflector mesh or very close to it.
  6. Loop, continue for (x) Frames.
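Step 4 - preserving edge lengths with a user-defined ‘stretch’ - is the part that sketches most concretely. A hedged pure-Python version, again with vertices as plain [x, y, z] lists, no Blender API, and all names (`satisfy_edges`, `stretch`) my own inventions; it only corrects over-stretched edges, not compressed ones:

```python
import math

def satisfy_edges(positions, edges, rest_lengths, stretch=0.0):
    """edges: list of (i, j) index pairs; rest_lengths: matching floats
    recorded when the script first ran (step 2)."""
    for (i, j), rest in zip(edges, rest_lengths):
        a, b = positions[i], positions[j]
        d = [b[k] - a[k] for k in range(3)]
        length = math.sqrt(sum(c * c for c in d)) or 1e-9
        target = rest * (1.0 + stretch)   # the cloth may stretch this far
        if length <= target:
            continue                      # not over-stretched; leave it
        # move both endpoints half the excess toward each other
        corr = 0.5 * (length - target) / length
        for k in range(3):
            a[k] += d[k] * corr
            b[k] -= d[k] * corr
    return positions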

Ideas? Would that work better?

Other considerations:

  1. The user may want to not only define a Cloth and Deflector mesh, but also a Floor mesh, so that the fabric will fan out and collect rather than just hang straight down.

  2. A firm/puffy cloth should have a natural ‘jiggle’ to it and should flex a minuscule amount, like a basic muscle system or something. It shouldn’t really obey the common laws of physics like the rest of the Cloth meshes - it should be treated as more of a solid mass. That, erm, jiggles.

  3. Of course, the above things could turn this cloth system into a skin/muscle simulator if you want.

Anything else? Am I on the right track here? Thanks for the linkage z3r0d.

Just more ideas now. Of course, these are just musings, and not at all researched, so regard them as material for research and discussion rather than writ.

A) A Fluid System of some sort. I remember some kind of volume equation for solid objects from school last year - in my opinion, the basis of any fluid calculation. Here’s the gist of it:

  1. The mesh that is to be turned into fluid is selected. Easy thus far, but it goes downhill fast.
  2. Script is run. The first thing done is the calculation of the exact volume of the mesh, which is stored in some omnipotent variable. The script then checks to see if the mesh is enclosed within another mesh - like a glass or something.
    Another idea is that the initial object selected is a plane (as in the surface of the fluid), and a body is then constructed from the mesh around it, using the faces of that mesh as a base. Sounds hard.
  3. A set of universal X, Y and Z coordinates is then found (through the cursor, maybe? The current view?). These are used for the direction the fluid will… erm, ‘slosh’ in.
  4. The Fluid is parented to the Container, and the script is called when the animation is done. Basically, the mesh will deform in the direction of motion, without intersecting the Container, and deform so as to retain its volume.
  5. The mesh would also be able to ‘bleed’ into a selected mesh, for things like absorption.
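For what it’s worth, the ‘omnipotent variable’ in step 2 is computable: the volume of a closed triangle mesh is the sum of signed tetrahedra formed by the origin and each face (the divergence-theorem trick). A self-contained sketch, assuming the triangles are wound consistently outward; the function name is my own:

```python
def mesh_volume(vertices, triangles):
    """vertices: list of (x, y, z) tuples; triangles: list of index
    triples with a consistent outward winding."""
    total = 0.0
    for i, j, k in triangles:
        (ax, ay, az), (bx, by, bz), (cx, cy, cz) = (
            vertices[i], vertices[j], vertices[k])
        # signed tetra volume = a . (b x c) / 6
        total += (ax * (by * cz - bz * cy)
                  + ay * (bz * cx - bx * cz)
                  + az * (bx * cy - by * cx)) / 6.0
    return abs(total)
```

The fluid deformation would then be constrained so this number stays fixed from frame to frame.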

Sounds stupid I know, just rambling here.

Blah blah blah.
Crap crap crap :wink:

How about this: some basic architectural tools for Blender? Basically, you set the scale in the script so that 1 BU (Blender Unit) equals 1 WU (Whatever Unit - inches, feet, centimetres, metres, astronomical units). The script can then tell you how tall the selected object is, its physical area (of the total object or an individual face, or whatever), its volume, etc. Simple but effective.
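A minimal sketch of that 1 BU = 1 WU idea. The unit table and function names here are my own assumptions, nothing from Blender; `face_area` shows the per-face measurement half of it:

```python
UNITS = {"m": 1.0, "cm": 0.01, "ft": 0.3048, "in": 0.0254}  # metres per unit

def bu_to_unit(value_bu, scale_bu_per_m=1.0, unit="m"):
    """Convert a length in Blender Units to the chosen unit,
    given how many BU the user says make one metre."""
    metres = value_bu / scale_bu_per_m
    return metres / UNITS[unit]

def face_area(a, b, c):
    """Area of a triangular face from three (x, y, z) points:
    half the magnitude of the cross product of two edge vectors."""
    u = [b[i] - a[i] for i in range(3)]
    v = [c[i] - a[i] for i in range(3)]
    cx = u[1] * v[2] - u[2] * v[1]
    cy = u[2] * v[0] - u[0] * v[2]
    cz = u[0] * v[1] - u[1] * v[0]
    return 0.5 * (cx * cx + cy * cy + cz * cz) ** 0.5
```

Summing `face_area` over the object’s faces (converted through `bu_to_unit` squared) would give the total surface area report.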

I also had another idea - another menu for Blender (you know, up the top - things like File, Render, Help) called “Presets” or something like that. Basically, any scripts that either create or manipulate something could go there - things like Beast, MakeHuman, camera scripts, etc. We have a place for Import/Export scripts, but not for ones like these, so it would be nice to have. A lot of other 3D packages have menus like this, so it would bring more structure and appeal to Blender.

This isn’t really directly connected to this thread, but it could be, kind of, if you think about it:

The idea I had earlier was a script that worked like this: select vertices and assign them to a certain general muscle group (i.e. upper arm, lower arm, upper leg, lower leg, etc.) - basically putting ALL of the character’s vertices into the proper groups. The script would dictate how the muscles act upon each other when animated. Not animation automation, really, just the flexing and relaxing interactions between neighbouring muscle groups. It would be cooler still to have it work for bipedal characters as well as multi-pedal creatures. Just basic bone structures.
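The bookkeeping side of that could look something like this - plain dicts standing in for vertex groups, plus a neighbour table recording which groups flex against each other. Everything here (names, structure) is a hypothetical sketch, not any existing Blender feature:

```python
# hypothetical vertex-group assignments: group name -> vertex indices
groups = {
    "upper_arm": {0, 1, 2, 3},
    "lower_arm": {4, 5, 6, 7},
}

# which muscle groups interact when animated (order-free pairs)
NEIGHBOURS = {("upper_arm", "lower_arm")}

def interacting(g1, g2):
    """True if two muscle groups are declared neighbours."""
    return (g1, g2) in NEIGHBOURS or (g2, g1) in NEIGHBOURS

def group_of(vertex_index):
    """Find which muscle group a vertex was assigned to, if any."""
    for name, members in groups.items():
        if vertex_index in members:
            return name
    return None
```

The actual flex/relax deformation would then only be computed between pairs where `interacting` is true.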

I have NO idea if there’s anything out there that could do this or how hard it would be to program something like that. I’m new to Blender and have never coded a THING in Python. :wink: I apologize if this is too off topic or WAY too off-base, just a passing thought I had today that would be very cool to have available. Sorry :stuck_out_tongue: