Can I pick up a rigid ball with a hand using physics?

Hi all,

So, I have developed a virtual hand that’s controlled by a glove I made. I was planning to use the hand to pick up rigid objects using game engine physics, but so far I have been unsuccessful: every time I try to pick up an object, it slips away and I can’t grasp it. I am thinking of learning some other way of doing this, as I have seen other games use parenting and similar methods to let a character pick something up. But I would prefer to keep doing it with physics, since I plan to send feedback from the virtual hand to the glove later on, and that feedback scheme depends on physics.

So I was wondering if anyone could take a look at my blend file and let me know whether I am doing something wrong with the physics, or if there is a way of picking up a rigid object with the hand using physics?



hand.blend (431 KB)

Turn on “Show Physics Visualization” in the “Game” menu at the top of the screen. You will see that only the visual mesh changes with the actions, not the physics mesh. I don’t know if it is possible to use Python to get the physics mesh to change in real time.
Parenting is probably your best bet.

It is likely that you need separate objects for the fingers. That makes it easier for Bullet to calculate the collisions.

Here is a quick demo:


grab.blend (151 KB)

Thank you Monster & John316 for your answers. I didn’t know about the physics Mesh before.

The method I was using to move the hand mesh was assigning the values coming from the Wiimote (6DOF tracking) to the “hand” bone, using Python. After I learned there is a physics mesh, I noticed that it is not moving at all. I then tried parenting a cube to the armature and moving the cube (again via Python) with the Wiimote data. This way the physics mesh moves, but the compound mesh becomes huge and out of control, and its motion (6DOF tracking) becomes inaccurate. Lastly I tried moving the hand bone as in the original approach, but creating “finger tip follower” objects that follow the finger tips while being separate objects with no parents. But the followers don’t follow the finger tips properly.
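For anyone curious, the calibration step I use before assigning the tracker value to the bone looks roughly like the sketch below. This is plain Python for illustration only; `RAW_SCALE` and `RAW_OFFSET` are made-up placeholders, not values from my file, and you would have to calibrate them for your own tracker.

```python
# Hypothetical calibration step: convert a raw 6DOF tracker position
# into Blender world coordinates before assigning it to the hand.
# RAW_SCALE and RAW_OFFSET are placeholder values (assumptions),
# not taken from the actual blend file.

RAW_SCALE = 0.5                # tracker units -> Blender units (assumed)
RAW_OFFSET = (0.0, 0.0, 1.0)   # origin shift above the floor (assumed)

def tracker_to_world(raw_pos):
    """Map a raw (x, y, z) tracker reading into world space."""
    return tuple(r * RAW_SCALE + o for r, o in zip(raw_pos, RAW_OFFSET))
```

The real script then writes the result to the bone each logic tic; only the scale/offset mapping is shown here.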

This is becoming very frustrating, and I would greatly appreciate it if someone could offer a solution. The “cube” and “finger tip follower” approaches are uploaded below as cube_approach.blend and approach.blend.

If you change the finger tip followers from Dynamic to Static, it seems to work (except for the thumb). I don’t have a Wiimote that I can plug into my computer to test it with. :frowning:

Edit: the “thumb_tip” object is not centered on its geometry. Select it, go to the Editing panel (F9) and click “Center New”.

I think I just found a way to make it work! Please wait…

I added an Always sensor to the hand mesh, with true level triggering so that it constantly repeats, and hooked that to a “Replace Mesh” actuator with its physics option enabled, and now the physics mesh bends with the armature!
Note that the hand cannot use a Convex Hull bounds type or be “added to parent”, or it will not update.
Here are the fixed blends. The first one is really slow because the heavy geometry of the whole hand is updated every frame. If you change the “0” to “1” in the Always sensor’s frequency, it will run much faster. You might want to do this for the fingertips in the other file as well.
I also added an “X” key sensor to the armature just so I could test it.
Oh, and I think you can remove that sensor and its controllers from the second file, since I removed those objects.
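For reference, here is a rough model of what the Always sensor’s frequency setting does, as I understand it (plain Python for illustration only, not BGE code): with true level triggering, a freq of f skips f logic tics between pulses, so the sensor fires every f + 1 tics.

```python
# Rough model of the Always sensor's frequency setting (illustration
# only, not actual BGE code): with true level triggering, a freq of f
# skips f logic tics between pulses, so the sensor fires every f + 1
# tics.

def sensor_fires(tic, freq):
    """True on the logic tics where the sensor would pulse."""
    return tic % (freq + 1) == 0

# freq 0 -> fires every tic (60x per second at the default tic rate)
# freq 1 -> fires every other tic, halving the Replace Mesh workload
```

That is why bumping the frequency from 0 to 1 noticeably speeds up the file: the expensive mesh replacement runs half as often.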


hand 2.blend (140 KB)
finger_tip_follower_approach 2.blend (147 KB)

Thank you, John. That’s very cool; the physics mesh now follows the display mesh in real time. But why is it that when I separate the tip of the index finger and set the rest of the hand to “No Collision”, the tip doesn’t collide with the objects?


hand 3.blend (439 KB)

Also, in the logic panel, click Advanced and set the margin limits. Under the Game menu at the top of the screen, enable the physics visualization to see what’s going wrong.

That is strange :confused: … I tried everything I could think of and nothing worked. :spin:
If doing the whole hand is too processor-intensive, I would suggest doing what Monster was talking about: making invisible objects that mimic the movement of the hand, but are controlled by IPOs rather than by the armature.
Here I did the index finger. I will try to implement the movement in the GE, but it may take me a while to figure out the Python ;).
Hope this helps.


hand invabounds.blend (143 KB)

Thanks a bunch, John :). I’ll try to figure out how to use the IPOs in the GE.

Hi guys, I came across this topic and I’m really interested, as I’m trying to do exactly the same thing!
So first of all, thanks for the blend files and explanations.
I’m still missing something: when I open the hand 2.blend file, the actuator attached to the hand mesh is not “Replace Physics Mesh” as expected but only “Replace Mesh”; and if I check it in game mode, only the original mesh moves, not the physics one.
BTW, I tried a couple of tests with the hand 3.blend file and nothing happens when the fingers touch the ball.
So what I’m asking myself is: what did I miss? Is it a version problem (I’m working on 2.49), or is it just me? :slight_smile:

Many thanks in advance; I hope I can give this help back one day.

I just downloaded hand 2.blend and it works for me. (It should; I uploaded it.) Are you using 2.49b? That is what I am using.
Yes, the actuator is set to “Replace Mesh”, but notice that the button under it that says “Phys” is activated. If you do not see this “Phys” button, then you are probably not using 2.49b.
Also note that this method of reloading the physics mesh 60 times per second is heavy on the CPU.
The solution to the finger tip approach (hand 3.blend) was in another thread.
(See the last post.)
I would be happy to help if you have any further questions. :slight_smile:

Hi John316 and thanks for the quick answer!

Obviously I didn’t have the correct version; this morning I installed 2.49b, and there is the button :slight_smile:
So I tested the hand 2.blend file and yes, it works; so does the fixed.blend file you gave in the other topic.
I studied it to understand how it works and how I could reproduce/adapt these tips to my problem, but so far it still doesn’t work…
Here is a sample file with a simple armature-controlled mesh trying to smash boxes; please be kind, it’s my first BGE test.
The thing is that the top of the mesh stays in its start position while the rest of it moves, so we get a distortion; and on top of that, the collisions don’t make the boxes fall as one would expect :spin:
I tried to copy/adapt the settings from hand 2, but I think I missed something.

(BTW, thank you veeeeery much for your help. This is for work, and yesterday I was feeling very bad about telling my boss there was no way to make it work. Be sure I’ll credit you and the Blender Artists forum for this :yes:)


collisions.blend (239 KB)

Set all objects to “No Sleeping”; that fixes it!
(This may fix some older problems too! Thanks for asking! :slight_smile: )

I tried that and yes, the boxes got smashed! But as I still had the mesh deformation problem with my “putter”, I redid the skinning and now it’s fine.
I ran some tests, and it seems that the only object that actually needs to be set to No Sleeping is the “putter”. If anybody wants to test it themselves, here’s the blend file.

Thanks again, John316 (and siamak123123!). I attach a glimpse of my work too (I started to add some virtual fingertips :slight_smile: )


fixed collisions.blend (241 KB)

WOW! That hand looks great! :yes:
About that file, though: if you wait a few seconds, until the boxes fall asleep, the putter just goes right through them and does not knock them down. It is probably best to set those to No Sleeping as well. :wink:
Good luck with your project!

Yep, you’re right :slight_smile:
After a few tests and developments I have a skeleton with fingertips which can grab stuff; that’s already good news! Now my problem is (’cause there’s ALWAYS a problem) that I can’t actually “pick up” things with only the thumb and index finger; objects act like soap and slip out of the grip as soon as they can. I suppose it has something to do with friction, so I tried and tried, but there was no way to fix it!
I also observed that the objects were pushed away before the fingers’ bounds and the objects actually touched; is there an easy explanation for that?
I’m sorry but I can’t put the blend file this time :frowning:
So here is a screenshot, and I should add that I set the fingertip margin to 1.

The hand is at the limit position; if I get any closer to the cube, it acts as if they had collided.

I switched the fingertips to Soft Body (instead of Dynamic, as they were before) and switched on the optional “Cluster Collision” with other soft bodies and rigid bodies; by trial and error I set the margin to 0.4, and it’s not perfect but better.
Now my main problem is to drag things with friction only!


The reason for this soap effect is that the finger tips do not actually move; they “warp”, so to speak. When updating a mesh deformation caused by an action, Blender does not treat the change in position as a movement. Instead, the object simply disappears from its old location and reappears at its new location.
What you can do is make it so that the hand only grabs the object using an action, and have the lifting motion come from either a Motion actuator or an IPO.
Then you should be able to use Dynamic objects with friction instead of soft bodies, which will probably also be less processor-intensive.
BTW, a margin of 1, or even 0.4, is quite high. The default is 0.06, and I often use 0.001. You can safely use 0.001 if the objects that the finger tips come into contact with have a volumetric bounds type (anything except Triangle Mesh, which is the default if nothing else is selected).
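A simplified way to picture the margin (this is only an illustration, not Bullet’s actual narrowphase code): Bullet treats each convex shape as inflated by its collision margin, so with large margins, contact is reported while the surfaces are still visibly apart.

```python
# Illustrative model only, not Bullet's real collision code: each
# convex shape behaves as if inflated by its margin, so two spheres
# register contact while there is still a visible gap between their
# surfaces when the margins are large.

def spheres_collide(dist_centers, r1, r2, m1, m2):
    """Contact test for two margin-inflated spheres."""
    return dist_centers <= (r1 + m1) + (r2 + m2)

# Two unit spheres 2.5 apart do not touch visually, but a margin of
# 1.0 on one of them makes contact get reported early.
```

That matches what you saw with a margin of 1: the fingertip “collides” a full Blender unit before it visually reaches the cube.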
Hope this helps you! :slight_smile:

You have no idea :o
I tried your setup at once and there we go: I can catch my red cube, pull it over, and release it, all with perfectly satisfying natural behaviour.
Until now I was moving the whole hand with game commands (keyboard ZSQD) and using preset actions to catch the objects and drag them; deleting the dragging part of my action and replacing it with a “manual” command made it work.

Many many thanks for that!

I noticed the logic takes a lot of CPU to update the physics mesh. On my computer it takes ~20% the whole time the file is running! :eek: That uses valuable resources that could be better spent on other things. So I made a test file that only updates the mesh while the action is playing. Now, when the actions are not playing, the logic only takes ~0.1%. :eyebrowlift2:
If you are using a property to play the action, all you have to do is add a Property sensor, with true level triggering, that senses when that property changes, and connect it to the Replace Mesh actuator on the appropriate fingertip. (You can cross-connect from one object to another if you select both.) Otherwise you can add a FrameProp and sense when that changes. See the attached demo.
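In plain Python, the change-detection idea looks roughly like the sketch below. The class and its names are invented for illustration; in the BGE, this is all done with a Property/FrameProp sensor (true level triggering) wired to the Replace Mesh actuator, not with a script.

```python
# Plain-Python sketch of "replace only when the action frame changed".
# MeshUpdater and its attributes are made up for illustration; in the
# BGE a Property or FrameProp sensor with true level triggering does
# the change detection and drives the Replace Mesh actuator.

class MeshUpdater:
    def __init__(self):
        self.last_frame = None
        self.replace_count = 0   # stands in for firing Replace Mesh

    def tick(self, action_frame):
        """Call once per logic tic with the action's frame property."""
        if action_frame != self.last_frame:
            self.last_frame = action_frame
            self.replace_count += 1
```

While the action is paused the frame property stops changing, so nothing fires and the CPU cost drops to almost nothing.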
I think I will make a video tutorial on this soon.
Just a thought: is it possible to use Python to make the physics mesh update more efficiently? Maybe make it use the same mesh as the visual mesh? And could it calculate actions as vector movements instead of just “warping”?


low_CPU.blend (173 KB)

Once again you’re right, assuming you move one fingertip at a time; if you move everything at the same time, the only phase where you can lighten the CPU load is the “transportation” phase. But it’s still a good improvement :slight_smile:
For the mesh thing, I played a bit with the Gfx and Phys buttons in the Replace Mesh actuator and didn’t find a better way than what you described. Maybe in 2.5?
Now I didn’t go into Python coding yet… :wink: