Mocap Data from MPU6050 gyro

Hi,
I’ve created a mocap glove (using an ESP32 and MPU6050s). I noticed my data is in global space, while the hand in the metarig is local and dependent on a chain of previous bones. So my problem is that the sensors collecting my data sit on the fingertips (and one on the hand, 6 total), and each sensor reports its orientation relative to [0, 0, 0].

If there is a way to convert the global-space rotation to pose space, I think I can expand this into a full-body mocap suit.

I did find a roughly 10-year-old reference on how to create an empty at the bone’s location that carries the bone’s global-space values, but I need the reverse: a way to apply my global values to the fingertip bones so they work in pose space.

Any suggestions would be greatly appreciated.


Cool stuff.

The easiest and most boring way is to generate constraints for the bones programmatically.

import bpy

# Look up the pose bone to drive and the empty that carries the sensor data
bone = bpy.data.objects['Armature'].pose.bones['Bone']
target = bpy.data.objects['Empty']

print('constraints', len(bone.constraints))

# Add a Copy Location constraint pointing the bone at the empty
con = bone.constraints.new(type='COPY_LOCATION')
con.name = bone.name + '-to-' + target.name
con.target_space = 'LOCAL'
con.target = target

Some other resources if you want to try something; there’s a rough sketch of the space conversion after the links:

https://docs.blender.org/api/current/bpy.types.Bone.html?highlight=matrix#bpy.types.Bone.convert_local_to_pose
https://github.com/igelbox/blender-retarget/blob/master/animation_retarget/core.py#L402
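
For instance, here is a minimal sketch of how Object.convert_space (the same family of calls as the convert_local_to_pose link above) could take a world-space sensor rotation into a bone’s local space. The armature name, bone name, and the quaternion are placeholders, so treat this as a starting point rather than a working setup:

import bpy
from mathutils import Quaternion

arm = bpy.data.objects['Armature']   # placeholder armature name
pbone = arm.pose.bones['Bone']       # placeholder bone name

# Pretend this is a world-space orientation (w, x, y, z) read from the MPU6050
sensor_quat = Quaternion((1.0, 0.0, 0.0, 0.0))
world_mat = sensor_quat.to_matrix().to_4x4()

# Convert the world-space matrix into the bone's rest-relative local space
local_mat = arm.convert_space(pose_bone=pbone, matrix=world_mat,
                              from_space='WORLD', to_space='LOCAL')
pbone.matrix_basis = local_mat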

Thanks, I’ll give it a try. I tried Copy Location through the interface, but the result seems to be all over the place. For testing, I made a “T” object and labeled the sides x, y, z so I can identify the movement, then assigned the data recorded from one sensor to that object. It responds as expected. I then made two bones and copied the rotation to the second bone. It rotates as expected, but when I switch the axes to get it into the right orientation, everything breaks and the bone spins around, jerky and unnatural. Maybe the “LOCAL” space property will sort things out.

Here’s a short video that shows the situation I’m trying to resolve. Keep in mind, I’ve tried all the simple things like rotating 90° or swapping axes. I’m looking for some way to apply my “world space” data, which has an arbitrary orientation, to my bones.

Yeah, if you consider that each armature can have any random bone orientation, it results in an unlimited set of problems. The best idea is to forcefully re-orient the bones into your own preferred orientation (Shift-N to recalculate roll) as a safety measure, so you get orientations based on a configuration you have tested.
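
In script form the same clean-up could look roughly like this (a sketch; it assumes the armature is the active object):

import bpy

# Same idea as Shift-N: recalculate the roll of every bone so each rig
# you test against shares one known orientation convention
bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.armature.select_all(action='SELECT')
bpy.ops.armature.calculate_roll(type='GLOBAL_POS_Z')
bpy.ops.object.mode_set(mode='POSE')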

After doing some tests as well, I noticed that the orientation problem you are facing is indeed that Copy Location/Rotation results in a raw value transfer. The Child Of constraint might be a better fit here; it at least allows you to do additional rotations on the object.
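
Setting it up programmatically could look something like this (object and bone names are placeholders again):

import bpy

bone = bpy.data.objects['Armature'].pose.bones['Bone']
target = bpy.data.objects['Empty']

# Parent the bone to the sensor empty through a Child Of constraint
con = bone.constraints.new(type='CHILD_OF')
con.target = target

# Set the inverse so the bone keeps its current pose as the starting point
con.inverse_matrix = target.matrix_world.inverted()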

Left has Child Of and right has Copy Location/Rotation. The model on the left was facing the other way by default (the top of the head was on the other side), so I did an extra 180° rotation on the Y.
[screenshot: untitled-blendshot]

P.S. However, the offer is still on for a proper system that handles irregular bone transforms perfectly. 🙂

Thanks. Child Of does help offset the bone without wrecking the original coordinates. However, I’m finding that my real problems are gimbal lock and a gyroscope calibration that needs to be more accurate. When I change the code back to quaternions and fix my data source, I’ll post the results. Thanks for your help; it’s really helped isolate the problems.
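
Roughly what I mean by switching back (the bone name is just a placeholder):

import bpy
from mathutils import Quaternion

pbone = bpy.data.objects['Armature'].pose.bones['Bone']
pbone.rotation_mode = 'QUATERNION'
# (w, x, y, z) straight from the sensor's quaternion output
pbone.rotation_quaternion = Quaternion((1.0, 0.0, 0.0, 0.0))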

I wonder if there is actually a really good project that tackles this subject, for example something in Unity? If you have any ideas, throw them in. 🙂

Interesting. I got the idea from Unreal Engine. I was able to apply quaternions, and that solved a lot of the gimbal lock problems. I’m now trying to figure out how to use IK so that when I make a fist the finger bones fold as expected. I’m getting a lot of strange results.

You can record the matrix transform per bone from Unreal to a text file. Then, supposedly, when you load these matrices into Blender and recreate the pose of the bones, you get the same result (in the perfect case).
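
A minimal loader sketch, assuming a made-up file format of one bone per line (the name followed by 16 floats of a row-major 4x4 matrix):

import bpy
from mathutils import Matrix

arm = bpy.data.objects['Armature']   # placeholder name
with open('bones.txt') as f:         # hypothetical export from Unreal
    for line in f:
        parts = line.split()
        name, vals = parts[0], [float(v) for v in parts[1:17]]
        pbone = arm.pose.bones.get(name)
        if pbone:
            # Assign the pose-space 4x4 matrix, rebuilt row by row
            pbone.matrix = Matrix([vals[i:i + 4] for i in range(0, 16, 4)])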

The next problem is bone alignment. If you replicate the exact bone alignment of the Unreal skeleton, you can at least eliminate a great deal of confusion: instead of a made-up alignment, you are committed to a standard. At first it is important to stick to a convention to pin down the moving parts; you can change it later if needed.

Only one thing left that comes to my mind: whether Blender and Unreal are left-handed or right-handed. Just for the sake of reference, I looked it up now. It looks like Blender is right-handed and Unreal is left-handed, but both have Z up.

https://www.techarthub.com/a-practical-guide-to-unreal-engine-4s-coordinate-system/

So if you flip the matrix on the X axis, you should be able to convert from one to the other nicely.

https://docs.blender.org/api/current/mathutils.html?highlight=matrix#mathutils.Matrix
https://docs.blender.org/api/current/mathutils.html?highlight=matrix#mathutils.Matrix.Scale

import mathutils

# Build a 4x4 matrix that mirrors across the X axis (scale factor -1 along X)
flip_x = mathutils.Matrix.Scale(-1.0, 4, (1.0, 0.0, 0.0))
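
Continuing with flip_x from above, converting a recorded transform then amounts to conjugating by the mirror (unreal_mat is just a placeholder here):

unreal_mat = mathutils.Matrix.Identity(4)   # stand-in for a matrix recorded from Unreal
blender_mat = flip_x @ unreal_mat @ flip_x  # mirror in, apply the transform, mirror back out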

As you can see there is a lot going on here, but at least you can proceed with a strategy. I hope some of this works out.

Super cool, thanks. Tomorrow I have someone jumping in to help me iron out my IK set-up. If I make it far enough to test it out, I’m sure this will be useful.