Rotations with Quaternions and Relationships

I am trying to import animation data (position and rotation) from a .json file for an object with several armatures / bones. In general it works fine and the position is read correctly (as far as I can see). The rotation is provided as quaternions. In general they also look like they are applied correctly… but with some armatures (I have a humanoid object, so I see this especially in the fingers), the rotation is messed up and I get a weird result (even the location might be a bit off).

I am wondering now what I am doing wrong and where I have to fix the code. What I do is straightforward: I read a line of data from the .json file like this and assign it to a quaternion:

rot_data = words_in_line[1].split(',')
rot_data[0] = rot_data[0].replace('[', '')
rot_data[3] = rot_data[3].replace(']', '')
quat = (float(rot_data[0]), float(rot_data[1]), float(rot_data[2]), float(rot_data[3]))

bpy.data.objects[object_actual_name].rotation_quaternion = quat
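As a side note, if the bracketed list in each line is itself valid JSON (an assumption about the file format — the line content below is made up), the `json` module from the standard library can replace the manual bracket stripping:

```python
import json

# Hedged sketch: assumes the bracketed part of the line is valid JSON.
# json.loads parses the numbers directly -- no replace('[','') needed.
line = "[0.707, 0.0, 0.707, 0.0]"       # example line content (invented)
rot_data = json.loads(line)             # -> list of floats
quat = tuple(float(v) for v in rot_data)
```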

Now I have read that my issue might be one of the following:

  • ordering of the components inside the quaternion (w,x,y,z vs. x,y,z,w etc.)
  • rotation is not defined by the quaternion alone but also by the parents and the location / position

Is there an easy-to-understand source available where I can look up how to apply the right rotation to objects?

Thanks in advance!

You need to check the specs for your JSON data and see how it treats the data. If you don’t have documentation for the format, you will have to reverse engineer the spec yourself. It is a good idea to make some simple test cases with a single bone in a known rotation so you can tell the ordering of the quaternion data. After that, make some more simple test cases with two bones parented together to see how the parent-child relationship works.
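The ordering question boils down to where the scalar (w) component sits. Blender’s mathutils.Quaternion expects (w, x, y, z), while many exporters write (x, y, z, w), so a single reorder converts between the two conventions — a minimal sketch (the function name is my own):

```python
# Hedged sketch of the ordering fix: move the scalar part from the
# back of the tuple to the front before handing it to Blender.
def xyzw_to_wxyz(q):
    """Convert an (x, y, z, w) quaternion to (w, x, y, z)."""
    x, y, z, w = q
    return (w, x, y, z)
```

With an identity rotation, (0, 0, 0, 1) in x,y,z,w order becomes (1, 0, 0, 0) — a quick way to spot which convention your file uses.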

Another thing you don’t mention is “bone roll”. Your rotations might be corrupted if you don’t get the roll matched up between the two models.

Hm… I had hoped that this is a rather common “issue” and that there is an example somewhere of how to handle this :-/.

Anyhow, thanks for the hint.

a good article about quaternions … http://joleanes.com/tutorials/flippingless/flippingless_02.php
generally speaking:
every object has a local 4x4 matrix; this matrix is composed of position, scale and rotation (which can be a quaternion)
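As a hedged illustration of that composition (plain Python, no Blender required; `local_matrix` is an invented name): a location vector and a unit quaternion in (w, x, y, z) order combine into one 4x4 local matrix, with the rotation in the upper-left 3x3 block and the location in the last column:

```python
# Sketch: build a 4x4 local matrix from a location and a unit
# quaternion (w, x, y, z), using the standard quaternion-to-matrix
# formula. Inside Blender, mathutils does this for you.
def local_matrix(loc, quat):
    w, x, y, z = quat
    return [
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y),     loc[0]],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x),     loc[1]],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y), loc[2]],
        [0.0, 0.0, 0.0, 1.0],
    ]
```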

assume you have two joints, a parent and a child
pseudo code:
parentMatrix = joint01.localMatrix #so this contains everything about this joint
parentLength = joint01.length #every bone got a length

childMatrix = joint02.localMatrix
childLength = joint02.length

here we can deduce the child’s global position:
anyObjectGlobalMatrix = parentGlobalMatrix * anyObjectLocalMatrix
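The rule above, sketched in plain Python (the 4x4 lists and the `mat_mul` helper are illustrative, not Blender API): the child’s global matrix is the parent’s global matrix multiplied by the child’s local matrix.

```python
# Plain-Python 4x4 matrix multiply, to illustrate
# childGlobal = parentGlobal * childLocal.
def mat_mul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

identity = [[1.0 if i == j else 0.0 for j in range(4)] for i in range(4)]

# A parent sitting at the world origin with no rotation leaves the
# child's local matrix unchanged.
child_local = [[1.0, 0.0, 0.0, 5.0],
               [0.0, 1.0, 0.0, 0.0],
               [0.0, 0.0, 1.0, 0.0],
               [0.0, 0.0, 0.0, 1.0]]
child_global = mat_mul(identity, child_local)
```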

get the child and parent global matrices
if you have the global matrix of the parent, i.e. its orientation, then you can “translate” the position of the parent to get the child position using matrix work, which is similar to camera work
there is a very well known matrix function called lookAt (google it); it simply takes an up vector (0,1,0 for example if y is up), the parent position and the child position (or target position), and it adjusts the orientation of the parent accordingly!!

all I said above is just information to help you understand how matrices work
I know I’m talking about getting the global position while lookAt adjusts the orientation for you. So simply: if you have the orientation and the parent matrix, you can get the child matrix; if you have the parent position and the child position, then you can get the parent matrix (assuming it points at the child all the time)
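A minimal lookAt sketch in plain Python, assuming y-up and one common row convention (conventions differ between libraries and this is not Blender API, so treat it only as an illustration of the idea):

```python
import math

# Hedged sketch: build a 3x3 orientation whose rows are
# (right, up, forward), pointing "forward" from eye toward target.
def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def look_at(eye, target, up=(0.0, 1.0, 0.0)):
    forward = normalize(tuple(t - e for t, e in zip(target, eye)))
    right = normalize(cross(up, forward))
    true_up = cross(forward, right)
    return [list(right), list(true_up), list(forward)]
```

Looking from the origin straight down the +z axis gives the identity orientation, which is an easy sanity check.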

hope this will help you :)

@MohamedSkar So I get that I cannot just assign the quaternion rotation I get out of the .json file to my armatures / bones, as this does not take the parents into account. So even if the numbers are the same, Blender takes the parent into account, i.e. the same numbers do not mean the same effect, right?

Where I am still a bit confused is which data I need from the parent. Is it enough to get the parent’s rotation data and do some magic, or do I have to take the whole matrix into account (including the location)?

Somehow this turns out harder than I thought it would be :-/.

well, this stuff is for OpenGL 4x4 matrices, but it is applicable to any matrix out there
http://www.flipcode.com/documents/matrfaq.html

if you have the orientation of the parent, you also need the location of the parent plus the parent bone’s length (which may not be available)
for the length it is as simple as getting the length of the vector between parent and child
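That distance computation as a tiny helper (the names are illustrative, not Blender API):

```python
import math

# The "length" mentioned above: the Euclidean distance between the
# parent's and the child's positions.
def bone_length(parent_pos, child_pos):
    return math.sqrt(sum((c - p) ** 2 for p, c in zip(parent_pos, child_pos)))
```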

I fear that I am not able to digest all the theory behind quaternions at the moment and write the code based on it.

Is there any example I could take a look at? I have looked into the “io_anim_nuke_chan” add-on, which is nicely documented. Unfortunately it seems like that format provides some information in addition to the quaternion which is used to “translate” the rotation.

Maybe there is also a library within Python where I can simply hand over the child and parent and it provides me with the translated rotation?

This seems to be my biggest challenge with Blender / Python so far.

If the rotation given in your file is in object space, then you can convert it to a matrix and assign it to the object’s matrix. I have not coded in over a year, but when I did I had no success setting rotation_quaternion. The cool thing about setting the matrix is that, given it is in object space, you do not have to worry about relationships (parent, etc.); Blender takes care of all that under the hood.

Thanks for all the help. I was finally able to work it out. The solution looks like this…

Input are 2 sets of numbers:

  1. Location with 3 variables
  2. Rotation as Quaternion with 4 variables

First I save the location data into a vector. Afterwards I turn the vector into a 4x4 matrix:

vec_matrix = Matrix.Translation(vector)

Second I save the rotation similarly to the location data, just passing one additional value, and convert it also to a 4x4 matrix:

quat_quaternion = Quaternion(vector)
quat_matrix = quat_quaternion.to_matrix()
quat_matrix.resize_4x4()

Finally I set the “matrix_world” of the active object to the result of the multiplication (location x rotation):

bpy.context.object.matrix_world = vec_matrix * quat_matrix
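As a plain-Python sanity check of why the order location × rotation is the right one (the helpers below are illustrative, not mathutils): the location ends up untouched in the result’s last column, while the reverse order would rotate the location first.

```python
# Illustrative 4x4 helpers: Translation(loc) * Rotation keeps loc in
# the last column; Rotation * Translation(loc) rotates loc instead.
def mat_mul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def translation(loc):
    return [[1.0, 0.0, 0.0, loc[0]],
            [0.0, 1.0, 0.0, loc[1]],
            [0.0, 0.0, 1.0, loc[2]],
            [0.0, 0.0, 0.0, 1.0]]

# 90-degree rotation about Z as the 4x4 rotation part (example values)
rot = [[0.0, -1.0, 0.0, 0.0],
       [1.0,  0.0, 0.0, 0.0],
       [0.0,  0.0, 1.0, 0.0],
       [0.0,  0.0, 0.0, 1.0]]

world = mat_mul(translation((1.0, 2.0, 3.0)), rot)
```

Note that in Blender 2.80 and later, matrix multiplication in Python uses the `@` operator instead of `*`, so the final line would read `bpy.context.object.matrix_world = vec_matrix @ quat_matrix` there.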