I am trying to import animation data (position and rotation) from a .json file for an object with several armatures / bones. In general it works fine and the position is read correctly (as far as I can see). The rotation is provided as quaternions. In general it also looks like they are applied correctly… but with some bones (I have a humanoid object, so I see this especially in the fingers) the rotation is messed up and I get a weird result (even the location might be slightly off).
I am wondering now what I am doing wrong and where I have to fix the code. What I do is straightforward: I read a line of data from the .json file like this and assign it to a quaternion:
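Since the original snippet isn't shown, here is a hedged sketch of what such a read step might look like. The key names ("bone", "pos", "rot") and the w-first quaternion ordering are assumptions about the file format, not something taken from the actual .json:

```python
import json

# Hypothetical frame record -- the real key names depend on your exporter.
line = '{"bone": "hand.R", "pos": [0.1, 0.2, 0.3], "rot": [1.0, 0.0, 0.0, 0.0]}'

frame = json.loads(line)
# Assumes w-first (WXYZ) ordering -- verify this against your format!
qw, qx, qy, qz = frame["rot"]
print(frame["bone"], (qw, qx, qy, qz))
```

In Blender the naive assignment would then be something like `pose_bone.rotation_quaternion = (qw, qx, qy, qz)` (with the bone's `rotation_mode` set to `'QUATERNION'`), which is exactly the step that ignores parent space.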
You need to check the specs for your JSON data and see how it is treating the data. If you don’t have documentation for the format then you will have to reverse engineer the spec yourself. It is a good idea to make some simple test cases with a single bone in a known rotation so you can tell the ordering of the quat data. After that you have to make some more simple test cases with two bones parented together to see how the parent-child relationship works.
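To illustrate why the ordering matters: the same four numbers interpreted as (w, x, y, z) versus (x, y, z, w) describe different rotations. A minimal, Blender-free sketch you can use against a single-bone test file (`quat_rotate` is my own helper here, not a library function):

```python
import math

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def quat_rotate(q, v):
    # Rotate vector v by unit quaternion q, assumed to be in (w, x, y, z) order.
    # Uses v' = v + 2w*(u x v) + 2*(u x (u x v)) with u = (x, y, z).
    w, x, y, z = q
    u = (x, y, z)
    t = tuple(2.0 * c for c in cross(u, v))
    c2 = cross(u, t)
    return tuple(v[i] + w * t[i] + c2[i] for i in range(3))

# 90 degrees about Z, stored w-first:
q = (math.cos(math.pi / 4), 0.0, 0.0, math.sin(math.pi / 4))
# X axis should land on the Y axis if the w-first interpretation is right.
print(quat_rotate(q, (1.0, 0.0, 0.0)))
```

If your test bone is posed at a known 90° and the rotated axis doesn't come out where you expect, try reading the four values in the other order.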
Another thing you don’t mention is “bone roll”. Your rotations might be corrupted if you don’t get the roll matched up between the two models.
Assume you have two joints, a parent and a child.
Pseudo code:
parentMatrix = joint01.localMatrix  # this contains everything about this joint
parentLength = joint01.length  # every bone has a length
From here we can deduce the child's global position:
anyObjectGlobalMatrix = parentGlobalMatrix * anyObjectLocalMatrix
So get the child and parent global matrices.
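That relation (an object's global matrix is its parent's global matrix times its own local matrix) can be sketched with plain nested lists, no Blender required. The matrices here are made-up illustrative values:

```python
def mat_mul(a, b):
    # 4x4 row-major matrix product a @ b
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

# Parent global matrix: pure translation by (0, 2, 0)
parent_global = [[1, 0, 0, 0],
                 [0, 1, 0, 2],
                 [0, 0, 1, 0],
                 [0, 0, 0, 1]]

# Child local matrix: translation by (1, 0, 0) relative to the parent
child_local = [[1, 0, 0, 1],
               [0, 1, 0, 0],
               [0, 0, 1, 0],
               [0, 0, 0, 1]]

child_global = mat_mul(parent_global, child_local)
print([row[3] for row in child_global[:3]])  # child world position -> [1, 2, 0]
```

The same idea in reverse gives you the conversion the original question needs: childLocal = inverse(parentGlobal) * childGlobal.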
If you have the global matrix of the parent, i.e. its orientation, then you can “translate” the position of the parent to get the child position using matrix math, which works much like camera transforms.
There is a very well known matrix function called lookAt (google it). It simply takes an up vector (0,1,0 for example, if Y is up), the parent position and the child position (the target position), and it adjusts the orientation of the parent accordingly!
All of this is just information to help you understand how matrices work.
I know I’m talking about getting the global position while lookAt adjusts the orientation for you. Simply put: if you have the orientation and the parent matrix, you can get the child matrix; and if you have the parent position and the child position, then you can get the parent matrix (assuming it will point at the child all the time).
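For reference, a minimal lookAt along the lines described above, in plain Python. The row convention and axis choice are illustrative assumptions (real implementations differ in handedness and which axis is "forward"):

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def look_at(eye, target, up=(0.0, 1.0, 0.0)):
    # Build a 3x3 orientation whose forward axis points from eye to target.
    forward = normalize(tuple(t - e for t, e in zip(target, eye)))
    right = normalize(cross(up, forward))
    true_up = cross(forward, right)
    return [right, true_up, forward]  # rows of the rotation matrix

# Parent at the origin looking at a child one unit along +Z
# gives (up to convention) the identity orientation:
print(look_at((0.0, 0.0, 0.0), (0.0, 0.0, 1.0)))
```

Note this sketch breaks down when the target direction is parallel to the up vector, which is why production lookAt functions special-case that.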
@MohamedSkar So I take it that I cannot just assign the quaternion rotation from the .json file to my armatures / bones, as this does not take the parents into account. So even if the numbers are the same, Blender takes the parent into account, i.e. the same numbers do not mean the same effect, right?
Where I am still a bit confused is which data I need from the parent. Is it enough to get the parent’s rotation data and do some magic, or do I have to take the whole matrix into account (including location)?
Somehow this turns out harder than I thought it would be :-/.
If you have the orientation of the parent, you also need the location of the parent plus the parent bone’s length (which may not be available).
For the length it is as simple as getting the length of the vector between parent and child.
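In code that is just the Euclidean distance between the two joint positions, e.g.:

```python
import math

def distance(parent_head, child_head):
    # Bone length ~= distance between the parent's head and the child's head.
    return math.sqrt(sum((c - p) ** 2 for p, c in zip(parent_head, child_head)))

print(distance((0.0, 0.0, 0.0), (0.0, 3.0, 4.0)))  # -> 5.0
```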
I fear that I am not able to digest all the theory behind Quaternions at the moment and to create the code based on that.
Is there any example which I could take a look into? I have looked into the “io_anim_nuke_chan” Add-On, which is nicely documented. Unfortunately it seems like that format provides some additional information alongside the quaternion which is used to “translate” the rotation.
Maybe there is also a Python library where I can simply hand over the child and parent and it provides me with the translated rotation?
My biggest challenge with Blender / Python up to now as it seems.
If the rotation given in your file is in object space, then you could convert it to a matrix and assign it to the object matrix. I have not coded in over a year, but back when I did, I had no success setting rotation_quaternion. The nice thing about setting the matrix is that, given it is in object space, you do not have to worry about your relationships (parent, etc.); Blender takes care of all that under the hood.
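Inside Blender that route would go through mathutils, roughly `mat = Quaternion((qw, qx, qy, qz)).to_matrix().to_4x4()`, then setting the translation column and assigning `obj.matrix_world = mat` (untested sketch, not the poster's exact code). Since bpy isn't available outside Blender, here is the underlying quaternion-to-matrix conversion in plain Python, assuming a unit quaternion in (w, x, y, z) order:

```python
def quat_to_matrix(q):
    # Standard 3x3 rotation matrix from a unit quaternion q = (w, x, y, z).
    w, x, y, z = q
    return [
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ]

# Identity quaternion -> identity matrix
print(quat_to_matrix((1.0, 0.0, 0.0, 0.0)))
```

If the file's quaternions are not unit length, normalize them first, or the resulting matrix will also scale the object.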