Applying rotation from Kinect to Blender


I hope I’m writing in the right place this time.
I’m doing real-time motion capture with a Kinect v2 and Blender. I managed to send rotation data as quaternions from my Kinect application to Blender, but my 3D model in Blender gets totally wrecked.
It seems I must do something with those quaternions before applying them to bones.
Any ideas?
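In case it helps: the usual culprit is a coordinate-frame mismatch. Kinect v2 reports joint orientations in its Y-up camera space, while Blender's world is Z-up, and each bone additionally has its own rest orientation. Below is a minimal sketch of the frame conversion in plain Python (so it runs outside Blender). The 90° basis-change quaternion is an assumption about your setup, and after this step you would still need to move each result into the bone's local space (e.g. via `bone.matrix_local` with `mathutils`) before assigning it.

```python
import math

def qmul(a, b):
    # Hamilton product of quaternions stored as (w, x, y, z)
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def qconj(q):
    w, x, y, z = q
    return (w, -x, -y, -z)

# Basis change: +90 degrees about X maps a Y-up frame (Kinect camera
# space) onto a Z-up frame (Blender world space). This sign is an
# assumption -- verify it against your sensor orientation.
_s = math.sin(math.pi / 4)
_C = (math.cos(math.pi / 4), _s, 0.0, 0.0)

def kinect_to_blender(q):
    # Conjugate the rotation by the basis change: q' = C * q * C^-1
    return qmul(qmul(_C, q), qconj(_C))
```

Sanity check: a rotation about Kinect's up axis (Y) should come out as a rotation about Blender's up axis (Z) with the same angle. Even with the world frames matched, assigning directly to `pose_bone.rotation_quaternion` will still look wrecked unless you also account for each bone's rest pose.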


It might be because of the rig. Would you mind uploading the rig so we can take a look?

I appreciate your help. Here it is.

My C# program sends rotation data, but the Blender script at the link mostly receives the SpineBase and SpineMid bones and only very rarely any others. When I test with another pyosc script, it receives a lot of different bones.