Wrong orientation in Blender from a homogeneous matrix coming from PyOpenGL

Good afternoon guys, I have an animation in PyOpenGL that reads coordinates from a file and then animates an ROV.

This animation calculates the ROV’s homogeneous matrix, containing its position and orientation at each instant of the animation (for example, 10 seconds of animation are 2000 lines of “X Y Z” coordinates).

I’m bringing this simulation into Blender in order to build a more realistic world around the animation.

But I’m having problems.

I have a CAD object that is an ROV, and when I pass this homogeneous matrix at every instant of the animation, the position of the ROV is correct, but its orientation comes out wrong inside Blender.

It seems that Phi, Theta and Psi are reversed. The orientation should go to the left, but it goes to the right. How can I check this and change the orientation of the object, which is being set from the homogeneous matrix?

Would anyone help me understand why this occurs?

Does Blender work with homogeneous matrices differently from OpenGL?

Code: a sample homogeneous matrix:


import mathutils

# 4x4 homogeneous matrix: rotation in the upper-left 3x3,
# translation in the rightmost column.
posture = mathutils.Matrix((
    ( 0.21468954, -0.86920989, -0.4454017,  -1.61063969),
    ( 0.65986419, -0.20712563,  0.72227293,  2.18389297),
    (-0.72006083, -0.44896907,  0.52909279, -7.34360504),
    ( 0.0,         0.0,         0.0,         1.0),
))

self.owner.localTransform = posture


The ROV should be in the right orientation, with its nose elevated. Instead it is inverted, with the front pointing down.

Thanks for helping.

Haven’t got time to look properly at the moment, but it’s probably something simple like row major vs col major differences. Can’t recall which one OpenGL is.
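If that’s it, it’s easy to test with mathutils; a minimal sketch (posture here is just a stand-in for your 4x4 matrix):

import mathutils

posture = mathutils.Matrix.Identity(4)  # stand-in for your 4x4 matrix
posture.transpose()                     # swaps rows and columns in place
# posture.transposed() would return a transposed copy instead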

Thanks for the answer. Would you know what these differences are? I have searched, but I haven’t found anything that settles this question.

I found out what’s wrong: Blender is Z-up, but I need Z to be down.
I tried enabling the “Inverted Z depth” option in the object’s properties, but I did not succeed; nothing changed.
I also tried exporting my CAD object as FBX with the “Up” axis set to “-Z Up” and to “Z Up” and importing it into Blender, but the animation did not change either. It stayed the same, inverted.
How do I make Blender use Z = down, or otherwise invert Z?

Just model and animate your assets upside down. Now z will be down.

I’ve never understood why some developers of engines or systems use z as down, or y as the vertical axis. It just seems bizarre to me.

Then, Smoking_mirror, imagine that the ROV is in the sea. Here in Brazil, a city’s elevation is referenced to sea level, so an ROV navigating at the bottom of the ocean is below the “natural” level. That is why I work with Z negative: if the ROV dives deeper, it goes in the negative direction; if it rises, it goes in the positive direction.

Ah! So this is the reverse of the usual situation?
Usually people model and animate in Blender and then export to another system.
Here you have another system and you’re importing the data into Blender?

If you have the animation data as a file, you could write a Python script to go through it and convert it.
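For example, a minimal untested sketch, assuming one “X Y Z” triple per line (the file names are hypothetical):

# Negate the Z column of an "X Y Z" coordinate file.
with open("coords.txt") as src, open("coords_z_flipped.txt", "w") as dst:
    for line in src:
        parts = line.split()
        if len(parts) != 3:
            dst.write(line)  # pass blank or malformed lines through
            continue
        x, y, z = (float(p) for p in parts)
        dst.write("%.7e %.7e %.7e\n" % (x, y, -z))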

“Invert Z depth” is a rendering option; it affects the z-depth buffer (how far away an object is from the camera, what renders in front and behind, etc.), so it won’t help here.

BTW, is this for an interactive simulation (using the Blender game engine) or a rendered animation (using Blender Render or Cycles)?

Exactly, Smoking_mirror. I have a simulation done in MATLAB, and I import the “XYZ” coordinate file via Python. I already negate Z, but it turns out as in the image in the main post: inverting the value of Z gives me the correct position of the object, but not the orientation. The orientation stays as if it were Z-up.

I’m not sure what would help. You could try digging around in the mathutils functions; there should be something you can use there.

I’m not really sure of the actual data you have, and then what you’re doing with it.
Perhaps all axes are inverted, in which case a simple inverse of the orientation matrix would suffice. See the decompose function (Matrix.decompose) to split the 4x4 matrix into translation, rotation and scale.
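For example, a minimal sketch of that (posture stands in for your 4x4 matrix; scale is ignored here, since your sample matrix has unit scale):

import mathutils

posture = mathutils.Matrix.Identity(4)  # stand-in for your 4x4 matrix
loc, rot, scale = posture.decompose()   # Vector, Quaternion, Vector
loc.z = -loc.z                          # e.g. negate the Z position
rebuilt = mathutils.Matrix.Translation(loc) * rot.to_matrix().to_4x4()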

Otherwise, you can deconstruct the orientation matrix into its axis vectors and then rebuild those in the appropriate format (this is the same as just transforming your matrix, but done the manual way).

More information would help.

Off the top of my head, if you’ve just defined a new coordinate system with Z downwards, then you have one of two possible scenarios (see the sketch after this list):

  1. Just the Z is negated from typical XYZ. In this case, you’d just want to negate the Z orientation component.
  2. The Z and one of X/Y is negated, which requires a rotation of 180° about the unmodified axis.
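A rough, untested sketch of both cases with mathutils (posture stands in for your 4x4 matrix):

import math
import mathutils

posture = mathutils.Matrix.Identity(4)  # stand-in for your 4x4 matrix

# Scenario 1: only Z negated -> mirror about the X/Y plane.
z_flip = mathutils.Matrix.Scale(-1.0, 4, (0.0, 0.0, 1.0))
fixed_1 = z_flip * posture

# Scenario 2: Z and one of X/Y negated -> a proper 180-degree rotation
# about the remaining, unmodified axis (here X, as an example).
r180_x = mathutils.Matrix.Rotation(math.radians(180.0), 4, 'X')
fixed_2 = r180_x * posture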

Thanks for the help, agoose77. I got it from OpenGL and brought it into Blender, but I built my underwater world with Z down, so only Z is inverted. If I run the animation Z-up it works, but in my case the ROV then moves through the sky.

Yes, but where is that data coming from? And what type of data is it? What coordinate system is it in, etc.?

Anyway, you can solve this yourself by working in a Z-up coordinate system and then inverting the Z coordinate of the transform, or by spending some time identifying the coordinate system your data is in and working out the transforms to move it into the Blender coordinate system.

Example of the data being read:

0.0000000e+00 0.0000000e+00 0.0000000e+00
-8.2547766e-07 2.1255807e-05 -7.3601156e-05
8.0801344e-07 7.1877283e-05 -2.7640549e-04
1.0035076e-05 1.3381316e-04 -5.8171909e-04
6.4424821e-05 2.3593138e-04 -1.4065172e-03
1.1271831e-04 2.5758911e-04 -1.8896511e-03

7.3655910e-02 1.2536462e-01 4.2374276e-01

But I already do this: I negate Z before calculating the orientations, and even so it comes out inverted.
Isn’t there some way to invert Z inside Blender itself?

I need some way to flip the axis of an object, or of the whole Blender Game scene, so that Z points down.

How about turning the camera upside down? Then you can have the simulation play out as normal…

It’s an idea; I would have to invert my whole world, a bit of work, but it is possible.
Can I reverse gravity on the Z axis (-9.8)?

Yes, you can set gravity in the world panel.
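You can also set it from a script at runtime; a minimal BGE sketch, assuming you want gravity pulling toward +Z:

import bge

# KX_Scene.gravity is writable; flip it so "down" is +Z.
scene = bge.logic.getCurrentScene()
scene.gravity = (0.0, 0.0, 9.8)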

I’m sorry, but you still haven’t explained: what is that data? Just position coordinates?
If so, how are you calculating the orientation?

Your orientation matrix is odd; it’s not just a rotation. Have you just flipped Z?

I suggest uploading a blend file with the data.

The BGE (and Blender) use a right-handed Cartesian coordinate system.

It seems your data is provided in a different coordinate system (left-handed). As agoose77 already suggested, a good way is to convert the input data into the coordinate system used by the BGE.

This shouldn’t be a big deal. The input data is provided in a Cartesian coordinate system as well. You should be able to create a transformation matrix that does all the conversion (left- to right-handed, rotation, scale, offset, etc.) with a single matrix multiplication. Then you multiply each input data set by the conversion matrix to get BGE coordinates.
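For example, a sketch of such a combined conversion matrix (the rotation, scale and offset factors here are placeholders; substitute the ones your data actually needs):

import mathutils

flip_z = mathutils.Matrix.Scale(-1.0, 4, (0.0, 0.0, 1.0))  # left- to right-handed
scale = mathutils.Matrix.Scale(1.0, 4)                     # unit conversion, if any
offset = mathutils.Matrix.Translation((0.0, 0.0, 0.0))     # datum shift, if any

conversion = offset * scale * flip_z       # one matrix, applied to every sample
point_bge = conversion * mathutils.Vector((1.0, 2.0, 3.0))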

Thanks for the help.
Do you know if Blender uses the column-major convention for matrices?

The BGE uses matrices. You can see them as 2D arrays; access is always via two indices -> f(x, y).

It does not matter whether you define a matrix column- or row-wise. This is more a question of readability.
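For reference, element access in mathutils is matrix[row][col]:

import mathutils

m = mathutils.Matrix.Identity(4)
m[0][3] = 5.0       # row 0, column 3: the X translation component
print(m[0][3])      # -> 5.0
print(m.col[3])     # the whole fourth column as a Vector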

The conversion matrix for your situation is


import mathutils

AXIS_Z = [0, 0, 1]

# Scale by -1 along the Z axis (as a 4x4 matrix).
conversionMatrix = mathutils.Matrix.Scale(-1, 4, AXIS_Z)

You can convert your input data into the BGE coordinate system (scene space) via:


worldPosition = mathutils.Vector(inputCoordinates) * conversionMatrix 

Here is a demo. It takes the vertices of the colored arrows as input. As your input uses a left-handed coordinate system, the converted positions should be upside down (mirrored at the X/Y plane).

To demonstrate this, a colored sphere is placed at each converted position.

With a 1:1 conversion the spheres would match the arrows.

The converted positions would only match the red and green parts (which are mirrored too, but this is not visible).

Attachments

ConvertInputCoordinates.blend (456 KB)