Good afternoon everyone. I have an animation in PyOpenGL that reads coordinates from a file and animates an ROV (remotely operated vehicle).
The animation computes the ROV's homogeneous matrix, containing its position and orientation, at each instant of the animation (for example, 10 seconds of animation correspond to 2000 lines of "X Y Z" coordinates).
I'm bringing this simulation into Blender to build a more realistic world for the animation.
But I’m having problems.
I have a CAD object of the ROV, and when I feed it this homogeneous matrix at every instant of the animation, the position of the ROV is correct, but its orientation comes out wrong inside Blender.
It seems that phi, theta and psi are reversed: the orientation should go to the left, but it goes to the right. How can I check this and correct the orientation of the object, which is being driven by the homogeneous matrix?
Would anyone help me understand why this occurs?
Does Blender work with homogeneous matrices differently from OpenGL?
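One quick way to check whether the orientation part of the matrix is mirrored rather than rotated: a proper rotation matrix has determinant +1, while a handedness flip (a mirror) has determinant -1. A minimal sketch using plain row-major 4x4 lists (the layout is an assumption; in Blender a mathutils.Matrix would work the same way):

```python
# Hedged sketch: test the 3x3 rotation block of a homogeneous matrix.
# det = +1 -> proper rotation; det = -1 -> mirrored (handedness flip).

def det3(m):
    """Determinant of the upper-left 3x3 block of a row-major 4x4 matrix."""
    a, b, c = m[0][:3]
    d, e, f = m[1][:3]
    g, h, i = m[2][:3]
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

# identity rotation -> +1 (a valid orientation)
R_ok = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]
# Z axis flipped -> -1 (a mirror, which Blender will not treat as a rotation)
R_flip = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, -1, 0], [0, 0, 0, 1]]
```

If the matrices coming from the simulation give -1 here, the data is in a left-handed convention and needs converting, not just a sign flip on the position.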
Haven’t got time to look properly at the moment, but it’s probably something simple like row major vs col major differences. Can’t recall which one OpenGL is.
I found out what's wrong: Blender is Z-up, but I need Z to point down.
I tried enabling the "Inverted Z depth" option in the object's properties, but I did not succeed; nothing changed.
I tried exporting my CAD object as FBX with the Up axis set to "-Z Up" and also "Z Up", and importing it into Blender, but the animation didn't change either. It was still inverted.
How do I make Blender use Z = down, or otherwise invert the Z axis?
Smoking_mirror, imagine the ROV is in the sea. Here in Brazil, city elevation is referenced to sea level, so an ROV navigating the bottom of the ocean is below the "natural" level. That's why I work with negative Z: if the ROV dives deeper it moves in the negative direction, and if it rises it moves in the positive direction.
Ah! So this is the reverse of the usual situation?
Usually people model and animate in blender and then export to another system.
Here you have another system and you’re importing the data to blender?
If you have the animation data as a file you could write a python script to go through and convert it.
Invert Z depth is a rendering option; it affects the z-depth buffer (how far an object is from the camera, what renders in front and behind, etc.). It won't help here.
BTW, is this for an interactive simulation (using the Blender game engine) or a rendered animation (using Blender Render or Cycles)?
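A hedged sketch of the conversion script suggested above: read the "X Y Z" coordinate lines and negate Z, so the Z-down data from the simulation becomes Z-up for Blender. The whitespace-separated file layout is an assumption based on the thread, and note this only converts positions, not orientations:

```python
# Sketch (assumed format): each line of the input file is "X Y Z".
# Negating Z maps the simulation's Z-down frame onto Blender's Z-up frame.

def convert_line(line):
    """Turn one 'X Y Z' line into a Z-up (x, y, -z) tuple."""
    x, y, z = (float(v) for v in line.split())
    return (x, y, -z)

def convert_lines(lines):
    """Convert an iterable of coordinate lines, skipping blank lines."""
    return [convert_line(ln) for ln in lines if ln.strip()]

sample = ["0.0 0.0 10.0", "1.0 2.0 12.5"]
converted = convert_lines(sample)
# -> [(0.0, 0.0, -10.0), (1.0, 2.0, -12.5)]
```

In practice you would wrap this around `open(path)` and write the converted lines back out before importing into Blender.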
Exactly, Smoking_mirror. I have a simulation done in MATLAB, and I import the "X Y Z" coordinate file via Python. I already negate the Z values, but it still looks like the image in the main post: inverting Z gives me the correct position of the object, but not the orientation. The orientation behaves as if it were still Z-up.
I’m not really sure of the actual data you have, and then what you’re doing with it.
Perhaps all axes are inverted, in which case a simple inverse of the orientation matrix would suffice. See the decompose function (Matrix.decompose) to split the 4x4 matrix into scale, orientation and translation.
Otherwise, you can deconstruct the orientation matrix into its axis vectors and rebuild those in the appropriate format (this is the same as just transforming your matrix, but done manually).
More information would help.
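In Blender that decomposition is one call on a mathutils.Matrix, e.g. `loc, rot, scale = mat.decompose()`. For the manual route described above, here is a plain-Python sketch that pulls the translation and the three axis vectors out of a row-major 4x4 (an illustration, not anyone's actual code from the thread):

```python
# Sketch (assumed row-major layout): in a row-major homogeneous matrix,
# the translation is the last column and the rotation's axis vectors
# are the first three columns of the 3x3 block.

def split_matrix(m):
    """Return (translation, (x_axis, y_axis, z_axis)) of a row-major 4x4."""
    translation = (m[0][3], m[1][3], m[2][3])
    x_axis = (m[0][0], m[1][0], m[2][0])  # first column of the 3x3 block
    y_axis = (m[0][1], m[1][1], m[2][1])  # second column
    z_axis = (m[0][2], m[1][2], m[2][2])  # third column
    return translation, (x_axis, y_axis, z_axis)

m = [[1, 0, 0, 5],
     [0, 1, 0, 6],
     [0, 0, 1, 7],
     [0, 0, 0, 1]]
loc, axes = split_matrix(m)  # loc == (5, 6, 7), axes are the identity basis
```

Once you have the axis vectors you can negate or swap them to match the target convention and rebuild the matrix column by column.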
Off the top of my head, if you've just defined a new coordinate system with Z pointing down, then you have one of two possible scenarios:
1. Only Z is negated relative to the typical XYZ. In this case, you'd just want to negate the Z orientation component.
2. Z and one of X or Y are negated, which requires a rotation of 180° about the unmodified axis.
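One concrete way to handle the first case is to conjugate the homogeneous matrix with the mirror S = diag(1, 1, -1, 1), i.e. M' = S·M·S. That flips the Z row and the Z column (the Z translation changes sign, and the rotation stays a valid rotation because the two sign flips cancel in the determinant). A sketch with plain row-major 4x4 lists (in Blender you'd do the same with mathutils.Matrix):

```python
# Sketch: map a Z-down transform into the equivalent Z-up transform
# by conjugating with S = diag(1, 1, -1, 1), i.e. M' = S @ M @ S.

def flip_z(m):
    """Negate the Z row and Z column of a row-major 4x4 matrix."""
    out = [row[:] for row in m]
    for c in range(4):
        out[2][c] = -out[2][c]   # S on the left negates row 2
    for r in range(4):
        out[r][2] = -out[r][2]   # S on the right negates column 2
    return out                   # element [2][2] is negated twice: unchanged

m = [[1, 0, 0, 1],
     [0, 1, 0, 2],
     [0, 0, 1, 3],
     [0, 0, 0, 1]]               # identity rotation, translation (1, 2, 3)
m_up = flip_z(m)                 # translation becomes (1, 2, -3)
```

The second case would instead compose M with a 180° rotation about the unmodified axis (e.g. pre-multiply by Rx(180°) = diag(1, -1, -1)); which case applies depends on which axes your data actually negates.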
Thanks for the help, agoose77. I got the data from OpenGL and brought it into Blender, but I built my underwater world with Z down, so only Z is inverted. If I run the animation with Z-up it works, but then the ROV moves in the sky.
Yes, but where is that data coming from? What type of data is it? What coordinate system (CS) is it in, etc.?
Anyway, you can solve this yourself by working in a Z-up CS and then inverting the Z coordinate of the transform component, or by spending some time identifying the CS your data is in and working out the transforms to move it into Blender's CS.
But I already do this: I invert Z before calculating the orientations, yet it still comes out inverted.
Is there not some way to invert Z in Blender itself?
I need some way to change the axes of an object, or of the whole Blender Game scene, so that Z points down.
It seems your data is provided in a different (left-handed) coordinate system. As agoose77 already suggested, a good approach is to convert the input data into the coordinate system used by the BGE.
This shouldn't be a big deal. The input data is in a Cartesian coordinate system as well. You should be able to create one transformation matrix that does the whole conversion (left-to-right handedness, rotation, scale, offset, etc.). Then you multiply each input data set by that conversion matrix to get BGE coordinates.
Here is a demo. It takes the vertices of the colored arrows as input. As your input uses a left-handed coordinate system, the converted positions should be upside down (mirrored at the X/Y plane).
To demonstrate this, a colored sphere is placed at each converted position.
With a 1:1 conversion the spheres would match the arrows.
With the mirror applied, the converted positions only match the red and green parts (which are mirrored too, but this is not visible).