I’m working on a realtime environment in which I have one camera that I move around. In front of that camera is something like a 3D cursor, which is parented to the camera but can also move independently of it. So if I move the camera, the cursor moves too; with other keys, the cursor moves independently of the camera. So far this all works.
If I start the realtime environment and just move the cursor, it behaves exactly as I defined the movement in logic bricks. But if I move the camera around a bit, always keeping the cursor in front of it, and then start moving the cursor again, it no longer behaves as expected.
If the camera hasn’t been moved and I move the cursor left or right, it goes exactly left or right.

If the camera has been moved or rotated, the cursor’s movement no longer goes exactly left or right. It seems to depend on the movement or orientation of the camera how the cursor actually moves.
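My rough understanding of why the camera’s orientation could matter (just a sketch, I may be wrong): a “local” movement gets rotated by the moving object’s current world orientation before it is applied, so once the camera (and with it the cursor empty) has rotated, local X no longer lines up with world X:

```python
# Sketch of how a local movement becomes a world movement (my understanding,
# Blender 2.7x mathutils):  d_world = R_world * d_local,
# where R_world is the moving object's world orientation matrix.
from mathutils import Vector, Matrix

d_local = Vector((1.0, 0.0, 0.0))        # "move right" along local X
R_world = Matrix.Rotation(0.5, 3, 'Z')   # camera rotated 0.5 rad around Z
d_world = R_world * d_local              # actual direction in world space
print(d_world)  # no longer (1, 0, 0) once the parent has rotated
```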
The camera is parented to an empty that holds the logic for camera movement. The cursor is parented to another empty that holds the logic for cursor movement. The cursor empty is parented to the camera empty. All movement is based on local axes. I have already checked that the axes of all objects have the same orientation, but they differ from Blender’s global coordinate axes. A rough Python equivalent of my logic-brick setup is sketched below.
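In case it helps, this is roughly what my cursor-movement logic bricks would look like written as a Python controller on the cursor empty (BGE 2.7x API; the key bindings and speed value are just placeholders):

```python
import bge

def move_cursor(cont):
    """Rough Python equivalent of my cursor-movement logic bricks."""
    own = cont.owner                      # the cursor empty
    keyboard = bge.logic.keyboard
    active = bge.logic.KX_INPUT_ACTIVE
    speed = 0.1                           # placeholder speed

    # Second argument True = move along the object's LOCAL axes,
    # which is how my logic bricks are configured.
    if keyboard.events[bge.events.LEFTARROWKEY] == active:
        own.applyMovement((-speed, 0.0, 0.0), True)
    if keyboard.events[bge.events.RIGHTARROWKEY] == active:
        own.applyMovement((speed, 0.0, 0.0), True)
```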
My further logic has to be based on the cursor, so it is important that it behaves as needed.
Thank you for any help.