Tracking object between scenes

Hi all

I am trying to make a simple HUD item (currently a cube) track the world position of an object in the ‘main’ scene of my game.

Currently I get the object's world position and use the x and y components to duplicate its movement in the HUD scene. However, when I try this the position is wrong.

My question: how do I make a HUD object's origin match exactly with that of an object in another scene?



But are you sure the POSITION is wrong?
If both scenes use the same coordinate system, that's not really possible.
I would ask: does your HUD camera move along with your main scene camera?
Or is it even in the same start position?
The coordinates might be right, but then your two perspectives are different.
Otherwise, maybe an example blend would help.

Here is the blend - it's based on an example that SolarLune posted a while ago. Just press P. The main scene camera is used as the orange cube jumps outside the camera's view.


tag.blend (391 KB)

Here’s an altered version. Note a couple of things.

  1. You have to view the main game scene from the camera view, since the default Blender 3D viewport usually won't have the same proportions as the game engine's camera. The proportions of the game view matter to the tag, of course, since the overlay scene is viewed from the overlay camera.

  2. For this method of putting a tag on an object, I would position the overlay camera so that the bottom-left corner of its view is point [0, 0], and then set the position with:

own.worldPosition.x = tpos[0] * 4.0
own.worldPosition.y = (1 - tpos[1]) * 2.1

This way, you don't have to do any extra calculations to offset the camera, since its position is in the middle of the world.

  3. I factored in the cube's linear velocity divided by the game's logic tick rate, in an attempt to shorten the lag between the cube's position and the tag's position. This helps display the tag where the cube should be this frame, rather than where it was last frame (assuming that Python runs in that order). However, the lag between the two is still noticeable when the cube starts moving, so you'll have to look into it a bit more if you want it to be unnoticeable.
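As a rough sketch of that compensation (assuming a default logic tick rate of 60; in the BGE the position and velocity would come from `cube.worldPosition` and `cube.getLinearVelocity()`, here they are plain tuples so it runs anywhere):

```python
# Predict where the cube should be this frame by advancing its last known
# position by one logic tick's worth of velocity. In the BGE, position and
# velocity would come from cube.worldPosition and cube.getLinearVelocity();
# here they are plain tuples so the sketch is self-contained.

def predict_position(pos, vel, tick_rate=60.0):
    """Return pos advanced by one logic tick (1 / tick_rate seconds)."""
    return tuple(p + v / tick_rate for p, v in zip(pos, vel))

# A cube at (1.0, 2.0) moving at 6 units/sec along x, at 60 ticks/sec:
print(predict_position((1.0, 2.0), (6.0, 0.0)))  # (1.1, 2.0)
```

The prediction is only as good as the assumption that the velocity stays constant over the tick, which is why sudden starts still show a visible lag.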

Example blend: tag.blend (488 KB)
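The coordinate mapping from point 2 can be sketched as a standalone function. In the BGE, `tpos` would come from `camera.getScreenPosition(cube)`, which returns normalized screen coordinates with [0, 0] at the top-left; the 4.0 and 2.1 scale factors are the ones from the post above and depend on the overlay camera's frustum:

```python
# Map a normalized screen position (origin at the top-left, as returned by
# camera.getScreenPosition in the BGE) into overlay-scene world coordinates,
# assuming the overlay camera is placed so the bottom-left corner of its view
# sits at [0, 0]. The 4.0 / 2.1 scale factors match the post; a different
# overlay camera setup would need different values.

def screen_to_overlay(tpos, scale_x=4.0, scale_y=2.1):
    x = tpos[0] * scale_x          # left-to-right maps directly
    y = (1 - tpos[1]) * scale_y    # flip y: screen y grows downward
    return x, y

print(screen_to_overlay((0.5, 0.5)))  # centre of the screen -> (2.0, 1.05)
```

In-game, the two returned values would be assigned to the tag's `worldPosition.x` and `worldPosition.y` each frame.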

EDIT: BGUI would help out here, since you wouldn't have to set up an overlay scene or an overlay camera - you would just grab the screen position of the object and pass that to the tag's image position, with a slight fix to make it work with OpenGL's drawing system.
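The "slight fix" is the y-axis flip: BGE screen coordinates put [0, 0] at the top-left, while OpenGL-based drawing works from the bottom-left. A hedged sketch of that conversion, with the window size passed in as a parameter rather than queried from `bge.render.getWindowWidth()` / `getWindowHeight()` so it runs outside the engine:

```python
# Convert a normalized BGE screen position (origin top-left, as returned by
# camera.getScreenPosition) into pixel coordinates with the origin at the
# bottom-left, which is what OpenGL-based drawing expects. In-game, width and
# height would come from bge.render.getWindowWidth()/getWindowHeight().

def to_gl_pixels(tpos, width, height):
    px = tpos[0] * width
    py = (1.0 - tpos[1]) * height  # flip y for OpenGL's bottom-left origin
    return px, py

print(to_gl_pixels((0.25, 0.0), 800, 600))  # top of screen -> (200.0, 600.0)
```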

Thanks SolarLune, your tweaks are perfect for what I want! I will have to look a bit more at BGUI, it does seem a powerful piece of code (ever thought about fitting a BGUI tutorial in the spare seconds you have not making Valchion?).