Render.DrawLine once for all

Hi,
I tried to render lines, but they seem to need to be rendered every frame! Is that correct?
I understand they are not meshes, so they quickly disappear.
What I want is for them to stay, as if they were meshes!
Can it be done?


Edit:
I guess it can’t! There’s no place to save their data; my guess is that they are screen-space objects…

You are correct. The call to render a line draws the line for that frame alone. As soon as the next frame draws, the line will be lost.

However, if you want to keep them around, just add an Empty to your scene with an Always sensor and a Python controller. That will give your script a chance to run each frame, and you can redraw your line at that point.
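For illustration, here is a minimal sketch of that setup, assuming an Empty with an Always sensor (true level triggering enabled) wired to a Python controller in Module mode pointing at draw_lines.main; the endpoints and colour are placeholders:

# draw_lines.py -- redraws the line every logic tick.
# Logic bricks: Always sensor (true pulse ON) -> Python controller (Module: draw_lines.main)
from bge import render

# Placeholder endpoints and colour; swap in your own data.
LINE_START = [0.0, 0.0, 0.0]
LINE_END = [0.0, 0.0, 5.0]
COLOR = [1.0, 0.0, 0.0]  # red

def main(cont):
    # drawLine only lasts for the frame it is issued in,
    # so it has to be called again every tick.
    render.drawLine(LINE_START, LINE_END, COLOR)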

You can have a mesh with two vertices,
add it with an Add Object actuator, and then manipulate the verts.
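A rough sketch of how the two-vertex approach could look, assuming the line object has been added and runs this from its own Python controller; the endpoints are placeholders, and note that vertex coordinates live in the mesh object's local space:

# Stretch a two-vertex mesh between two world-space points by moving its verts.
# Note: this edits the mesh data itself, so any instances sharing the mesh change too.
from bge import logic
import mathutils

def update_line(cont):
    own = cont.owner          # the object that uses the two-vertex mesh
    mesh = own.meshes[0]

    # Placeholder world-space endpoints; replace with your own points/objects.
    start_world = mathutils.Vector((0.0, 0.0, 0.0))
    end_world = mathutils.Vector((0.0, 0.0, 5.0))

    # Vertex positions are stored in local space, so convert the world points first.
    inv = own.worldTransform.inverted()
    mesh.getVertex(0, 0).XYZ = inv * start_world
    mesh.getVertex(0, 1).XYZ = inv * end_world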

If you were not using many of them, I would suggest using an armature
with a Stretch To constraint set to 1.0.

That way the armature is one point in space and its Empty target is the other end.
This lets you parent the Empty or the armature to moving objects.

Wire materials work in the game engine?

Create a draw callback, which will then run every frame without requiring a Python controller.


Can you explain that a little more? I don’t know the term “callback”. Do you mean something like a function manager run from a separate object?

A callback is an object that is passed to another system as a means of communication. Like sending a mobile phone to someone in the post. It is usually a function (though it could be a class instance) which is executed when an event occurs.
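In BGE terms, a minimal sketch of a draw callback looks like this: register it once (for example from a Python controller fired a single time at startup) and the engine then calls it every frame before rendering, with no further logic bricks involved; the line data is a placeholder:

# register_draw.py -- run register() ONCE, e.g. Always sensor with pulse OFF.
from bge import logic, render

def draw_my_lines():
    # The engine calls this automatically every frame, just before rendering.
    render.drawLine([0.0, 0.0, 0.0], [0.0, 0.0, 5.0], [0.0, 1.0, 0.0])

def register(cont):
    scene = logic.getCurrentScene()
    # pre_draw is a list of callbacks the scene invokes each frame.
    if draw_my_lines not in scene.pre_draw:
        scene.pre_draw.append(draw_my_lines)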


It’s better to create OpenGL primitive shapes directly, as drawLine may issue a series of draw calls internally.
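If you go the raw OpenGL route, a sketch like the following (using the bgl module from a post_draw callback) batches many segments into a single immediate-mode block; the segment list is just a placeholder:

# Draw many line segments in one OpenGL batch from a post_draw callback.
from bge import logic
import bgl

# Placeholder list of (start, end) world-space segments.
SEGMENTS = [((0, 0, 0), (0, 0, 5)),
            ((1, 0, 0), (1, 0, 5))]

def draw_segments():
    bgl.glColor3f(1.0, 1.0, 0.0)
    bgl.glBegin(bgl.GL_LINES)
    for start, end in SEGMENTS:
        bgl.glVertex3f(*start)
        bgl.glVertex3f(*end)
    bgl.glEnd()

def register(cont):
    # Run once; afterwards the engine calls draw_segments every frame.
    scene = logic.getCurrentScene()
    if draw_segments not in scene.post_draw:
        scene.post_draw.append(draw_segments)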


You must be aware that my level of programming isn’t that advanced yet; otherwise, all the spacey objects like stars, nebulae and dust would already be OpenGL objects.
For now it works fine: I managed to render 5,000 stars around a nebula at an acceptable frame rate.



If you have any examples that can quickly generate shapes at will, be they dots or halos, please point me to them while I do my own searching!

So how is it done in Blender? I’ve been working on something that sounds like that.
I have a manager object which runs a script in pulse mode every tic, or every couple of tics. I can put instructions, like playing a sound or adding an object, into a GameLogic dict, and the manager will pick them up on the next tic and execute them. That means only one object in the game runs at a high logic tic rate, while all the others can send requests to the manager for things like spawning smoke at their position for 100 tics with a single execution, or playing a sound even after they have died.

Is it the same thing, or are you talking about something built into Python that I can use?
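For comparison, here is a stripped-down sketch of the manager pattern you describe, assuming a single manager object running manager.process from an Always sensor with true pulse; the request names and handlers are made up for illustration:

# manager.py -- one object polls a shared request list each tick and executes it.
# Other objects queue work with, for example:
#   logic.globalDict.setdefault("requests", []).append(("add_object", "Smoke", pos))
from bge import logic

def process(cont):
    scene = logic.getCurrentScene()
    requests = logic.globalDict.setdefault("requests", [])

    for request in requests:
        kind = request[0]
        if kind == "add_object":
            _, name, position = request
            obj = scene.addObject(name, cont.owner, 100)  # lives for 100 ticks
            obj.worldPosition = position
        # ... handle other request kinds (play_sound, etc.) here ...

    requests.clear()  # everything handled; empty the queue for the next tick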

Generally stuff like this is handled by a simple skybox:


If you’re basing the background on actual distant objects that need to move around as the player moves, you could try periodically rendering the background to a texture and use that to update the skybox.

Also, I don’t know if you’ll find this helpful or not, but you may be interested in this. It’s where I found that skybox texture.
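One way the periodic render-to-texture idea could be sketched with bge.texture, assuming a skybox object named 'Skybox' whose material carries a texture, and a separate camera named 'BackgroundCam' aimed at the distant objects; the names and refresh policy are placeholders:

# background_capture.py -- re-render the distant scene into the skybox texture.
from bge import logic, texture

def refresh_skybox(cont):
    scene = logic.getCurrentScene()
    skybox = scene.objects["Skybox"]          # assumed object name
    bg_cam = scene.objects["BackgroundCam"]   # assumed camera name

    # Create the dynamic texture once and keep a reference so it is not freed.
    if "skybox_tex" not in logic.globalDict:
        tex = texture.Texture(skybox, 0)      # first material/texture slot
        tex.source = texture.ImageRender(scene, bg_cam)
        logic.globalDict["skybox_tex"] = tex

    # Re-render the background; trigger this only every so often, not every frame.
    logic.globalDict["skybox_tex"].refresh(True)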

I guess you could look at procedurally generated skyboxes. Otherwise it might look boring to have the same skybox in every system.

When I made a space sim I used two starfield textures on my skybox, one low-res and one high-res, and mixed them with a noise texture. The small and big stars mixed together pleasingly so as not to appear tiled. You could use vertex colors for the mixing, so that it can be driven by dynamically, procedurally generated noise. You could also use multiple textures, multiple skyboxes, or other tricks to make each star system different.

http://alexcpeterson.com/spacescape

This is what I used for one of my old projects; an amazing piece of software. Worth following the tutorials.

Thank you for the suggestions, guys!
I could use skyboxes at the beginning of the game, before jump gates or jump drives are available. Otherwise, nebulae will be active objects that can influence space travel, and they are volumetric too, to enhance immersion. For me it is easier to have them in 3D than to re-render them every time a ship moves around them.
I still think it's a good idea, and I might have to use it. So far I'm enjoying them in 3D, but rendering them could let me make them look better without the rendering/scenegraph cost!

Will test it out!

What about LOD that has an empty mesh, then an alpha sprite,
and then a high-res sprite as you get closer?
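A hedged sketch of that kind of manual LOD, assuming each star object runs something like this from an Always (true pulse) controller and that meshes named 'StarEmpty', 'StarSprite' and 'StarHiRes' exist in the blend file; the names and distances are placeholders:

# lod_swap.py -- crude distance-based mesh swap per object.
from bge import logic

# Placeholder mesh names and switch distances.
LOD_LEVELS = [(50.0, "StarHiRes"),          # closer than 50: high-res sprite
              (500.0, "StarSprite"),        # closer than 500: cheap alpha sprite
              (float("inf"), "StarEmpty")]  # otherwise: empty placeholder mesh

def update(cont):
    own = cont.owner
    cam = logic.getCurrentScene().active_camera
    dist = own.getDistanceTo(cam)

    for max_dist, mesh_name in LOD_LEVELS:
        if dist < max_dist:
            if own.get("current_lod") != mesh_name:
                own.replaceMesh(mesh_name)
                own["current_lod"] = mesh_name
            break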

LOD is obviously needed! But the others have a point! Consider this:

In my space scene I have about 8,000 planes and 3 textures (when no planet is rendered) to display galaxy dust, stars and eventually a nebula. With a skybox all of these can be skipped, and I can use their memory and rasterizer budget to add asteroids, clouds, dust and other planetary props the player interacts with.
My tests show that the BGE can handle rendering all of them, background and foreground objects alike. But I have never run the entire system together, that is, with networked ships, animated avatars and NPCs, buildings displaying different textures, plus gravity!

Reducing the stress the stars put on the scenegraph is a great step for me. Adding a skybox could help, since I could generate the galaxy, capture it from the player’s POV, place the texture on the skybox, and then delete the sprites!

@blueprint:
The scale of Torakunsama’s system means that simple solutions like that are not efficient enough. Even Kupoman’s LOD in recent builds probably isn’t fast enough.