Restarting the game causes a decrease in performance

In my game I have logic bricks attached to a button to restart the game after the player loses. I noticed that the frame rate drops when I restart the game. It is as though the memory from the previous play has not been refreshed. Is there something I can do with Python or logic bricks to restart the game and refresh the memory so that I don’t lose any performance? Thanks.

I can’t say I’ve ever experienced this before. Could you provide a blend file that has this problem? If the file you’re working with is fairly complicated, I would suggest systematically removing objects and logic until the file is stripped down to a bare minimum while still demonstrating this behavior. This will make it much easier to identify the cause of the problem.

Hi Mobious, I have decided to try loading the scenes from separate blend files; however, now I am getting very strange results. I am receiving errors that are not even in current versions of my code. I made the separate files from copies of the original file. Is it possible that the old code is still stored, even though I unlinked it, and is being referenced instead of the new code? The errors I am getting say there is an error in a particular method on a line where that method no longer even exists, since the method’s line range has shifted due to my editing.

I cannot think of a simpler concept for a game. If I can’t get this to work I don’t see much future for me and the BGE (just kidding). If you have a chance, maybe you can help me address certain issues such as:

  1. The game not functioning when you return to level1 after losing

  2. Severe frame rate drop (at about score = 50)

  3. The exported runtime crashing when you click the ‘start’ button

Project folder: I removed this link :slight_smile:

Your performance problems may be due to V-Sync; I have found V-Sync to be broken in the past. With V-Sync on I get 37 FPS in your game; with it off I get a minimum of 60. It’s actually a fairly useless setting here because the BGE’s logic is tied to its renderer, so it stays at 60 FPS anyway.
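For what it’s worth, V-Sync can also be toggled from Python at runtime; a minimal sketch, assuming the VSync functions exposed by the bge.render module in recent 2.7x builds (the controller setup here is just for illustration):

    from bge import render

    def disable_vsync(cont):
        # Switch VSync off once at game start; the available modes are
        # VSYNC_OFF, VSYNC_ON and VSYNC_ADAPTIVE.
        if render.getVsync() != render.VSYNC_OFF:
            render.setVsync(render.VSYNC_OFF)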

It doesn’t work the second time because of the globalDict value “score” (since you separated the scenes).

Thanks, MrPutuLips. I disabled V-Sync for all files and found that I had some code lingering in the intro file. Now the game works correctly in both the embedded player and the standalone player, though still with severe lag after a while. But when I export the game, it still crashes when you press ‘start’.

After activating ‘Framerate and Profile’ I noticed that when the performance suffers, ‘Rasterizer’ under ‘Profile’ shoots up to around 90-100%. I don’t see how this could happen; I keep objects like enemies and bullets in lists and remove them if they die or are no longer seen.
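For comparison, the kind of cleanup described above might look roughly like this; a minimal sketch, assuming hypothetical ‘enemies’ and ‘bullets’ lists kept as properties on the managing object and a ‘dead’ property set elsewhere by the game logic (all placeholder names, not the actual code):

    def prune(cont):
        # Drop freed objects from the tracking lists and end anything that
        # has died or is no longer visible.
        own = cont.owner
        for key in ('enemies', 'bullets'):
            survivors = []
            for obj in own.get(key, []):
                if obj.invalid:              # already ended/freed, just drop it
                    continue
                if obj.get('dead', False) or not obj.visible:
                    obj.endObject()          # remove it from the scene
                else:
                    survivors.append(obj)
            own[key] = survivors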

Do you mean in the one you posted in this thread?
levels\intro\intro_v01.blend, Line 174, GameScene.py:

    numScore = logic.globalDict['score']
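If that line is what throws (e.g. a KeyError because ‘score’ was never set when the game scene starts on its own), a minimal, hedged sketch of a defensive fix is to give the key a default; the fallback value of 0 here is just an assumption:

    from bge import logic

    # Read the score with a fallback so the line cannot raise a KeyError the
    # first time the scene runs on its own (or after the scenes were split up).
    numScore = logic.globalDict.get('score', 0)

    # ...and have whichever scene runs first initialise it exactly once:
    logic.globalDict.setdefault('score', 0)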

What? BGE’s logic is tied to the renderer, but you can still want Vsync on or off.

What I mean is that VSync isn’t doing anything when the FPS is locked to 60 already (VSync limits the number of frames rendered per second to the monitor’s refresh rate). If it worked, it would set itself to 75 FPS on my monitor, because that is my refresh rate, but it doesn’t. Even with an old 50 Hz monitor, it still locks to 60 (the logic ticRate).

It is, however, relevant if you have “Use Frame Rate” turned off (that option locks the framerate to the logic ticRate).

EDIT: Adaptive Vsync just sets the framerate to 30 or 60, depending on how fast your system can render. So if you’re getting 55fps, it might set it to 30, which seems like an irrelevant feature, and it doesn’t turn VSync off to allow fps above the ticRate.

  1. VSync isn’t there to lock the FPS down or anything like that; it’s there to prevent screen tearing by waiting for the screen to refresh before drawing the next frame of the game.

  2. What you described isn’t Adaptive VSync, but rather normal VSync. When the framerate can’t hit the target (60), the game waits for the next refresh of your screen to continue, effectively halving the framerate (to 30). The link above lists Adaptive VSync as an option NVIDIA offers that varies this behaviour dynamically to handle fluctuations while still preventing screen tearing. The Wikipedia article doesn’t mention anything about ATI cards, so it may not be available to them.

Screen tearing is caused either by runt frames or by 2 (or more) frames entering the same frame buffer. Limiting the framerate to 60 therefore allows only 60 frames per 60 refreshes (1 frame per buffer), which is a hard cap on the framerate. Dropping the framerate to 30 is meant to “smooth” the framerate at half speed, giving 2 refreshes per frame, which still has no tearing because there is only 1 frame per buffer (in my opinion a terrible idea, because by that logic it could keep halving, not just stop at 30 FPS, though that would create stutter :/).

NVIDIA’s Adaptive VSync turns VSync on if the framerate is above 60, turns it off if it is below 60, and drops to 30 if the framerate is around 30 FPS. Blender’s Adaptive VSync only has 2 of the 3 (60 or 30), meaning that VSync is basically always on if your logic ticRate is set to 60. That only applies if “Use Frame Rate” is enabled; if it isn’t, then it makes perfect sense.

What you are referring to is G-Sync, which is something entirely different. What it does is match the monitor’s refreshes per second to what the graphics card is actually producing (up to 144 Hz), so there is 1 frame per refresh at any frame rate up to 144 Hz.

My reasoning for saying VSync is useless in this situation is that the logic is calculated relative to the framerate. So if one player has 60 FPS and another has 30 FPS, the player at 60 FPS can do everything twice as fast (I am not talking about reaction times), because their effective tic rate is doubled. That’s why nearly every game engine has an independent logic rate.
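To illustrate that last point, one common workaround is to scale movement by real elapsed time instead of per logic tick; a minimal sketch, assuming a Python controller on the object and a placeholder ‘speed’ property:

    import time

    def move(cont):
        # Scale motion by elapsed wall-clock time so the distance covered per
        # second is the same whether the game runs at 30 or 60 FPS.
        own = cont.owner
        now = time.perf_counter()
        dt = now - own.get('last_time', now)
        own['last_time'] = now

        speed = own.get('speed', 5.0)        # units per second (placeholder)
        own.applyMovement((speed * dt, 0.0, 0.0), True)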

I played around with your files for a bit and got a consistent 24-30 FPS. I didn’t see the accumulating lag that happens when the game is restarted. I have a GeForce 9800 GT; it is a couple of years old now, but it still has no problems with games like “Portal 2”, so I would expect it to handle your scene fairly easily.

The biggest performance drain on my system was the number of lights. You currently have 5 lights in the scene, and when I deleted 2 (leaving 3 total) my FPS went up to a steady 60. In GLSL mode, each additional light adds linearly to the amount of work the fragment shaders have to do, i.e. 5 lights take roughly 5 times longer to draw than 1 light.
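If you want to check the light cost on your end, a quick diagnostic sketch that lists the lamps in the running scene from a Python controller (assuming GLSL mode):

    from bge import logic

    def report_lights(cont):
        # Print every lamp the fragment shaders are paying for in this scene.
        scene = logic.getCurrentScene()
        for lamp in scene.lights:
            print(lamp.name, 'energy:', lamp.energy)
        # Setting lamp.energy = 0.0 effectively switches one off for testing.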

Other things I noted off the bat: a few of your objects have modifiers on them, and the BGE seems to work best when modifiers are applied to the mesh beforehand. Also, your ground uses the same texture for both color and normal mapping, which seemed a little odd.

Thanks, guys. I deleted all my scripts and logic bricks and rebuilt them, and I joined all the scenes back into one file. Whatever was causing the Rasterizer profile to shoot up, it’s not doing it anymore. The only problem I was still having was that when I die and try to return to the GameScene from the IntroScene, I get an error:

    SystemError: Blender Game Engine data has been freed, cannot use this python variable.

for some global variables.

I solved this by having no global variables. Thanks.
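For anyone who hits the same error: module-level (global) variables survive a restart of the scene, so any KX_GameObject stored in one ends up pointing at freed engine data the second time through. A minimal sketch of the safer pattern, keeping only plain data globally and looking object references up from the live scene each tick (the ‘Player’ name is a placeholder):

    from bge import logic

    # Fine to keep around globally / in globalDict: plain Python data.
    logic.globalDict.setdefault('score', 0)

    def update(cont):
        # Not fine: caching a KX_GameObject at module level, e.g.
        #   player = scene.objects['Player']   # dies with the old scene
        # Look the object up from the current (live) scene each tick instead.
        scene = logic.getCurrentScene()
        player = scene.objects.get('Player')   # placeholder object name
        if player is not None and not player.invalid:
            player['score_display'] = logic.globalDict['score']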