Hi
Is the logic tic rate constant, even if the FPS changes dramatically?
E.g., can a Delay sensor be used as a reasonably accurate timer? Say, set the delay to 60 tics so something happens every second?
Or if something happens every time a (small) script runs, will it be consistent?
I know you can use the Timer property, but I don't want to have lots of them.
Cheers
I believe the logic tic rate is tied directly to the framerate: if you cap your FPS at 60 and it stays constant, you should be fine. But if you get lag, or your framerate varies between about 59.9 and 60.1 FPS as it often tends to, your timer may drift out of sync with real time. You can, however, access the system clock using (if I'm not mistaken) Python's time module. If you store a reference timestamp in a global variable, you can trigger whatever action you want every second by checking against it.
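Something like this minimal sketch, in plain Python (the class and method names are mine, not BGE API; it assumes you call it once per logic tic, e.g. from a script wired to an Always sensor with pulse mode on):

```python
import time

class SecondTicker:
    """Fires once per real second, no matter how often tick() is called.
    Illustrative sketch only; names are made up."""
    def __init__(self):
        self._last = time.monotonic()

    def tick(self):
        """Call once per logic tic; returns True when a real second has elapsed."""
        now = time.monotonic()
        if now - self._last >= 1.0:
            self._last += 1.0  # step by exactly 1 s so rounding error never accumulates
            return True
        return False
```

In the BGE you would keep a single instance somewhere persistent (e.g. in `bge.logic.globalDict`) and run your once-a-second action whenever `tick()` returns True, independent of the actual frame rate.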
I could be wrong, though; I'd recommend experimenting (run a scene that lags down to about 10 FPS and see if your timer stays in sync, run a light scene with the framerate capped and check again, etc.).
Ah yes, you're right, I was just running the same test.
Although in my test the "60-tic timer" eventually drifted from the proper Timer property by a second or so. Different framerates didn't affect that, though; it was consistent.
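That kind of drift is easy to reproduce outside the engine. Here is a small sketch (pure Python, all names mine, and the ±10% frame-time jitter is an assumed model, not measured BGE behaviour) that simulates a game loop and compares a tic-counting timer against accumulated real time:

```python
import random

def simulate_drift(seconds=60.0, nominal_fps=60):
    """Run a fake game loop where each frame takes 1/60 s plus/minus 10% jitter.
    Returns (tic_timer, real_time): what a '60 tics = 1 s' counter reports
    versus the wall-clock time that actually elapsed."""
    random.seed(0)  # deterministic, for illustration only
    tics = 0
    real_time = 0.0
    while real_time < seconds:
        # each simulated frame deviates a little from the nominal 1/60 s
        real_time += (1.0 / nominal_fps) * random.uniform(0.9, 1.1)
        tics += 1
    return tics / nominal_fps, real_time
```

The two values come out close but not equal, mirroring the small drift described above: counting tics only measures real time exactly if every tic takes exactly 1/60 s.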
Thanks guys, now I can happily use delay and frequencies.
There is no guarantee of the logic tick rate.
When the frame rate drops, the logic tick rate may drop as well, even when the engine tries to maintain it.
You should not test at 120 FPS. Take an old computer and run a complex scene so you have heavy rendering, heavy physics and heavy logic (all three factors).
If you need an accurate one second of real time, I advise using Timer properties or the Python timer as Captian Oblivion suggested.