Game: Smoothen FPS based on dtime #16688
Conversation
This makes the FPS display in the debug information more pleasant to look at, especially at very low dtime values, e.g. when disabling vsync.
The "issue" is that we're averaging the client loop time (dtime), which is initially 0.0001 on my system. I could introduce a boolean to ignore the first iteration and use the subsequent dtime as the initial value.
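That first-iteration workaround could be sketched as an exponential moving average that is seeded by the first real sample instead of a hardcoded initial value. This is a hypothetical illustration, not the PR's actual code; `ALPHA` is an assumed smoothing factor.

```cpp
#include <cassert>
#include <cmath>

// Hypothetical exponential moving average of dtime.
// ALPHA is an assumed smoothing factor, not a value taken from the PR.
struct FpsCounter {
    static constexpr float ALPHA = 0.05f;
    float avg_dtime = -1.0f; // negative = no sample seen yet

    void sample(float dtime) {
        if (avg_dtime < 0.0f)
            avg_dtime = dtime; // seed with the first real sample
        else
            avg_dtime = ALPHA * dtime + (1.0f - ALPHA) * avg_dtime;
    }

    float fps() const {
        return avg_dtime > 0.0f ? 1.0f / avg_dtime : 0.0f;
    }
};
```

Seeding with the first real sample avoids the counter slowly climbing from a bogus 0.0001 s initial value (10000 FPS) toward the true frame time.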
As far as I know, the smoothness which the player perceives depends more on the recent maximum dtime value than the average dtime. This could be implemented with the following calculations:
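One way to realize a "recent maximum" is a decaying-maximum tracker: the stored value shrinks a little each frame and jumps up immediately on a slow frame. This is a sketch of the idea only — the names and the `DECAY` constant are assumptions, not taken from the PR or the comment above.

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>

// Hypothetical "recent maximum" dtime tracker. The displayed value follows
// the largest dtime seen recently and decays back toward the current dtime,
// so old spikes eventually fade out. DECAY is an assumed constant.
struct RecentMaxDtime {
    static constexpr float DECAY = 0.98f;
    float max_dtime = 0.0f;

    void sample(float dtime) {
        // Shrink the stored maximum slightly, then raise it immediately
        // if the new frame was slower than the decayed maximum.
        max_dtime = std::max(dtime, max_dtime * DECAY);
    }

    float worst_case_fps() const {
        return max_dtime > 0.0f ? 1.0f / max_dtime : 0.0f;
    }
};
```

With this, a single 50 ms stutter keeps the displayed FPS low for a while rather than being averaged away instantly.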
And if showing the recent minimum inverse dtime value is a bad approach too, would it nonetheless be better to average an exponentiated dtime value instead of the actual dtime, i.e. something like …
Perhaps. But with this change, high dtime values are weighted more heavily, so the FPS counter should reflect them better than before.
Why? Is there any research paper on that? What is the underlying mathematical reasoning for using exponents?
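One reading of the exponent idea: averaging squared dtimes and taking the square root (a root mean square) weights long frames more heavily than a plain arithmetic mean. The helpers below are an illustration of that effect, not code from the PR.

```cpp
#include <cassert>
#include <cmath>

// Plain mean vs. root-mean-square of a dtime window (illustrative only).
// Squaring before averaging weights long frames more heavily, which is
// one interpretation of the "exponentiated dtime" idea discussed above.
inline float mean_dtime(const float *dt, int n) {
    float sum = 0.0f;
    for (int i = 0; i < n; ++i)
        sum += dt[i];
    return sum / n;
}

inline float rms_dtime(const float *dt, int n) {
    float sum_sq = 0.0f;
    for (int i = 0; i < n; ++i)
        sum_sq += dt[i] * dt[i];
    return std::sqrt(sum_sq / n);
}
```

For nine 10 ms frames plus one 100 ms spike, the mean is 19 ms (about 53 FPS) while the RMS is about 33 ms (about 30 FPS), so the inverse of the RMS reacts much more to stutter than the inverse of the mean.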
Maybe we're better off just collecting the last 200 dtimes and doing trivial statistics on top, instead of something clever.
@sfan5 Unfortunately, collecting samples to display a median (or 99th percentile) also depends on the FPS. The filter needs special handling to output a reasonable value in time, anywhere in the range of 30 FPS to 1000 FPS. Thus, I doubt it gets any easier than the change I proposed.
I couldn't find much solid information about how to calculate an FPS value or something similar, so my comments are based only on my own assumptions and may be off-topic. If "frames per second" literally means the number of frames rendered divided by the measurement time, then the inverse of the maximum or median dtime should perhaps not be called FPS.

To do
This PR is Ready for Review.
How to test
- Set pause_fps_max = 10, open the pause menu and focus another window -> the FPS counter must go down in a reasonable time.
- Set vsync = false and play regularly -> the FPS counter goes up and shows a fairly readable average value, at the cost of increased jitter (perhaps calculating 1% lows would be more appropriate?).