With my D3D wrapper I display the framerate using a Simple Moving Average (SMA) calculated over the last hundred frames. I'd be totally happy with it, if it weren't for a minor problem that I want to get rid of, but have no clue how to.
THE FIRST PROBLEM:
As long as a game performs consistently, producing its frames in pretty much the same timeframe and maybe dropping the occasional frame now and then, the SMA displays truthful and meaningful fps information. But when a game is inconsistent, producing frames whose build times can vary greatly, perhaps even causing frequent frame drops, the SMA starts displaying funny values that just tell lies.
A few examples will illustrate the problem better than my English can.
For brevity's sake, imagine a scenario in which the frame history for the SMA is 3 frames, and the timer counts 900 ticks in 1 second. Also, normally a frame composes in 50 ticks, but the game wants to limit itself to 3 frames per second, meaning that between composition and display, 300 ticks elapse from frame to frame.
In the perfect scenario, all 3 frames compose and display in 300 ticks each.
The Simple Moving Average (SMA) over the last 3 frames is going to be a perfect 3.0:
sma = (1 / 300) + (1 / 300) + (1 / 300)
sma = sma * 900_ticks_per_second
sma = sma / 3_frames_history
sma = 3.0 fps
In case a frame takes too long, the next frame may be dropped. Suppose that the 2nd frame takes 600 ticks, leaving no room for a 3rd frame, which gets dropped. What will the SMA say?
sma = (1 / 300) + (1 / 600) + <nothing: this frame dropped>
--( ^^ no! because a frame dropped, the actual 3-frames window will be: )--
sma = (1 / 300) + (1 / 300) + (1 / 600)
--( hence: )--
sma = sma * 900_ticks_per_second
sma = sma / 3_frames_history
sma = 2.5 fps
Now suppose 1 of the 3 frames composes in 500 ticks. It's late, but it's possible to compensate: normally a frame builds in 50 ticks, and in fact the next one is on screen just 100 ticks later. With all 3 frames within the 900-tick limit, the 3.0 fps target is respected. However, the SMA tells a different story:
sma = (1 / 300) + (1 / 500) + (1 / 100)
sma = sma * 900_ticks_per_second
sma = sma / 3_frames_history
sma = 4.6 fps
It gets worse if we continue from there. Suppose that the next 2 frames are regular and compose & display in 300 ticks each.
New SMA after 1st regular frame:
sma = (1 / 500) + (1 / 100) + (1 / 300)
sma = sma * 900_ticks_per_second
sma = sma / 3_frames_history
sma = 4.6 fps
New SMA after 2nd regular frame:
sma = (1 / 100) + (1 / 300) + (1 / 300)
sma = sma * 900_ticks_per_second
sma = sma / 3_frames_history
sma = 5.0 fps

See the problem?
Add another regular frame (300 ticks) and the SMA suddenly drops back to a perfect 3.0 fps.
In the presence of inconstant frame times it's annoying to see the SMA rise past the hard limit imposed by the very game. Not only does it give a bad impression, it may also mask the true loss in fps that occurs when a frame is willingly dropped. I tried different Moving Average formulae, taken from Wikipedia, but the problem persists. And in the end the Simple Moving Average is the most responsive to changes and the fastest to calculate.
THE SECOND PROBLEM:
If I'm allowed to briefly mention a commercial game, I find Far Cry 2 to be a good test platform for my wrapper. Its performance is very consistent on my hardware (which isn't next-gen anymore), giving me the opportunity to test the robustness of my fps display. Also, the game features both an fps display calculated with an SMA and a framerate limiter.
Now, when letting the game run unconstrained, both its fps display and my wrapper's fps display show the very same readings all the time (provided that I lower my frame history to 64 entries and round the final value to the 1st decimal digit).
What I haven't said is that my wrapper features a framerate limiter as well.
If I activate my wrapper's framerate limiter, both the wrapper's fps display and the game's fps display show the very same values. No surprise. This also happens when the occasional frame is dropped and the SMA jumps past the imposed limit because of it.
But let's invert the roles...
If I activate the game's framerate limiter, this time my wrapper's fps display reports a slightly greater fps average (say, 40.1 or 40.2 fps) than the game's fps display, which instead reports a perfect score (say, 40.0 fps). The surprise, however, is that when frames are dropped or are just late, the game's fps display will _drop_, while my wrapper's SMA will jump past the real value - as usual.
The situation changed when I forced the Maximum pre-rendered frames setting to 0 in the nVidia control panel. Now, when using the game's framerate limiter, the game's fps display dutifully declares a perfect fps amount (say, 40.0 fps), but not before showing massive initial imprecision - it looks like it adjusts itself dynamically. My wrapper's fps display instead calculates the same imperfect fps average as before (say, 42.6 fps). How can that be?
However much I observe the game to guess what it's doing - and how - I can come up with no explanation.
If anybody could shed some light I'd really appreciate it.
Thank you very much.