Cross-Platform Accurate Timers

zenogais 101 May 08, 2004 at 02:54

The following code is a cross-platform, accurate timer. The timer can also calculate FPS if you so desire. On Windows you can use either SDL_GetTicks() or QueryPerformanceCounter() as the time source; on other platforms SDL_GetTicks() is used, so SDL is required in either case.

Timer.h

#ifndef TIMERS_H
#define TIMERS_H

#if defined(_MSC_VER) || defined(__BORLANDC__)
#define u64 unsigned __int64
#define WINDOWS
#elif defined(__linux__) || defined(__APPLE__)
#define u64 unsigned long long
#define LINUX
#elif defined(__MINGW32__)
#define u64 unsigned long long
#define WINDOWS
#endif

class CTimer {
public:
    CTimer();
    CTimer(unsigned int type);
    CTimer(const CTimer& oldTimer);

    ~CTimer() { }

    void SetTimerType(unsigned int newType);
    unsigned int GetTimerType();

    void SetUpdateInterval(float newInterval);
    float GetUpdateInterval();

    void Reset();

    float GetFPS();
    float GetTime();
private:
    void InitTimer();

    float timeAtStart, lastUpdate, updateInterval, fps;
    u64  ticksPerSecond;
    unsigned int timerType, frames;
};

enum Timers {
    Z_TIMER_SDL,    //Uses SDL_GetTicks() for time
    Z_TIMER_QPC     //Uses QueryPerformanceCounter() for time (Windows Only)
};

#endif

Timer.cpp

#include "Timer.h"
#include <SDL.h>
#ifdef WINDOWS
#include <windows.h>
#endif

CTimer::CTimer() : timerType(Z_TIMER_SDL) {
    //We Default To SDL Timers, Then Reset Timers
    Reset();
}

CTimer::CTimer(unsigned int type) {
#if defined (LINUX)
    timerType = Z_TIMER_SDL;
#elif defined (WINDOWS)
    timerType = type;
#endif
    Reset();
}

CTimer::CTimer(const CTimer& oldTimer) {
    //Copy Old Data Into This Class
    timerType      = oldTimer.timerType;
    timeAtStart    = oldTimer.timeAtStart;
    lastUpdate     = oldTimer.lastUpdate;
    updateInterval = oldTimer.updateInterval;
    fps            = oldTimer.fps;
    frames         = oldTimer.frames;
    ticksPerSecond = oldTimer.ticksPerSecond;

    //Do Timer Initialization
    InitTimer();
}

void CTimer::SetTimerType(unsigned int newType) {
#if defined (WINDOWS)
    //Fall back to the SDL timer if an unknown type is requested
    if(newType != Z_TIMER_SDL && newType != Z_TIMER_QPC)
        newType = Z_TIMER_SDL;
    timerType = newType;
#elif defined (LINUX)
    timerType = Z_TIMER_SDL;
#endif

    //Initialize The Timer
    InitTimer();
}

unsigned int CTimer::GetTimerType() {
    return (timerType);
}

void CTimer::SetUpdateInterval(float newInterval) {
    updateInterval = newInterval;
}

float CTimer::GetUpdateInterval() {
    return updateInterval;
}

float CTimer::GetFPS() {
    frames++;
    float currentUpdate = GetTime();

    if(currentUpdate - lastUpdate > updateInterval) {
        fps = frames / (currentUpdate - lastUpdate);
        lastUpdate = currentUpdate;
        frames = 0;
    }

    return (fps);
}

void CTimer::Reset() {
    timeAtStart    = 0;
    ticksPerSecond = 0;
    frames         = 0;
    lastUpdate     = 0;
    fps            = 0;
    updateInterval = 0.5f;

    InitTimer();
    timeAtStart = GetTime();
}

void CTimer::InitTimer() {
#if defined (WINDOWS)
    if(timerType == Z_TIMER_QPC) {
        //We Only Need To Do Initialization For The QPC Timer:
        //We Need To Know How Often The Clock Is Updated
        if( !QueryPerformanceFrequency((LARGE_INTEGER*)&ticksPerSecond))
            ticksPerSecond = 1000;
    }
#endif
}

float CTimer::GetTime() {
    if(timerType == Z_TIMER_SDL) {
        //Convert milliseconds to seconds, relative to the last Reset()
        return ((float)SDL_GetTicks()/1000.0f - timeAtStart);
    }
#if defined (WINDOWS)
    else if(timerType == Z_TIMER_QPC) {
        u64   ticks;
        float time;

        //This is the number of clock ticks since the start
        if( !QueryPerformanceCounter((LARGE_INTEGER*)&ticks))
            ticks = (u64)timeGetTime();

        //Divide by frequency to get time in seconds
        time = (float)ticks / (float)ticksPerSecond;

        //Subtract the start time to get time since the last Reset()
        time -= timeAtStart;

        return (time);
    }
#endif
    return (0.0f);
}
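
For reference, here is a rough usage sketch of the class in a game loop. It isn't part of the files above, and ProcessEvents()/RenderFrame() are just placeholders for your own code:

#include <cstdio>
#include "Timer.h"

void GameLoop(bool& running) {
    CTimer timer(Z_TIMER_QPC);      //Falls back to Z_TIMER_SDL on Linux
    float  lastTime = timer.GetTime();

    while(running) {
        float now = timer.GetTime();
        float dt  = now - lastTime;  //Seconds since the previous frame
        lastTime  = now;

        //ProcessEvents(running);
        //RenderFrame(dt);

        //GetFPS() counts one frame per call and recomputes the rate
        //every updateInterval seconds
        printf("dt: %.4f s, FPS: %.1f\n", dt, timer.GetFPS());
    }
}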

Hope you enjoy it; I look forward to any questions or comments.

9 Replies


dk 158 May 08, 2004 at 03:02

Nice piece of code. One comment: since you’re already doing a check on the platform used, you might as well use the appropriate timer function on each platform in order to remove the dependency on SDL. It’s not a big issue though :)
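
Something along these lines would do it (just a rough sketch, with GetSeconds() as a placeholder name; it uses QueryPerformanceCounter on Windows and gettimeofday() elsewhere):

#if defined(_WIN32)
#include <windows.h>

//Seconds from an arbitrary origin, using the Win32 performance counter
double GetSeconds() {
    LARGE_INTEGER freq, ticks;
    QueryPerformanceFrequency(&freq);
    QueryPerformanceCounter(&ticks);
    return (double)ticks.QuadPart / (double)freq.QuadPart;
}
#else
#include <sys/time.h>

//Seconds from the Unix epoch, using gettimeofday()
double GetSeconds() {
    timeval tv;
    gettimeofday(&tv, 0);
    return (double)tv.tv_sec + (double)tv.tv_usec / 1000000.0;
}
#endif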

baldurk 101 May 08, 2004 at 08:23

maybe I’m missing something, but why not just use SDL on both platforms? that removes a lot of the ugly preprocessor code.

anubis 101 May 08, 2004 at 09:03

using accurate and QPC in one sentence is kind of… well you know

fyhuang 101 Aug 25, 2004 at 07:41

What about a counter that doesn’t use SDL? What does SDL do to time on Linux, etc.?

baldurk 101 Aug 25, 2004 at 12:28

@fyhuang

What about a counter that doesn’t use SDL? What does SDL do to time on Linux, etc.?

gettimeofday() I’d imagine. As close to microsecond accuracy as a processor clock will allow.

Chris 101 Jan 01, 2005 at 11:25

I hope you know that QueryPerformanceCounter is FAR from accurate?
It’s one of the most unpredictable functions you could have used. I strongly recommend changing that code to use timeGetTime().

timeGetTime() guarantees a 1 ms resolution on XP/2000 and 5 ms on 98/ME/NT.

QueryPerformanceCounter() went as bad as using the 18.2 Hz AT timer on one of my machines. On my current Athlon XP 3000+ it has a resolution of approximately 3.5 MHz (according to QueryPerformanceFrequency), and over at flipcode I heard of a P4 2.2 GHz user who reported something like 1.17 GHz. Strange numbers, those, and he reported other strange effects too.
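
A bare-bones timeGetTime()-based timer could look like this (just a sketch; the timeBeginPeriod(1) call is optional and simply requests 1 ms timer resolution, and you need to link against winmm.lib):

#include <windows.h>
#include <mmsystem.h>

class MMTimer {
public:
    MMTimer()  { timeBeginPeriod(1); start = timeGetTime(); }
    ~MMTimer() { timeEndPeriod(1); }

    //Seconds elapsed since construction
    float GetTime() const { return (timeGetTime() - start) / 1000.0f; }
private:
    DWORD start;
};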

Coriiander 101 May 06, 2011 at 00:58

You should definitely not use the high-resolution counter offered by the Win32 API to perform your game timing. The availability and implementation vary per machine. On top of this, you simply cannot rely on the accuracy of the high-resolution counter: it is greatly affected by the current machine state (your processor cores, etc.). For in-depth information on the latter: use Google.

So, not only is the high-resolution counter not portable, it is also, at best, absolutely unreliable. You should not be using this device for any long-term game timing. Game timing should be absolutely precise, no excuses for ANY error whatsoever. If you don’t take it that seriously, you simply fail as a game developer. It is not just the principle; things can also get pretty messed up in multiplayer environments. And more…

You could argue that Microsoft uses it themselves in all versions of their XNA Game Studio: the GameClock there is based on System.Diagnostics.Stopwatch, which internally uses this device. The simple fact of the matter is that they thereby rely on the underlying architecture. This is not only plain wrong, it also brings a lot of ugly, slowing-down code with regard to overflow checking (exceptions, exceptions…), etc.

Just use GetTickCount() for your games. It doesn’t matter if 1 frame is off a few microseconds. The important thing is that over a range of frames the timing is stable. Watch your calculations carefully, especially accumulative rounding; there’s no need for that. Working with the double-precision data type is advised.

If you really need higher-precision timing, like 100 nanoseconds per tick, you’ll have to dive deep into assembler (once again: not portable) and make use of RDTSC etc. You then have to know your cores and your machinery at heart. Just use GetTickCount() ;).

AND DON’T EVER USE Sleep()…..
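
Roughly what I mean, as a sketch (GetTickCount() per frame plus a double accumulator; the class and method names are just placeholders):

#include <windows.h>

class TickTimer {
public:
    TickTimer() : last(GetTickCount()), elapsed(0.0) {}

    //Returns the frame time in seconds and advances the accumulator
    double Tick() {
        DWORD now = GetTickCount();
        double dt = (now - last) / 1000.0;  //DWORD subtraction handles wrap-around
        last = now;
        elapsed += dt;
        return dt;
    }

    double TotalSeconds() const { return elapsed; }
private:
    DWORD  last;
    double elapsed;
};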

roel 101 May 06, 2011 at 10:14

Thanks for digging up a 6-year-old thread. Not.

It doesn’t matter if 1 frame is off a few microseconds.

That is often true. But being off by a few milliseconds does matter, and that is what GetTickCount() provides you, so that is no solution.

geon 101 May 06, 2011 at 10:17

I thought it was strange that someone mentioned Flipcode (RIP), but then I noticed THE THREAD IS FROM 2005!

Leave the dead alone.