# Cross-Platform Accurate Timers

9 replies to this topic

### #1zenogais

New Member

• Members
• 17 posts

Posted 08 May 2004 - 02:54 AM

The following code implements a cross-platform, reasonably accurate timer. The timer can also calculate FPS if you so desire. On Windows you can choose either SDL_GetTicks() or QueryPerformanceCounter() as the time source; on other platforms SDL_GetTicks() is used, so SDL is required.

Timer.h

#ifndef TIMERS_H
#define TIMERS_H

#if defined(_MSC_VER) || defined(__BORLANDC__)
#define u64 unsigned __int64
#define WINDOWS
#elif defined(__linux__) || defined(__APPLE__)
#define u64 unsigned long long
#define LINUX
#elif defined(__MINGW32__)
#define u64 unsigned long long
#define WINDOWS
#endif

class CTimer {
public:
    CTimer();
    CTimer(unsigned int type);
    CTimer(const CTimer& oldTimer);

    ~CTimer() { }

    void SetTimerType(unsigned int newType);
    unsigned int GetTimerType();

    void SetUpdateInterval(float newInterval);
    float GetUpdateInterval();

    void Reset();

    float GetFPS();
    float GetTime();

private:
    void InitTimer();

    float timeAtStart, lastUpdate, updateInterval, fps;
    u64 ticksPerSecond;
    unsigned int timerType, frames;
};

enum Timers {
    Z_TIMER_SDL,    //Uses SDL_GetTicks() for time
    Z_TIMER_QPC     //Uses QueryPerformanceCounter() for time (Windows Only)
};

#endif


Timer.cpp

#include "Timer.h"
#include <SDL.h>
#ifdef WINDOWS
#include <windows.h>
#endif

CTimer::CTimer() : timerType(Z_TIMER_SDL) {
    //We Default To SDL Timers, Then Reset The Timer
    Reset();
}

CTimer::CTimer(unsigned int type) {
#if defined (LINUX)
    timerType = Z_TIMER_SDL;
#elif defined (WINDOWS)
    timerType = type;
#endif
    Reset();
}

CTimer::CTimer(const CTimer& oldTimer) {
    //Copy Old Data Into This Class
    timerType      = oldTimer.timerType;
    timeAtStart    = oldTimer.timeAtStart;
    ticksPerSecond = oldTimer.ticksPerSecond;
    updateInterval = oldTimer.updateInterval;
    lastUpdate     = oldTimer.lastUpdate;
    fps            = oldTimer.fps;
    frames         = oldTimer.frames;

    //Re-initialize the underlying timer for this copy
    InitTimer();
}

void CTimer::SetTimerType(unsigned int newType) {
#if defined (WINDOWS)
    //Fall back to the SDL timer if the requested type is unknown
    if(newType != Z_TIMER_SDL && newType != Z_TIMER_QPC)
        newType = Z_TIMER_SDL;
    timerType = newType;
#elif defined (LINUX)
    timerType = Z_TIMER_SDL;
#endif

    //Initialize The Timer
    InitTimer();
}

unsigned int CTimer::GetTimerType() {
    return timerType;
}

void CTimer::SetUpdateInterval(float newInterval) {
    updateInterval = newInterval;
}

float CTimer::GetUpdateInterval() {
    return updateInterval;
}

float CTimer::GetFPS() {
    frames++;
    float currentUpdate = GetTime();

    if(currentUpdate - lastUpdate > updateInterval) {
        fps = (float)frames / (currentUpdate - lastUpdate);
        lastUpdate = currentUpdate;
        frames = 0;
    }

    return fps;
}

void CTimer::Reset() {
    timeAtStart    = 0;
    ticksPerSecond = 0;
    frames         = 0;
    lastUpdate     = 0;
    fps            = 0;
    updateInterval = 0.5f;

    InitTimer();
    timeAtStart = GetTime();
}

void CTimer::InitTimer() {
#if defined (WINDOWS)
    if(timerType == Z_TIMER_QPC) {
        //We Only Need To Do Initialization For The QPC Timer:
        //We Need To Know How Often The Clock Is Updated
        if(!QueryPerformanceFrequency((LARGE_INTEGER*)&ticksPerSecond))
            ticksPerSecond = 1000;
    }
#endif
}

float CTimer::GetTime() {
    if(timerType == Z_TIMER_SDL) {
        //SDL_GetTicks() is in milliseconds; subtract the start time so
        //both timer types report seconds relative to the last Reset()
        return ((float)SDL_GetTicks() / 1000.0f) - timeAtStart;
    }
#if defined (WINDOWS)
    else if(timerType == Z_TIMER_QPC) {
        u64 ticks;

        //This is the number of clock ticks since the start
        if(!QueryPerformanceCounter((LARGE_INTEGER*)&ticks))
            ticks = (u64)timeGetTime();  //Fallback; requires winmm.lib

        //Divide by frequency to get time in seconds
        float time = (float)ticks / (float)ticksPerSecond;

        //Calculate time relative to the last Reset()
        time -= timeAtStart;

        return time;
    }
#endif
    return 0.0f;
}


Hope you enjoy, I look forward to any questions or comments.
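The GetFPS() averaging is easy to exercise in isolation. Here is the same logic with the current time passed in as a parameter instead of read from a clock, so it can be tested with synthetic timestamps and without SDL. The FpsCounter name is mine, not part of the class above; this is just a sketch of the same scheme:

```cpp
#include <cassert>

// Same averaging scheme as CTimer::GetFPS(), but the current time is a
// parameter, which makes the logic testable with synthetic timestamps.
struct FpsCounter {
    float lastUpdate;      // time of the last FPS recalculation (seconds)
    float updateInterval;  // recalculate no more often than this (seconds)
    float fps;             // last computed frames-per-second value
    unsigned int frames;   // frames seen since lastUpdate

    FpsCounter(float interval = 0.5f)
        : lastUpdate(0), updateInterval(interval), fps(0), frames(0) {}

    // Call once per frame with the current time in seconds.
    float Update(float now) {
        frames++;
        if (now - lastUpdate > updateInterval) {
            fps = (float)frames / (now - lastUpdate);
            lastUpdate = now;
            frames = 0;
        }
        return fps;
    }
};
```

Feeding it 60 evenly spaced frames over one second should read back roughly 60 FPS once the first half-second interval has elapsed.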

### #2Dia

DevMaster Staff

• 1121 posts

Posted 08 May 2004 - 03:02 AM

Nice piece of code. One comment: since you're already checking which platform you're on, you might as well call the appropriate native timer function on each platform and remove the dependency on SDL. It's not a big issue, though.
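A sketch of what that might look like: QueryPerformanceCounter on Windows, and the POSIX clock_gettime() elsewhere. The function names here are mine, not from the code above, and very old Mac OS X lacks clock_gettime (you'd need mach_absolute_time there):

```cpp
#if defined(_WIN32)
#include <windows.h>

// Seconds from the Windows high-resolution counter.
double NativeTimeSeconds() {
    LARGE_INTEGER freq, ticks;
    QueryPerformanceFrequency(&freq);
    QueryPerformanceCounter(&ticks);
    return (double)ticks.QuadPart / (double)freq.QuadPart;
}
#else
#include <time.h>

// Seconds from the POSIX monotonic clock (unaffected by clock adjustments).
double NativeTimeSeconds() {
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return (double)ts.tv_sec + (double)ts.tv_nsec / 1e9;
}
#endif

// Elapsed seconds between two samples.
double Elapsed(double start, double end) { return end - start; }
```

The rest of CTimer could then call NativeTimeSeconds() instead of SDL_GetTicks().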

### #3baldurk

Senior Member

• Members
• 1057 posts

Posted 08 May 2004 - 08:23 AM

maybe I'm missing something, but why not just use SDL on both platforms? that removes a lot of the ugly preprocessor code.
baldurk
He who knows not and knows that he knows not is ignorant. Teach him.
He who knows not and knows not that he knows not is a fool. Shun him.

### #4anubis

Senior Member

• Members
• 2225 posts

Posted 08 May 2004 - 09:03 AM

using accurate and QPC in one sentence is kind of... well you know
If Prolog is the answer, what is the question ?

### #5fyhuang

Member

• Members
• 45 posts

Posted 25 August 2004 - 07:41 AM

What about a counter that doesn't use SDL? What does SDL do to time on Linux, etc.?
- FYHuang
"Do, or do not. There is no 'try'." - Yoda
"Shoot Pixels not People" - Drakonite

[Altitude Forums](http://www.hytetech.com/altitude/forums/) [Altitude Technologies](http://www.hytetech.com/altitude/)

### #6baldurk

Senior Member

• Members
• 1057 posts

Posted 25 August 2004 - 12:28 PM

fyhuang said:

What about a counter that doesn't use SDL? What does SDL do to time on Linux, etc.?

gettimeofday() I'd imagine. As close to microsecond accuracy as a processor clock will allow.
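For the curious, gettimeofday() reports wall-clock time with microsecond resolution where the kernel provides it; a minimal sketch of wrapping it (the helper name is mine):

```cpp
#include <sys/time.h>

// Wall-clock time in seconds, with up to microsecond resolution.
// Note: gettimeofday() can jump if the system clock is adjusted;
// it is not a monotonic clock, so prefer it only for coarse timing.
double WallSeconds() {
    struct timeval tv;
    gettimeofday(&tv, 0);
    return (double)tv.tv_sec + (double)tv.tv_usec / 1e6;
}
```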
baldurk
He who knows not and knows that he knows not is ignorant. Teach him.
He who knows not and knows not that he knows not is a fool. Shun him.

### #7Chris

New Member

• Members
• 23 posts

Posted 01 January 2005 - 11:25 AM

I hope you know that QueryPerformanceCounter is FAR from accurate?
It's one of the most unpredictable functions you could have used. I strongly recommend changing that code to use timeGetTime().

timeGetTime() guarantees a 1ms resolution on XP/2000 and 5ms on 98/ME/NT.

QueryPerformanceCounter () went as bad as using the 18.2 Hz AT timer on one of my machines. On my current Athlon XP 3000+ it has a resolution of approximately 3.5 MHz (according to QueryPerformanceFrequency) and I heard of a P4 2.2 GHz user who reported something like 1.17 GHz over at flipcode. Strange numbers, those, and he reported other strange effects, too.
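Resolution claims like these are easy to check empirically: spin on the clock until its reported value changes and keep the smallest step observed. Here is a sketch using std::chrono (C++11, so not period-appropriate for this thread, but portable); the same loop works over timeGetTime() or QueryPerformanceCounter() on Windows:

```cpp
#include <chrono>

// Estimate the smallest observable step of steady_clock by spinning
// until the reported time changes, several times, keeping the minimum
// step seen. Larger results mean a coarser underlying timer.
double MeasureGranularitySeconds(int samples = 8) {
    using namespace std::chrono;
    double best = 1.0;  // start with an absurdly large step (1 second)
    for (int i = 0; i < samples; ++i) {
        steady_clock::time_point t0 = steady_clock::now();
        steady_clock::time_point t1 = t0;
        while (t1 == t0) t1 = steady_clock::now();  // spin until it ticks
        double step = duration<double>(t1 - t0).count();
        if (step < best) best = step;
    }
    return best;
}
```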

### #8Coriiander

New Member

• Members
• 2 posts

Posted 06 May 2011 - 12:58 AM

You should definitely not use the high-resolution counter offered by the Win32 API to perform your game timing. The availability and implementation vary per machine. On top of this, you simply cannot rely on the accuracy of the high-resolution counter. It is greatly affected by the current machine state, meaning your processor cores, power states, etc. For in-depth information on the latter: use Google.

So, not only is the high-resolution counter not portable, it is also, at best, absolutely unreliable. You should not be using this device for any long-term game timing. Game timing should be absolutely precise, no excuses for ANY error whatsoever. If you don't take it that seriously, you simply fail as a game developer. It is not just the principle, but it is also the fact that things might get pretty messed up in multiplayer environments. And more...

You could argue that Microsoft is using it themselves in all versions of their XNA Game Studio. That GameClock is based on System.Diagnostics.Stopwatch, which internally uses this device. The simple fact of the matter is that they thereby rely on the underlying architecture. This is not only plain wrong, it also brings a lot of ugly, slow code for overflow checking (exceptions, exceptions...), etc.

Just use GetTickCount() for your games. It doesn't matter if one frame is off by a few microseconds. The important thing is that, over a range of frames, the timing is stable. Watch your calculations carefully, especially cumulative rounding error; there's no need for that. Working with the double-precision data type is advised.

If you really need higher-precision timing, like 100 nanoseconds per tick, you'll have to dive deep into assembler (once again: not portable) to make use of RDTSC etc. You then have to know your cores and your machine at heart. Just use GetTickCount().

AND DON'T EVER USE Sleep().....
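Whatever clock you pick, the usual way to keep timing stable "over a range of frames" is a fixed-timestep accumulator in double precision: accumulate real elapsed time and consume it in constant-size steps, so per-frame rounding never builds up. A sketch (the names are mine, not from this thread):

```cpp
// Fixed-timestep accumulator: real time is added in whatever chunks the
// clock hands us, and simulation steps are consumed in exact dt-sized
// pieces, so rounding error does not accumulate across frames.
struct FixedStep {
    double accumulator;  // unconsumed real time, in seconds
    double dt;           // simulation step size, in seconds

    FixedStep(double stepSeconds) : accumulator(0.0), dt(stepSeconds) {}

    // Feed in the real time elapsed since the last frame; returns how
    // many whole simulation steps to run this frame.
    int Advance(double frameSeconds) {
        accumulator += frameSeconds;
        int steps = 0;
        while (accumulator >= dt) {
            accumulator -= dt;
            steps++;
        }
        return steps;
    }
};
```

For example, 100 frames at 60 FPS cover about 1.667 seconds, which a 100 Hz simulation should consume as roughly 166 steps, regardless of how the per-frame remainders fall.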

### #9roel

Senior Member

• Members
• 698 posts

Posted 06 May 2011 - 10:14 AM

Thanks for digging up a 6 year old thread. Not.

Quote

It doesn't matter if 1 frame is off a few microseconds.
That is often true. But being off a few milliseconds does matter. And that is what GetTickCount() provides you, so that is no solution.

### #10geon

Senior Member

• Members
• 939 posts

Posted 06 May 2011 - 10:17 AM

I thought it was strange that someone mentioned Flipcode (RIP), but then I noticed THE THREAD IS FROM 2005!