# Nautilus

Member Since 19 Nov 2004
Offline Last Active Jun 12 2013 03:37 PM

### Unknown HRESULT (0x8006FC24) from DirectInput

10 June 2013 - 01:35 PM

A call to IDirectInputDevice7::GetDeviceState() is failing unexpectedly. The HRESULT it returns is 2147941412 (0x8006FC24), which maps to no known error. According to this Microsoft page, the facility code of the HRESULT (0x8006FC24) is undefined. How can I work out what's wrong?
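For what it's worth, the fields of an HRESULT can be pulled apart by hand (this mirrors what the HRESULT_SEVERITY / HRESULT_FACILITY / HRESULT_CODE macros in winerror.h do). A minimal, portable sketch with my own naming:

```cpp
#include <cstdint>

// Splits an HRESULT into its fields, per the Windows HRESULT layout:
// bit 31 = severity, bits 16..28 (masked with 0x1FFF) = facility,
// bits 0..15 = code.
struct HresultParts { unsigned severity, facility, code; };

HresultParts split_hresult(uint32_t hr)
{
    HresultParts p;
    p.severity = (hr >> 31) & 0x1;
    p.facility = (hr >> 16) & 0x1FFF;  // same mask winerror.h uses
    p.code     = hr & 0xFFFF;
    return p;
}
```

For 0x8006FC24 this gives severity 1 (failure), facility 6, code 0xFC24; facility 6 doesn't correspond to any documented FACILITY_* constant, which matches the "undefined" result mentioned above.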

### Simple Moving Average fps display, and framerate limiters...

25 May 2013 - 12:44 AM

Greetings.

With my D3D wrapper I display the framerate using a Simple Moving Average (SMA) calculated over the last hundred frames. I'd be totally happy with it, if it weren't for a minor problem that I want to get rid of but have no clue how to.

THE FIRST PROBLEM:
As long as a game performs consistently, producing its frames in pretty much the same timeframe and maybe dropping the occasional frame now and then, the SMA displays reliable and meaningful fps information. But when a game is inconsistent, producing frames whose build times vary greatly, maybe even causing frequent frame drops, the SMA starts displaying funny values that just tell lies.

A few examples will illustrate the problem better than my English can.

For brevity's sake, imagine a scenario in which the SMA's frame history is 3 frames, and the timer counts 900 ticks per second. Normally a frame can compose in 50 ticks, but the game wants to limit itself to 3 frames per second, so between composition and display, 300 ticks elapse from frame to frame.

In the perfect scenario, all 3 frames compose and display in 300 ticks each.
The Simple Moving Average (SMA) over the last 3 frames is going to be a perfect 3.0:
sma = (1 / 300) + (1 / 300) + (1 / 300)
sma = sma * 900_ticks_per_second
sma = sma / 3_frames_history
sma = 3.0 fps


If a frame takes too long, the next frame may be dropped. Suppose the 2nd frame takes 600 ticks, leaving no room for a 3rd frame, which gets dropped. What will the SMA say?
sma = (1 / 300) + (1 / 600) + <nothing: this frame dropped>

--( ^^ no! because a frame dropped, the actual 3-frame window will be: )--

sma = (1 / 300) + (1 / 300) + (1 / 600)

--( hence: )--

sma = sma * 900_ticks_per_second
sma = sma / 3_frames_history
sma = 2.5 fps


Now suppose 1 of the 3 frames composes in 500 ticks. It's late, but it's possible to compensate: normally a frame builds in 50 ticks, and in fact the next one is on screen only 100 ticks later. With all 3 frames within the 900-tick limit, the 3.0 fps cap is respected. However, the SMA will say differently:
sma = (1 / 300) + (1 / 500) + (1 / 100)
sma = sma * 900_ticks_per_second
sma = sma / 3_frames_history
sma = 4.6 fps


It gets worse if we continue from there. Suppose that the next 2 frames are regular and compose & display in 300 ticks each.
New SMA after 1st regular frame:
sma = (1 / 500) + (1 / 100) + (1 / 300)
sma = sma * 900_ticks_per_second
sma = sma / 3_frames_history
sma = 4.6 fps


New SMA after 2nd regular frame:
sma = (1 / 100) + (1 / 300) + (1 / 300)
sma = sma * 900_ticks_per_second
sma = sma / 3_frames_history
sma = 5.0 fps

See the problem?
Add another regular frame (300 ticks) and the SMA suddenly drops back to a perfect 3.0 fps.

In the presence of inconsistent frame times it's annoying to see the SMA rise past the hard limit imposed by the game itself. Not only does it give a bad impression, it may also mask the true loss in fps that occurs when a frame is willingly dropped. I tried different Moving Average formulae, taken from Wikipedia, but the problem persists. And in the end, the Simple Moving Average is the most responsive to changes and the fastest to calculate.

----------------------------------------------------------------------------------------------------

THE SECOND PROBLEM:
If I'm allowed to briefly mention a commercial game, I find Far Cry 2 to be a good test platform for my wrapper. Its performance is very consistent on my hardware (which isn't next-gen anymore) and gives me the opportunity to test the robustness of my fps display. Also, the game features both an fps display calculated with an SMA and a framerate limiter.
Now, when letting the game run unconstrained, its fps display and my wrapper's fps display show the very same readings all the time (provided that I lower my frame history to 64 entries and round the final value to the 1st decimal digit).

What I haven't said is that my wrapper features a framerate limiter as well.
If I activate my wrapper's framerate limiter, both the wrapper's fps display and the game's fps display will show the very same values. No surprise. This also happens when the occasional frame is to be dropped, and the SMA jumps past the imposed limit because of it.
But let's invert the roles...
If I activate the game's framerate limiter, this time my wrapper's fps display reports a slightly greater fps average (say, 40.1 or 40.2 fps) than the game's fps display, which instead reports a perfect score (say, 40.0 fps). The surprise, however, is that when frames are dropped or are just late, the game's fps display will _drop_, while my wrapper's SMA will jump past the real value - as usual.

The situation changed when I forced the Maximum pre-rendered frames setting to 0 in the nVidia control panel. Now, when using the game's framerate limiter, the game's fps display would dutifully settle on a perfect fps reading (say, 40.0 fps), but only after showing massive initial imprecision - it looks like it adjusts itself dynamically. My wrapper's fps display instead calculates the same imperfect fps average as before (say, 42.6 fps). How can that be?

However much I observe the game to guess what it's doing, and how, I come up with no explanation.
If anybody could shed some light I'd really appreciate it.

Thank you very much.

### How to detect a monitor's Default refresh frequency?

22 May 2013 - 06:09 PM

How do I retrieve a monitor's default refresh rate?

If I query for the current mode's refresh rate, I may be returned either 0 or 1, both of which indicate a default refresh rate. But I have found no way to retrieve what this default is. I know it can be overridden via the DxDiag interface - still, not even that interface will tell me what the original default is.

Can I safely assume a value of 60 Hz? Apparently not: some people's monitors switch to 59 Hz when they activate certain resolutions.
Manually counting the vertical blank intervals over a period of 1 second isn't always a viable option for me.
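For the interpretation side at least: the Win32 EnumDisplaySettings() call reports the current mode's frequency in DEVMODE::dmDisplayFrequency, and per the documentation both 0 and 1 there mean "hardware default". Since the true default isn't reported, the best I can see is normalizing that value against a caller-chosen fallback assumption. A tiny sketch (the helper name and the fallback are my own):

```cpp
// dmDisplayFrequency values of 0 and 1 both mean "hardware default";
// since the true default is not reported by the API, substitute a
// caller-chosen assumption (e.g. 60 Hz) in that case.
unsigned effective_refresh_hz(unsigned reported_hz, unsigned fallback_hz)
{
    return (reported_hz <= 1) ? fallback_hz : reported_hz;
}
```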

### [DirectX / C++] Runtime Check Failure #0

23 April 2013 - 01:43 PM

My DirectX9 wrapper won't work with a specific game which, from what I gather, was compiled against the Dx9.0c Aug2008 release (that's what is found on the game DVD, at least). The game starts up and immediately shuts down. No sound, no popup, no error message, no log written anywhere, no Application Error in the system's event log, not even a crash dump. Nothing.

My wrapper was compiled against an even older version (Mar2008), so I figured it was time to update the SDK and rebuild the wrapper. For reasons of my own I chose to update to the Dx9.0c Mar2009 release.
I recompiled my wrapper. To ensure that I hadn't broken anything in this new version, I first ran tests on games that I knew had worked fine with it in the past (these range from indie to AAA productions). The new wrapper passed all tests.

Proud and confident that it'd work with this latest game as well, I tried it... and again the game would startup then silently shutdown.
So I inserted a Sleep() call inside my dll and rebuilt it with debug info. Then I launched the game and attached the debugger to it. Stepping line by line, at some point Visual Studio halts with this message:

Run-Time Check Failure #0 - The value of ESP was not properly saved across a function call. This is usually a result of calling a function declared with one calling convention with a function pointer declared with a different calling convention.

Basically it's saying that the stack pointer is corrupted (on return from a function call, the pointer differs from what it was before the call), am I correct?
So my code is somehow stepping on forbidden ground. But I believe that the calling conventions hinted at by the message have nothing to do with this. My dll is built to double as a proxy (for when a launcher isn't an option), and if I had really gotten the calling conventions wrong it would never work, not just this time.

Searching around the web for possible causes of the problem, I found this sample code, which triggers the above error message:

```cpp
class A
{
public:
    virtual int doStuff (int a, int b) { return (a + b); }
};

class B
{
public:
    virtual int doStuff (int a) { return a; }
};

int main ()
{
    A a;
    // The cast calls A::doStuff(int, int) through B's one-argument
    // signature; the resulting stack mismatch corrupts ESP.
    int Stuff = ((B*) &a)->doStuff (1); // <-- Runtime Check Failure #0
    return 0;
}
```


The text accompanying the sample said that I could be using headers and libs that are out of sync (either a newer header with an older lib, or vice versa). I'm inclined to think this is the case: I upgraded my project from one SDK version to another (Mar2008 -> Mar2009), after all, and haven't touched a single line of code. I just recompiled.

As the error occurs on a call to IDirect3D9::GetAdapterIdentifier(), the relevant code (I think) is wholly inside d3d9.h and d3d9.lib. So what do I do? I take that pair of files from both SDK versions and make a binary comparison of them. Surprise: both pairs of .h and .lib files are identical.

At this point I don't know what to search for.
Any ideas?

### [C/C++] hyperbolic tangent squared function

03 April 2013 - 01:10 AM

Can anybody tell me how to write C code equivalent to this?

tanh2 (x)

That would mean the hyperbolic tangent squared of X.
I've never heard of 'squared' functions before.

How do I translate that to C?
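In case it helps future readers: tanh²(x) is the conventional shorthand for (tanh(x))², by analogy with sin²(x). So it's just the square of the standard tanh() from <cmath>/<math.h>. A minimal sketch (the function name tanh2 is my own):

```cpp
#include <cmath>

// tanh2(x) = (tanh(x))^2 -- the usual reading of the superscript-2
// notation on trigonometric and hyperbolic functions.
double tanh2(double x)
{
    double t = std::tanh(x);
    return t * t;
}
```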