OK… Someone please tell me if I’m doing the math correctly here. This relates to my previous post, Spritesheet Animation in OpenGL. TheNut used a fixed frame rate in his example, but fixed-rate animation isn’t working for me (it lags a bit), so I thought I’d go with frame-rate-independent animation instead. I calculate the elapsed time at the current and previous iterations of the game loop and use the difference as the delta time for that interval.
Current_Time = RetrieveElapsedTime(); // initialized once in main()
// Then, every iteration of the game loop:
New_Time = RetrieveElapsedTime();
Delta_Time = New_Time - Current_Time;
Current_Time = New_Time;
Going by the best of my mathematical knowledge, the equation for frame-rate-independent animation would go something like this:
⇒ Frequency ∝ 1 / Time
⇒ Frame Rate ∝ 1 / Delta Time
⇒ Frame Rate ∗ Delta Time ∝ 1
⇒ ∴ Frame Rate ∗ Delta Time = Constant
So… Going by that logic, in order to get animation with constant smoothness in my sprites, I should multiply the delta time I just acquired by the current frame rate of the display and use that to increment the frames of the animation. Assuming that I am right (please confirm this), here’s my question: how would I go about calculating the current frame rate of the loop? I searched the oceanic realm of the Internet, but I couldn’t find my water drop. I did read about the drawbacks of a variable frame rate and how to cap it at a certain maximum, but couldn’t find much more than that. Please help. Thank you. :-)
Looks like you’re calculating the delta time correctly. If you want the sprite to cycle at, say, 10 frames per second, then each frame would be shown for 1/10th of a second. Therefore you need to accumulate delta time and swap frames when the time passes 1/10th of a second.
So you’d create a float member in your sprite class to store the accumulated time since the beginning of the animation. When you create the sprite or start a new animation you’d initialize that to zero. Then, each frame, do something like this:
Sprite.Accumulated_Time += Delta_Time;
Sprite.Current_Frame = (int)(Sprite.Framerate * Sprite.Accumulated_Time);
That way, suppose Sprite.Framerate is 10, you’d have Sprite.Current_Frame come out to zero for the first 1/10th of a second, one for the next 1/10th, two for the next 1/10th, etc.
Note that the sprite’s framerate is independent of the game’s, here. The sprite will run at 10 fps always, even if the game runs 60 fps or 5 fps.
OK, Reedbeta… So the sprite’s movement is independent of the delta time. Well, not completely independent, but not directly multiplied by the sprite’s frame rate as I thought. I didn’t find anything on accumulated time in my search, but I completely understood your explanation. I still have one doubt, though. You say the sprite’s frame rate is independent of the game’s frame rate… Does that mean that even with a constant rate such as 10 fps, my sprite’s animation will have the same smoothness whether the game window is running at a low frame rate or a high one? Please just clear this one up for me. Thanks.
Yes, the sprite’s animation will have the same smoothness. Well, if the game runs at a framerate lower than the sprite, then the sprite will be just as jerky as the rest of the game…but that would be a pretty extreme scenario. :)
I’m assuming you don’t want the sprite to speed up just because the game is running at a high fps.
Funnily… That actually happened. The sprite sped up on a higher spec PC which gave a higher frame rate, which is why I opted for variable frame rate where the frame rate is calculated based on delta time. But, since you said both frame rates are independent of each other and fixed frame rate on sprites is unaffected by window frame rate, the accumulator method seems like the ideal method to use in this case. Thanks again, Reedbeta! Your help was once again invaluable to novices like us. Up-repped your post. :-)
Hello again… Would you please elaborate more about how to use the accumulated time. I used it to animate my sprites using the method you described, but at one point the accumulated time gets incremented so much that it causes a skipping of frames. The sprite jumps from frame 1 to frame 3 directly. How would I go about doing the animation smoothly without that happening? If it is any help, I am using glutGet(GLUT_ELAPSED_TIME) to calculate delta time and accumulated time in OpenGL. Please help. Thanks.
You have to wrap around when you reach the number of frames in your animation. If you have a 5-frame animation that loops, then after the 5th frame you either take the frame index modulo the frame count or zero out the accumulated time and start again. Other than that, it should work. Pretty clever idea of Reedbeta’s, but you could just as well play the animation through, zero out the accumulated time once a full cycle has elapsed, and repeat.
I AM doing that! Here’s what’s happening… The frame index for my, let’s say, 5-frame animation comes out as 1 for the first frame, 2 for the second, then 4, then 5, and so on. When it gets to the third frame, the value is not 3 but 4, and that is where the frame skipping occurs. Is there something I can do within this method, or should I use a completely new method that uses the elapsed time of the loop to perform the animation? Please help. Thanks.
It should start with zero. Do something like this (I can’t get the code tags to work for some reason):
lastFrame = currentFrame;
currentFrame = /* get the elapsed time and convert it to an int frame index */;
if (lastFrame != currentFrame) { /* print lastFrame */ }
I haven’t used C++ for a while, but the idea is to cout lastFrame only when it changes.
I’m sorry, but I didn’t get you. It’s not because of the code tags. I didn’t understand what you did there. Please explain. Thanks.
Try giving it an even slower frame rate and see if it changes anything. If you use Reedbeta’s code, you can use any frame rate. If the frame skipping stops with a slower frame rate, then you know that you aren’t getting the times fast enough. If it’s the same no matter how slow the frame rate, then you have a logic error somewhere, and it’s not in Reedbeta’s code, so you may have changed something without knowing it. What I’m wondering, and why I put that pseudo code up, was if you are sure which frame is skipping? Are you positive it’s the third frame? That’s why it would be good to have it actually print the current frame to see what’s what. I really find it hard to believe that glut can’t get the times fast enough, unless you are trying for a really high frame rate or something.
Yes, I am std::cout-ing the accumulated time at every iteration. There are only four frames, so it is easy to see the animation go jittery whenever the frame index goes from 2 to 4: it jumps from frame 2 to frame 4. Also, this is not happening all the time. Sometimes it is smooth… Then later it goes jittery again. It could also be a CPU issue; I am not sure. I have to check on a different PC. But thanks for your answer.
Be aware that text output can be incredibly slow.
Using printf, for example, in Visual Studio can take up to 120 ms. This seems to be something to do with the Windows display-update code; I’ve never really tried to track it down, but that’s my gut feeling.
One trick you could try is using a floating accumulator. It is very simple and stops frame skipping by varying the time each frame is displayed for:
sprite.time += time_delta;
Using this system you will never skip frames, but the amount of time any frame of animation is displayed for can vary.
Yes, like Stainless says, you only want to print once in a while, because outputting text every iteration would slow your game way down. It would be better to keep the times in an array and print them out periodically, or, even better, keep a running average of the frame time and print it out when the program closes.
Thanks a lot, guys! All of you… Thank you very much! I haven’t got a perfect solution and the animation is very crude. I am still trying to figure it out for myself and fine tune it. Nevertheless, I really appreciate all your help and inputs. If I have any further problems, I shall definitely ask. Thank you. :-)
You mentioned frame skipping because of timing. I don’t know what GLUT is doing behind the scenes, but it’s possible its timer resolution isn’t good enough to keep up with your frame rate. Each timing function on an operating system has a minimum resolution, and it depends on the hardware as well (what works for you might not for others). Take for instance the WIN32 call timeGetTime(). It has a default 5 millisecond resolution. I think clock() has a 10 ms resolution, which is quite bad for low-latency applications like frame rates and realtime sound mixing; however it has its uses and is not as expensive to call as higher-resolution timers (i.e., use the right tool for the job). On the other hand, if you want microsecond accuracy you should use QueryPerformanceCounter. In fact, every game engine should have a “StopWatch” class that uses a high-resolution timer. Not only can you use it for controlling frame rates, but also in physics and particle simulations, and for measuring the runtime performance of your render pipeline, various function calls, etc., wherever timing is important.