TIGSource Forums

Developer => Technical => Topic started by: happymonster on July 15, 2012, 02:44:51 AM



Title: Using Sleep in Games to return CPU time to Windows
Post by: happymonster on July 15, 2012, 02:44:51 AM
I'm trying to find a balance of time to sleep so that the game runs smoothly but the CPU is not working harder than necessary. This is to save battery and reduce fan noise, as well as give the OS more time for its own work.

However, since every PC / Laptop is different, would the time I need to sleep not also vary? If this is correct, how do people deal with this issue in their games?


Title: Re: Using Sleep in Games to return CPU time to Windows
Post by: _Tommo_ on July 15, 2012, 02:49:46 AM
It's simple, do not use sleep (http://www.altdevblogaday.com/2012/06/05/in-praise-of-idleness/) :beer:
If you can, rely on some timer that wakes your game at a fixed interval, so you get the best wake-up precision and the least performance hit.


Title: Re: Using Sleep in Games to return CPU time to Windows
Post by: Liosan on July 15, 2012, 02:55:48 AM
Using vsync kinda solves your problem. You won't get more FPSes than you need.

Liosan


Title: Re: Using Sleep in Games to return CPU time to Windows
Post by: Nix on July 15, 2012, 03:08:14 AM
usually what you do is pick an FPS and then yield so that you hit that FPS. then you can give the player the option to turn that off and have an unbounded FPS if that's what they want


Title: Re: Using Sleep in Games to return CPU time to Windows
Post by: happymonster on July 15, 2012, 03:19:59 AM
I am currently using a timer to get a fixed FPS, but not clear about the vsync option. Is this a part of DirectX / OpenGL rather than Windows?


Title: Re: Using Sleep in Games to return CPU time to Windows
Post by: PompiPompi on July 15, 2012, 03:30:04 AM
Yes, both DX and OpenGL have VSync.


Title: Re: Using Sleep in Games to return CPU time to Windows
Post by: cskau on July 15, 2012, 03:48:17 AM
A modern, multi-process OS kernel decides when a process has had its turn and pauses it accordingly, so trying to be too clever and take over too much of that management from within the process itself is useless.
I'd say just run your process at full speed all the time.

Of course that's not what you're asking for, so failing that I'd also say have an FPS-bound cycle. Something like "render 60 frames, then wait for the rest of this second".

The problem with sleep() is that the scheduler does not make any guarantees that it will strictly obey the timeout of the sleep() call if it feels there are more important things to do for now.

Kernel scheduling is a very interesting subject and I recommend reading up on it a bit if you feel like upping your computer smarts a bit. _Tommo_'s linked article seems like a good introduction. :)


Title: Re: Using Sleep in Games to return CPU time to Windows
Post by: _Tommo_ on July 15, 2012, 04:38:32 AM
Quote from: happymonster on July 15, 2012, 03:19:59 AM
I am currently using a timer to get a fixed FPS, but not clear about the vsync option. Is this a part of DirectX / OpenGL rather than Windows?

I mean a timer that calls you back (https://developer.apple.com/library/mac/#documentation/Cocoa/Reference/Foundation/Classes/nstimer_Class/Reference/NSTimer.html), not a timer you poll (http://www.cplusplus.com/reference/clibrary/ctime/).
I don't know the exact names of both on every platform, but with the first (the one that's "correct") you don't need to keep checking whether the frame interval has passed.

The vsync is something like this, in fact. Vsync ensures that the frame rate of your game matches the refresh rate of your monitor, so it works like a timer that wakes your game at a fixed rate.

If you can, you should use vsync to regulate your framerate, as ideally any game should want to match the screen refresh rate. But if your frames take longer than the refresh interval, vsync can break down a little and you end up switching between 30 and 60 FPS.


Title: Re: Using Sleep in Games to return CPU time to Windows
Post by: Xecutor on July 15, 2012, 05:05:14 AM
I'm using something like this:

Code:
  int64_t frameDelay=1000000/frameRate; //delay between frames in microseconds
  int64_t overDelay=0;
  for(;;)
  {
    int64_t startTime=getTimer(); //getTimer returns time in microseconds
    drawFrame();
    int64_t endTime=getTimer();
    int64_t opTime=endTime-startTime;
    if(frameDelay>opTime+overDelay)
    {
      int64_t toSleep=frameDelay-opTime-overDelay;
      startTime=getTimer();
      delay(toSleep/1000); //delay() sleeps in milliseconds
      endTime=getTimer();
      overDelay=endTime-startTime-toSleep; //how much the sleep overshot
    }
    else
    {
      overDelay-=frameDelay-opTime;
      if(overDelay < -frameDelay)
      {
        overDelay=-frameDelay; //clamp the accumulated debt to one frame
      }
    }
  }


Title: Re: Using Sleep in Games to return CPU time to Windows
Post by: Crimsontide on July 15, 2012, 01:39:12 PM
I've read a lot about CPU timing and thread scheduling (I even wrote my own OS from scratch for my Comp Sci degree).  There is a lot of 'do not do this' but very little 'this will work' out there on the web.  Despite the fact that CPU makers are really pushing multi-core, current OSes are just a complete mess when it comes to reliable, predictable, multi-threaded programming.  It's so bad it'd be laughable if we didn't have to then somehow get this whole mess to work.

Despite the fact that I know I'm not supposed to, I still tend to use Sleep(0), and just call timeBeginPeriod(1) and timeEndPeriod(1) when the program starts and finishes.  This goes completely contrary to what I've read (in a number of places) and yet still seems to work the best on my machine.  CPU usage stays VERY low (unless I'm actually processing something important), my fans spin down, the CPU stays cool, and multi-threaded programs stay snappy (if that's a good word).

In a perfect world I'd have much better methods for handling multi-threading, and I hope that future OSes (Windows, iOS, Android, Linux, or otherwise) actually start paying attention to latency when it comes to API and driver development.  In the meantime perhaps I'll play around with YieldProcessor() some more...


Title: Re: Using Sleep in Games to return CPU time to Windows
Post by: Evan Balster on July 15, 2012, 03:02:35 PM
I think we're giving this guy horribly overcomplicated answers.

What you want is framerate regulation.  There are plenty of simple ways to do it, and many of them have little flaws, but work well enough.


A simple technique: at the end of each iteration of the main loop, track how long the frame took using whatever sort of clock you have at hand; in SDL this is done by checking SDL_GetTicks every frame, remembering its value for the next frame and taking the difference for the current frame.  Figure out how long each frame should take -- 33 milliseconds for ~30 FPS, for instance.  Then subtract the time already taken from the ideal time and delay (e.g. SDL_Delay) by that amount.  This isn't precise, but it ends up producing a framerate very close to the desired one.

Implementing a time accumulator to counteract error in the sleep function improves regulation a little more, and is necessary for a delay-based regulator to play nice with vsync, but it's probably more than you need until such time as you're shipping something.


Title: Re: Using Sleep in Games to return CPU time to Windows
Post by: xgalaxy on July 15, 2012, 08:17:33 PM
http://gafferongames.com/game-physics/fix-your-timestep/



Title: Re: Using Sleep in Games to return CPU time to Windows
Post by: PompiPompi on July 15, 2012, 08:37:45 PM
This is not about FPS regulation, it's about not having a thread that chokes the CPU because it is always doing work.
Having a timer that counts how much time has passed and only then renders would not solve it, because the counting of time itself would choke the CPU (it runs endlessly in a busy loop).
What he really wants is VSync; how he decides to advance and regulate his physics and animation is another issue.
He specifically said he doesn't want to choke the CPU, and without VSync he won't be able to avoid that, unless he uses sleep, but we already agreed sleep is not a good solution.


Title: Re: Using Sleep in Games to return CPU time to Windows
Post by: Evan Balster on July 15, 2012, 09:49:12 PM
Sleep is a fine solution.  It's not perfectly precise, but it's a heck of a lot more straightforward than vsync, which depending on the platform may be emulated with sleep or entirely unsupported for certain context modes.