So I am simulating this smartphone app on Windows. It's a game that runs its logic and draw methods at a rate of 1/60 seconds, which is about 16.6667 milliseconds.
I've implemented this game loop:
private const double UPDATE_RATE = 1000d / 60d;

private void GameLoop()
{
    double startTime;
    while (GetStatus() != GameStatus.NotConnected)
    {
        startTime = Program.TimeInMillis;
        // Update logic
        while (Program.TimeInMillis - startTime <= UPDATE_RATE)
        {
            // Thread.Yield(); it consumed CPU before adding this too; adding it had no effect
            Thread.Sleep(TimeSpan.FromTicks(1)); // don't eat my CPU
        }
    }
    Debug.WriteLine("GameLoop shutdown");
}
Program.TimeInMillis comes from a NanoStopwatch class that returns the time in milliseconds as a double, which is useful because this game loop has to be really accurate to make everything work.
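For reference, here is a minimal sketch of what a NanoStopwatch-style helper might look like (an assumption on my part; the actual class isn't shown in the question), built on System.Diagnostics.Stopwatch, which already offers sub-millisecond resolution on most hardware:

```csharp
using System.Diagnostics;

static class Program
{
    private static readonly Stopwatch Clock = Stopwatch.StartNew();

    // Elapsed time in milliseconds as a double, with sub-millisecond precision.
    // Stopwatch.Frequency is ticks per second, so ticks * 1000 / Frequency = ms.
    public static double TimeInMillis =>
        Clock.ElapsedTicks * 1000.0 / Stopwatch.Frequency;
}
```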
As you probably know, Thread.Sleep consumes a lot of CPU here, probably because it only has whole-millisecond resolution: my one-tick TimeSpan gets truncated to 0 milliseconds, so the call returns immediately and the loop spins.
I recently came across this thing called WaitHandle, but everything I see uses int milliseconds. I really need this to be accurate, so I actually need double milliseconds, or microseconds/nanoseconds.
Is there anything in C# that allows me to wait for x microseconds? I don't mind creating something myself (like I did for the NanoStopwatch), but I need to be pointed in the right direction. I've found a lot on the internet, but everything uses int milliseconds, and that's not accurate enough for me.
I ran some tests changing the Sleep to use something like UPDATE_RATE - (Program.TimeInMillis - startTime), but that isn't making my game loop accurate either.
I hope everything is clear enough; if you need more information, just leave a comment.
tl;dr - Is there any way to let a thread wait for an amount of microseconds/nanoseconds in C#? My game loop needs to be accurate, and millisecond resolution isn't accurate enough for me.
Thanks in advance!
You need hardware support for this kind of accuracy: a signal that generates an interrupt and triggers code to get the job done. There are two obvious candidates for such a signal in Windows.
The first one is the VSYNC interrupt, a video device driver implementation detail. It occurs at the monitor refresh rate, usually 60 Hertz for LCD monitors. Just what you are asking for, and of course no coincidence. Most programmers use DirectX to implement game graphics; you ask it to use the signal with the D3DPRESENT_INTERVAL_ONE presentation interval. It is unclear from the question why you are not using it, but this is the solution you should pursue.
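If you do go the DirectX route, vsync-locking the presentation is a one-line setting. A hedged sketch using the SharpDX Direct3D9 wrapper (an assumption on my part; other managed wrappers differ, but they all expose the native D3DPRESENT_INTERVAL_ONE flag in some form):

```csharp
// Sketch only: assumes the SharpDX.Direct3D9 wrapper is available.
// PresentInterval.One corresponds to the native D3DPRESENT_INTERVAL_ONE flag,
// which makes Device.Present() wait for the next VSYNC (~60 Hz on LCD monitors).
var pp = new SharpDX.Direct3D9.PresentParameters
{
    Windowed = true,
    SwapEffect = SharpDX.Direct3D9.SwapEffect.Discard,
    PresentationInterval = SharpDX.Direct3D9.PresentInterval.One
};
// Pass pp when creating the Device; each Present() call then paces
// the game loop at the monitor refresh rate, with no sleeping needed.
```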
The second one is the clock interrupt. That's the one you are complaining about. It's a big deal on Windows: that's the signal that wakes up the thread scheduler and gets code to run, and it is directly responsible for the accuracy of Thread.Sleep(). Code that sleeps cannot start running again until the clock interrupt occurs and the scheduler puts the thread back into the active state. There's no other way to do it; the processor is physically turned off with the HLT instruction, consuming no power, and can only be woken up by an interrupt. By default, the clock interrupt ticks 64 times per second, once every 15.625 milliseconds. It tends to get tinkered with by drivers and whatnot; they often reprogram it to tick every 10 milliseconds. Companies that give out free software and have a stake in making their own products look good push it as low as 1 millisecond.
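You can observe that granularity directly. A small sketch (assuming the default 15.625 ms clock tick; drivers may already have lowered it on your machine, in which case you'll see smaller numbers):

```csharp
using System;
using System.Diagnostics;
using System.Threading;

class SleepGranularity
{
    static void Main()
    {
        var sw = new Stopwatch();
        for (int i = 0; i < 5; i++)
        {
            sw.Restart();
            Thread.Sleep(1); // asks for 1 ms...
            sw.Stop();
            // ...but the thread cannot resume until the next clock interrupt,
            // so this typically prints ~15 ms on an untouched machine.
            Console.WriteLine($"{sw.Elapsed.TotalMilliseconds:F3} ms");
        }
    }
}
```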
Which is what you need to do as well if you can't tame DirectX for some reason. P/Invoke timeBeginPeriod(1) to get 1 msec accuracy. Technically you can go as low as 500 microseconds with NtSetTimerResolution(), but that's as low as the dial will go. And do keep in mind that you can't sustain such rates consistently: thread scheduling quanta, garbage collection pauses, hard page faults, and obnoxious device drivers that run their code at real-time priority all take much longer than that.
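A minimal P/Invoke sketch of that call (winmm.dll; error handling omitted, and the helper method is just an illustrative name). Note that timeBeginPeriod affects the whole machine and should always be paired with a matching timeEndPeriod:

```csharp
using System;
using System.Runtime.InteropServices;
using System.Threading;

static class TimerResolution
{
    [DllImport("winmm.dll", ExactSpelling = true)]
    private static extern uint timeBeginPeriod(uint uPeriod);

    [DllImport("winmm.dll", ExactSpelling = true)]
    private static extern uint timeEndPeriod(uint uPeriod);

    // Runs the supplied loop body with 1 ms scheduler granularity requested.
    public static void RunWithMillisecondTicks(Action loopBody)
    {
        timeBeginPeriod(1); // request 1 ms clock interrupt rate (system-wide!)
        try
        {
            // Inside here, Thread.Sleep(1) wakes after roughly 1-2 ms
            // instead of the default ~15.6 ms.
            loopBody();
        }
        finally
        {
            timeEndPeriod(1); // always restore the default resolution
        }
    }
}
```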