Measure precision of timer (e.g. Stopwatch/QueryPerformanceCounter)

Posted 2019-05-04 07:00

Question:

Given that the Stopwatch class in C# can use one of three different timers underneath, e.g.

  • System timer, e.g. a precision of approx. ±10 ms by default; depending on the timer resolution, which can be raised with timeBeginPeriod, it can be approx. ±1 ms.
  • Time Stamp Counter (TSC), e.g. with a tick frequency of 2.5 MHz, i.e. 1 tick = 400 ns, so ideally a precision of that order.
  • High Precision Event Timer (HPET), e.g. with a tick frequency of 25 MHz, i.e. 1 tick = 40 ns, so ideally a precision of that order.

how can we measure the observable precision of these? Precision here being defined as:

Precision refers to the closeness of two or more measurements to each other.

Now if the Stopwatch uses the HPET, does this mean we can use Stopwatch to take measurements with a precision equivalent to the tick frequency of that timer?

I don't think so, since this would require us to be able to use the timer with zero variance, or at least a completely fixed overhead, which as far as I can tell is not true for Stopwatch. For example, when using the HPET and calling:

var before_ticks = Stopwatch.GetTimestamp();
var after_ticks = Stopwatch.GetTimestamp();
var diff_ticks = after_ticks - before_ticks;

then the diff will be, say, approx. 100 ticks or 4000 ns, and it will have some variance too.
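To put a number on that overhead and its variance, a minimal sketch along these lines could sample many back-to-back GetTimestamp() pairs and report the spread. Nothing here goes beyond the standard System.Diagnostics API; the sample count of 1000 is arbitrary:

using System;
using System.Diagnostics;

class TimestampOverhead
{
    static void Main()
    {
        const int samples = 1000;
        var diffs = new long[samples];
        for (int i = 0; i < samples; ++i)
        {
            // Two back-to-back timestamps; the diff reflects call overhead plus noise.
            var before_ticks = Stopwatch.GetTimestamp();
            var after_ticks = Stopwatch.GetTimestamp();
            diffs[i] = after_ticks - before_ticks;
        }
        Array.Sort(diffs);
        double nsPerTick = 1e9 / Stopwatch.Frequency;
        Console.WriteLine($"min:    {diffs[0]} ticks ({diffs[0] * nsPerTick:F0} ns)");
        Console.WriteLine($"median: {diffs[samples / 2]} ticks");
        Console.WriteLine($"max:    {diffs[samples - 1]} ticks ({diffs[samples - 1] * nsPerTick:F0} ns)");
    }
}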

So how could one experimentally measure the observable precision of the Stopwatch, in a way that covers all the possible timer modes underneath?

My idea would be to search for the minimum number of ticks != 0, in order to first establish the overhead of the Stopwatch in ticks. For the system timer this would be 0 until e.g. 10 ms have elapsed, which is 10 * 1000 * 10 = 100,000 ticks (since the system timer has a tick resolution of 100 ns), but the precision is far coarser than this. For the HPET it will never be 0, since the overhead of calling Stopwatch.GetTimestamp() is higher than the tick period of the timer.

But this says nothing about how precisely we can measure using the timer. My definition of precision would be: how small a difference in ticks can we measure reliably?

The search could be performed by measuring with different numbers of iterations, à la:

var before = Stopwatch.GetTimestamp();
for (int i = 0; i < iterations; ++i)
{
    action(); // Calling a no-op delegate Action since this cannot be inlined
}
var after = Stopwatch.GetTimestamp();

First, a lower bound could be found: the smallest number of iterations for which all of, say, 10 measurements yield a non-zero number of ticks; save these measurements in long ticksLower[10]. Then, the closest possible number of iterations could be found that yields a tick difference that is always higher than any of the first 10 measurements; save these in long ticksUpper[10].

The worst-case precision would then be the highest tick count in ticksUpper minus the lowest tick count in ticksLower.
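A minimal sketch of that search could look like the following. Two simplifications to flag: the helper names (MeasureTicks, FindIterations) are made up for illustration, and a doubling search over iteration counts is used instead of finding the exact closest count (a binary search between the last two counts tried would tighten it):

using System;
using System.Diagnostics;
using System.Linq;

class PrecisionSearch
{
    // Time `iterations` calls of a delegate, exactly as in the loop above.
    static long MeasureTicks(Action action, int iterations)
    {
        var before = Stopwatch.GetTimestamp();
        for (int i = 0; i < iterations; ++i)
        {
            action(); // no-op delegate Action, cannot be inlined
        }
        var after = Stopwatch.GetTimestamp();
        return after - before;
    }

    // Double the iteration count until all 10 repeat measurements satisfy `accept`.
    static long[] FindIterations(Action action, Func<long, bool> accept)
    {
        for (int iterations = 1; ; iterations *= 2)
        {
            var ticks = new long[10];
            bool allAccepted = true;
            for (int r = 0; r < 10; ++r)
            {
                ticks[r] = MeasureTicks(action, iterations);
                allAccepted &= accept(ticks[r]);
            }
            if (allAccepted)
                return ticks;
        }
    }

    static void Main()
    {
        Action noop = () => { };
        long[] ticksLower = FindIterations(noop, t => t != 0);
        long maxLower = ticksLower.Max();
        long[] ticksUpper = FindIterations(noop, t => t > maxLower);
        long worstCasePrecision = ticksUpper.Max() - ticksLower.Min();
        Console.WriteLine($"worst-case precision: {worstCasePrecision} ticks");
    }
}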

Does this sound reasonable?

Why do I want to know the observable precision of the Stopwatch? Because it can be used to determine how long you would need to measure for to get a certain level of precision in micro-benchmarking measurements. I.e., for 3-digit precision the measured length should be >1000 times the precision of the timer. Of course, one would measure multiple times at this length.
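As a worked example of that rule of thumb (the precision value below is an assumed stand-in for whatever the search above actually reports on a given machine):

using System;
using System.Diagnostics;

class RequiredLength
{
    static void Main()
    {
        // Illustrative arithmetic only; observedPrecisionTicks is an assumed value,
        // not a measured one.
        long observedPrecisionTicks = 100;
        double nsPerTick = 1e9 / Stopwatch.Frequency;
        double minMeasurementNs = 1000 * observedPrecisionTicks * nsPerTick;
        Console.WriteLine($"For 3-digit precision, measure >= {minMeasurementNs / 1e6:F2} ms per run");
    }
}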

Answer 1:

The Stopwatch class exposes a Frequency property that is the direct result of calling SafeNativeMethods.QueryPerformanceFrequency. Here is an excerpt from the property's documentation page:

The Frequency value depends on the resolution of the underlying timing mechanism. If the installed hardware and operating system support a high-resolution performance counter, then the Frequency value reflects the frequency of that counter. Otherwise, the Frequency value is based on the system timer frequency.
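For completeness, a quick way to see what Frequency reports on a given machine (Stopwatch.IsHighResolution is the companion field that tells you whether the high-resolution counter is in use):

using System;
using System.Diagnostics;

class FrequencyInfo
{
    static void Main()
    {
        Console.WriteLine($"IsHighResolution: {Stopwatch.IsHighResolution}");
        Console.WriteLine($"Frequency:        {Stopwatch.Frequency} ticks/s");
        Console.WriteLine($"Tick duration:    {1e9 / Stopwatch.Frequency:F1} ns");
    }
}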