In our application, we use Windows Performance Counters to store some of our application metrics, which are later retrieved by some web services.
I'm having an issue with how long it takes to read values from the counters. I've looked through the rest of my app and everything is fine, performance-wise, but reading the counters within a loop (over a List or array) takes a painful amount of time.
Example code:
// Prime each counter: the first NextValue() call returns the counter's initial
// value (calculated counters need a second read after a ~1000ms delay)
counters.ToList().ForEach(counter => counter.NextValue());
In my testing of just the loop above, a list of 1,359 counters takes 20 seconds. With a stopwatch in place, the read times look bimodal: a lot of reads take 0-10ms (many of them 0ms), while the rest average about 80-90ms, the slowest being around 170ms.
Maybe I am being too optimistic, but I would have figured that reading 1,000 numeric values should take only a few milliseconds. Is there a lot more processing going on here than I'm aware of?
I actually have another loop later in my logic that gets a second value for calculated counters. That just makes this doubly worse. :)
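For context, here's a minimal sketch of that second pass (assuming System.Linq and the same counters collection as above; the real code does more with the values):

// Wait out the sample interval, then take the second reading that
// calculated counters (rates, averages) need to produce a real value.
System.Threading.Thread.Sleep(1000);
var secondValues = counters.Select(c => c.NextValue()).ToList();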
Thanks!
Update 1
I wrapped the counter retrieval in a stopwatch, and I'm surprised by the results. Even reading the simple .RawValue property takes an excessive amount of time. My understanding is that all counters basically work the same way and retrieval should be incredibly fast; strangely, I'm also seeing a pattern where counters in the Networking categories take longer.
According to http://joe.blog.freemansoft.com/2014/03/windows-performance-counters.html, the performance of the Performance Counter system shouldn't even be a consideration.
I've posted some stopwatch results to the following pastebin: http://pastebin.com/raw.php?i=aDJk2Tru
My code is as follows:
foreach (var c in counters)
{
    // Time only the RawValue read; ElapsedMilliseconds is captured
    // immediately after the property is accessed.
    var t = Stopwatch.StartNew();
    var r = c.RawValue;
    Debug.WriteLine(t.ElapsedMilliseconds.ToString("000") + " - " + c.CategoryName + ":" + c.CounterName + "(" + c.CounterType + ") = " + r);
}
As you can see in the paste, a lot of the reads are 0ms, but plenty are in the 50-100ms range. I don't really understand how that can be; surely one counter value should be just as fast to read as any other, right?
Here's what I've been able to find out about the counters. Please forgive the grammar; much of this was extracted from an email I sent out regarding this problem.
- There is a 4-5 second processing time, on my machine at least (it may be better or worse on the server; I'm not sure), to read the instance names from a counter category. This varies negligibly with the number of counters in the category. If you are not using instance counters, you can avoid this cost entirely (see the sketch after this list).
- We store all of our counters in a single category, so given our situation, that category will inevitably end up with thousands of counters. In my testing, the more counters in a category, the worse the performance. Total time growing with counter count makes sense, but the read time of an individual counter is also affected by the number of counters currently in memory, which is an odd correlation:
- With 8 total counters, read time is about 1-2ms per counter
- With 256 total counters, read time is about 15-18ms per counter
- With 512 total counters, read time is about 30ms per counter
- With 3,584 total counters (reading all counters), read time is about 200ms per counter
- With 3,584 total counters in the system, filtered down in memory so that only 512 are read, read time is anywhere from 50-90ms per counter. I'm not sure why these are slower than the previous batch of 512 counters.
- I ran each of these tests a few times, using System.Diagnostics.Stopwatch to time them.
- It's important to note that counters have to be read twice, because many counters are calculated over a span of time and present an average between the first and second reads, so these bad numbers are effectively doubled in a real-world scenario.
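For reference, here's a minimal sketch of timing that instance-name query (the category name is a placeholder, not our real one):

using System;
using System.Diagnostics;

class InstanceNameTiming
{
    static void Main()
    {
        // "MyAppCounters" is a placeholder category name.
        var category = new PerformanceCounterCategory("MyAppCounters");

        // On my machine this single call takes 4-5 seconds, largely
        // independent of how many counters the category contains.
        var sw = Stopwatch.StartNew();
        string[] instances = category.GetInstanceNames();
        sw.Stop();

        Console.WriteLine(instances.Length + " instances in " + sw.ElapsedMilliseconds + " ms");
    }
}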
Given the numbers above, on my machine: 512 counters at roughly 50ms each on the slower end is about 25 seconds; doubled for the second counter read, that's about 50 seconds; add the 4-5 second instance query and we're looking at roughly 60 seconds per request. And this is with only 512 counters at a time. I've run the full query against the service on my machine several times, and the request consistently completes in 60-65 seconds.
I certainly would not have expected this kind of performance degradation of single counters based on the number of other counters being read. Everything I've read says the Windows Performance Monitor system is supposed to be fast, and with small collections it certainly is. It's possible that our use case is simply not a good fit, and we may be abusing the system.
Update 2
Given that we have control over how we create counters, we decided to change our approach a bit. Instead of a few categories with many counters each, we now create many categories, each with only a few counters (4-8 per category). This approach has let us avoid the performance issue entirely, and counter read times are in the 0-1ms range. In our experience so far, even 100 new categories with a few counters each does not affect system performance at all.
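Roughly, the category creation now looks like this (the names, help text, and counter type here are illustrative, not our actual values):

using System.Diagnostics;

static void EnsureCategory(string categoryName, string[] counterNames)
{
    // Categories are machine-wide and persist, so only create them once
    // (note that creating categories requires administrative rights).
    if (PerformanceCounterCategory.Exists(categoryName))
        return;

    var creationData = new CounterCreationDataCollection();
    foreach (var name in counterNames)
    {
        creationData.Add(new CounterCreationData(
            name, "Application metric", PerformanceCounterType.NumberOfItems64));
    }

    // Each category holds only 4-8 counters, which keeps reads at 0-1ms.
    PerformanceCounterCategory.Create(
        categoryName,
        "Application metrics",
        PerformanceCounterCategoryType.SingleInstance,
        creationData);
}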
It's important to note that when dealing with a large number of additional counters, you will need to raise the memory limit that is set by default for performance counters. This can be done either via machine.config or a registry entry. More information can be found here: http://msdn.microsoft.com/en-us/library/ms229387(v=vs.110).aspx
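For the machine.config route, the setting looks something like this (per the MSDN page above; I believe the default file mapping size is 524288 bytes, so it has to grow with the number of counters):

<configuration>
  <system.diagnostics>
    <!-- Shared-memory size, in bytes, for performance counter data. -->
    <performanceCounters filemappingsize="1048576" />
  </system.diagnostics>
</configuration>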