I am fairly new to programming and with my limited knowledge of OOP have decided to use events to communicate between my classes. Naturally, this will lead to quite a few events.
I wanted to know if there is any additional overhead to using events. I would assume that unless an event is acted upon (i.e., there is a listener in a class that executes a function based on the event being fired), there shouldn't really be much of an impact. But I am not intimately familiar with events in C# and wanted to confirm whether there is significant extra overhead simply from firing off an event.
Yes, there is overhead, and yes, it can be significant. This is not hard to prove, either. An event is backed by a multicast delegate, and a delegate invocation, like an ordinary method call, blocks until every attached handler has finished.
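To make the blocking behavior concrete, here is a minimal sketch (the names are mine, not from the question): an event with two subscribers, raised once. Both handlers run synchronously, in subscription order, before the raising call returns.

```csharp
using System;

var log = new System.Collections.Generic.List<string>();

var publisher = new Publisher();
publisher.Something += (s, e) => log.Add("first handler");
publisher.Something += (s, e) => log.Add("second handler");

publisher.Raise();          // blocks until BOTH handlers have finished
log.Add("after Raise");     // so this always logs last

Console.WriteLine(string.Join(", ", log));
// first handler, second handler, after Raise

class Publisher
{
    // An event is backed by a MulticastDelegate; += appends to its invocation list.
    public event EventHandler? Something;

    public void Raise() => Something?.Invoke(this, EventArgs.Empty);
}
```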
These are actual timings comparing the two for doing the same amount of work:

Using events: time to complete 271271.687 milliseconds = 271.271687 seconds
Not using events: time to complete 123214.514 milliseconds = 123.214514 seconds
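For reference, a comparison like the one above can be sketched with Stopwatch (this is my own illustration, not the benchmark that produced those timings, and absolute numbers will vary by machine and workload):

```csharp
using System;
using System.Diagnostics;

const int Iterations = 10_000_000;
long directSum = 0, eventSum = 0;

// Direct method call path.
void Work(int i) => directSum += i;

// Event path: same work, but reached through a delegate invocation.
var counter = new Counter();
counter.Ticked += i => eventSum += i;   // one subscriber

var sw = Stopwatch.StartNew();
for (int i = 0; i < Iterations; i++) Work(i);
sw.Stop();
Console.WriteLine($"Direct calls: {sw.ElapsedMilliseconds} ms");

sw.Restart();
for (int i = 0; i < Iterations; i++) counter.Tick(i);
sw.Stop();
Console.WriteLine($"Event raises: {sw.ElapsedMilliseconds} ms");

class Counter
{
    public event Action<int>? Ticked;
    public void Tick(int i) => Ticked?.Invoke(i);
}
```

Both paths do identical arithmetic, so any timing difference comes from the delegate indirection alone.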
Use events where events are the best "fit".
In general, this is true. The standard pattern for raising an event simply checks the invocation list for subscribers and returns immediately if there are none; that check is very cheap and is unlikely to be a performance issue in most cases.
Even when there are subscribers, the overhead of using events is still fairly minor: raising the event calls the subscribers' handlers through a delegate invocation, which is relatively fast (though slower than a direct method call). As long as you're not doing this in very tight loops, it's likely to be insignificant.
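The "check for subscribers, then invoke" pattern described above is typically written with the null-conditional operator. A small sketch (Sensor, Changed, and Update are illustrative names of my own):

```csharp
using System;

int calls = 0;
var sensor = new Sensor();

sensor.Update();                      // no subscribers: Changed is null, Invoke is skipped
sensor.Changed += (s, e) => calls++;
sensor.Update();                      // one subscriber: the handler runs
Console.WriteLine(calls);             // 1

class Sensor
{
    public event EventHandler? Changed;

    public void Update()
    {
        // Copy to a local so the last subscriber can't unsubscribe
        // between the null check and the invocation.
        EventHandler? handler = Changed;
        handler?.Invoke(this, EventArgs.Empty);
    }
}
```

With no subscribers, raising the event costs little more than a null check, which is why the no-listener case has almost no impact.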
That being said, this, like any performance question, boils down to measurement. Without actually measuring in your case, there's no way to know how significant the actual usage would be in practice. In order to know for certain whether this is a reasonable design choice in your specific case, you'd need to profile the usage of the events to make sure that it is not a significant issue.
Well, yes - you have a MulticastDelegate involved, code checking whether an event handler is actually attached, and so on.

Ahh - the real question: there is overhead, but is it significant? That can only be answered by measuring it.
My guess is that any overhead you experience won't be significant (otherwise there would be warnings about using events in performance-critical applications, which I have not seen) and that there are other parts to your application that have a bigger effect on performance.
I'm not sure if anyone has quantified the overhead, but it's probably quite small for most purposes. If you're new to programming, then you're probably not aiming to write ultra high-performance code at first (and if you were, you probably wouldn't be using C#, right?)
One thing to beware of is the scope of a published event. You can end up in a situation where there are dozens of subscribers to a given event, but only one or two of them actually care about a given instance of the event, and this can add significant overhead. In that context, it may be worthwhile looking into the System.IObservable&lt;T&gt; reactive-programming paradigm, which can limit over-broadcasting by letting you invoke handlers only on the subscribers that actually care about a given event.

In case anyone stumbles upon this question so many years later: I've used the BenchmarkDotNet framework to measure the time an event takes to invoke, comparing 1 subscriber against 100 subscribers.
Test code used:

Test results:
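The original benchmark code and result tables were not preserved here. Purely as an illustration of the 1-versus-100-subscribers comparison (my own sketch, using Stopwatch rather than BenchmarkDotNet so it runs with the standard library alone; it does not reproduce the original measurements):

```csharp
using System;
using System.Diagnostics;

const int Iterations = 100_000;
long raises = 0;   // counts total handler executions across both runs

long MeasureMs(int subscriberCount)
{
    var pub = new Publisher();
    for (int i = 0; i < subscriberCount; i++)
        pub.Fired += () => raises++;   // trivial handlers

    var sw = Stopwatch.StartNew();
    for (int i = 0; i < Iterations; i++)
        pub.Raise();                   // walks the whole invocation list each time
    sw.Stop();
    return sw.ElapsedMilliseconds;
}

Console.WriteLine($"1 subscriber:    {MeasureMs(1)} ms");
Console.WriteLine($"100 subscribers: {MeasureMs(100)} ms");

class Publisher
{
    public event Action? Fired;
    public void Raise() => Fired?.Invoke();
}
```

Since every raise invokes the entire invocation list, the cost of an event grows roughly linearly with the number of subscribers.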