I've heard the advice that you should avoid try/catch blocks if possible, since they're expensive.
My question is specifically about the .NET platform: Why are try blocks expensive?
Summary of Responses:
There are clearly two camps on this issue: those who say that try blocks are expensive, and those who say "maybe a tiny little bit".
Those who say try blocks are expensive usually mention the "high cost" of unwinding the call stack. Personally, I'm not convinced by that argument, especially after reading about how exception handlers are stored here.
Jon Skeet sits in the "maybe a tiny little bit" camp, and has written two articles on exceptions and performance, which you can find here.
There was one article that I found extremely interesting: it talked about "other" performance implications of try blocks (not necessarily memory or CPU consumption). Peter Ritchie found that code inside try blocks is not optimized as it otherwise would be by the compiler. You can read about his findings here.
Finally, there's a blog entry about the issue from the man who implemented exceptions in the CLR. Go take a look at Chris Brumme's article here.
Every try block needs to record a lot of information, e.g. stack pointers, CPU register values, and so on, so that it can unwind the stack and restore the state it was in when it entered the try block if an exception is thrown. And it's not just that every try records a lot of information: when an exception is thrown, a lot of values need to be restored. So a try is very expensive, and a throw/catch is very expensive, too.
That doesn't mean you shouldn't use exceptions. However, in performance-critical code you should probably not use too many try blocks and should not throw exceptions too often.
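A minimal sketch of what that means in practice, using int.TryParse as an (assumed) example of a non-throwing alternative in a hot loop:

```csharp
using System;

class ParsingExample
{
    static void Main()
    {
        string[] inputs = { "42", "not a number", "7" };
        int total = 0;

        foreach (string input in inputs)
        {
            // In a hot loop, int.Parse would throw (and we would catch)
            // a FormatException for every invalid value, which is the
            // expensive path. TryParse reports failure via its return
            // value instead, so nothing is thrown.
            int value;
            if (int.TryParse(input, out value))
            {
                total += value;
            }
        }

        Console.WriteLine(total); // prints 49
    }
}
```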
This is not something I would ever worry about. I care more about the clarity and safety of a try...finally block than about how "expensive" it is.
I personally don't use a 286, and neither does anyone else running .NET or Java. Move on. Worry about writing good code that benefits your users and other developers, instead of worrying about the underlying framework, which works fine for 99.999999% of the people using it.
This is probably not very helpful, and I don't mean to be scathing, just to offer some perspective.
I think people really overestimate the performance cost of throwing exceptions. Yes, there's a performance hit, but it's relatively tiny.
I ran the following test, throwing and catching a million exceptions. It took about 20 seconds on my Intel Core 2 Duo, 2.8 GHz. That's about 50K exceptions a second. If you're throwing even a small fraction of that, you've got some architecture problems.
Here's my code:
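A minimal sketch of that kind of benchmark, assuming a plain throw/catch loop timed with a Stopwatch (the exception type and message are placeholders):

```csharp
using System;
using System.Diagnostics;

class ExceptionBenchmark
{
    static void Main()
    {
        const int iterations = 1000000;
        Stopwatch stopwatch = Stopwatch.StartNew();

        for (int i = 0; i < iterations; i++)
        {
            try
            {
                // ApplicationException is just a placeholder type here.
                throw new ApplicationException("benchmark exception");
            }
            catch (ApplicationException)
            {
                // Swallow it; only the throw/catch cost is being measured.
            }
        }

        stopwatch.Stop();
        Console.WriteLine("{0} exceptions in {1} ms",
            iterations, stopwatch.ElapsedMilliseconds);
    }
}
```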
I doubt that they are particularly expensive. A lot of the time, they are simply necessary.
I do strongly recommend using them only where necessary and at the right level of nesting, rather than catching and rethrowing the exception at every level up the call chain.
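A sketch of what I mean by the right level of nesting (the file-loading scenario and method names are purely illustrative):

```csharp
using System;
using System.IO;

class LayeredExample
{
    // The lower layers don't catch just to rethrow; they let the
    // IOException propagate to a level that can actually handle it.
    static string LoadConfig(string path)
    {
        return File.ReadAllText(path);    // may throw IOException
    }

    static void Initialize(string path)
    {
        string config = LoadConfig(path); // no try/catch at this level
        Console.WriteLine("Loaded {0} characters", config.Length);
    }

    static void Main()
    {
        // One handler at the right level of nesting, instead of a
        // catch-and-rethrow at every call in between.
        try
        {
            Initialize("settings.txt");
        }
        catch (IOException ex)
        {
            Console.WriteLine("Could not read configuration: " + ex.Message);
        }
    }
}
```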
I would imagine the main reason for the advice is to say that you shouldn't be using try/catch where an if/else would be a better approach.
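For instance, guarding a division with a simple check rather than catching DivideByZeroException (a made-up but representative case):

```csharp
using System;

class GuardExample
{
    static void Main()
    {
        int total = 100;
        int count = 0;

        // Exception-driven version: works, but uses try/catch where a
        // simple test would do.
        // int average;
        // try { average = total / count; }
        // catch (DivideByZeroException) { average = 0; }

        // if/else version: the condition is checked up front, so the
        // exception machinery is never involved at all.
        int average = (count != 0) ? total / count : 0;

        Console.WriteLine(average); // prints 0
    }
}
```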
It's not the block itself that's expensive, and it's not even catching an exception, per se, that's expensive; it's the runtime unwinding the call stack until it finds a stack frame that can handle the exception. Throwing an exception is pretty lightweight, but if the runtime has to walk up six stack frames (i.e. six method calls deep) to find an appropriate exception handler, possibly executing finally blocks as it goes, you may see a noticeable amount of time pass.
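A sketch of that situation: the throw happens three calls deep, and the runtime runs each intervening finally block on its way back up to the handler:

```csharp
using System;

class UnwindExample
{
    static void Level3()
    {
        try
        {
            throw new InvalidOperationException("thrown three frames down");
        }
        finally
        {
            Console.WriteLine("finally in Level3"); // runs while unwinding
        }
    }

    static void Level2()
    {
        try { Level3(); }
        finally { Console.WriteLine("finally in Level2"); }
    }

    static void Level1()
    {
        try { Level2(); }
        finally { Console.WriteLine("finally in Level1"); }
    }

    static void Main()
    {
        try
        {
            Level1();
        }
        catch (InvalidOperationException ex)
        {
            // The handler lives here, so the runtime walks back up through
            // Level3, Level2 and Level1 (running each finally) to reach it.
            Console.WriteLine("caught: " + ex.Message);
        }
    }
}
```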
The compiler emits more IL when you wrap code inside a try/catch block. Look at the following program:
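A minimal program along those lines (the exact statements are an assumption):

```csharp
using System;

class Program
{
    static void Main(string[] args)
    {
        int i = 5;
        Console.WriteLine(i);
    }
}
```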
For that version, the compiler emits a short, straightforward method body: load the constant, store the local, call Console.WriteLine, return. For the slightly modified version shown below, with the same statements wrapped in a try/catch, it emits noticeably more IL: the body sits inside a protected region with a catch handler attached, the normal exit becomes a leave instruction that jumps past the handler, and in Debug builds extra nop instructions appear around the block boundaries.
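Again a sketch, with the same assumed body wrapped in a try/catch:

```csharp
using System;

class Program
{
    static void Main(string[] args)
    {
        try
        {
            int i = 5;
            Console.WriteLine(i);
        }
        catch (Exception ex)
        {
            Console.WriteLine(ex);
        }
    }
}
```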
All these nops and other extra instructions have a cost.