I just came across a post about measuring execution time. If I remember correctly, the comparison is unfair if the method has never been called before. That is:
// At the beginning of the application
MyClass instance = new MyClass();
instance.MyMethod();
instance.MyMethod(); // Faster than the first call, because now it's warmed up.
Does C# really have such a warm-up effect? If so, why (what does the CLR do during warm-up)? And is the behavior the same if the method is an extension method (i.e., a static one)?
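For reference, here is a minimal benchmark sketch of what I mean (using Stopwatch; the exact tick counts will vary by machine, so I print them rather than assert anything about them):

```csharp
using System;
using System.Diagnostics;

// Trivial method; on its first invocation the JIT compiles it to native code.
static int MyMethod() => 42;

var sw = Stopwatch.StartNew();
MyMethod();                        // first call: includes JIT compilation
long firstTicks = sw.ElapsedTicks;

sw.Restart();
MyMethod();                        // second call: runs the cached native code
long secondTicks = sw.ElapsedTicks;

Console.WriteLine($"first: {firstTicks} ticks, second: {secondTicks} ticks");
```

On my understanding, the first measurement should usually be larger because it includes the one-time compilation cost.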
If by "warm up" you mean JIT'ing, then yes: if a method has never been called, it hasn't been compiled yet, so the very first call may be slower.
Also refer to Does the .NET CLR JIT compile every method, every time?
This is due to just-in-time (JIT) compilation. If you want to avoid the first-call overhead, the Native Image Generator (Ngen.exe) can pre-compile the assembly to native code ahead of time.
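A rough sketch of how Ngen.exe is typically invoked (this assumes .NET Framework on Windows and an elevated Developer Command Prompt; "MyApp.exe" is a placeholder path, not from the question):

```shell
# Pre-compile the assembly's CIL to a native image so first calls
# pay no JIT cost at run time.
ngen install MyApp.exe

# Show the native images generated for the assembly.
ngen display MyApp.exe
```

Note that Ngen trades the JIT cost for install-time compilation; it doesn't change what the code does.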
What people are talking about here is just-in-time compilation. The code you write in .NET is compiled to the Common Intermediate Language (CIL), which is platform independent. When the application runs, parts of the CIL are compiled to platform-specific instructions, which takes a bit of time the first time around. The native code is then cached, so subsequent calls to the method don't incur this cost.
If you really want to, you can pre-compile to platform-specific native images instead.
The method needs to be JIT-compiled on first use, which is why the first call takes longer.
From Compiling MSIL to Native Code:
"On the initial call to the method, the stub passes control to the JIT compiler, which converts the MSIL for that method into native code and modifies the stub to direct execution to the location of the native code. Subsequent calls of the JIT-compiled method proceed directly to the native code that was previously generated, reducing the time it takes to JIT-compile and run the code."
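If you want to force this stub replacement up front rather than on the first call, a sketch using RuntimeHelpers.PrepareMethod is shown below (I'm using string.Trim as an arbitrary example method; any method you can reflect on works the same way):

```csharp
using System;
using System.Reflection;
using System.Runtime.CompilerServices;

// Ask the JIT to compile string.Trim() now, so its stub already points
// at native code before the method is ever called.
MethodInfo trim = typeof(string).GetMethod("Trim", Type.EmptyTypes);
RuntimeHelpers.PrepareMethod(trim.MethodHandle);

// This first call no longer pays the JIT cost for Trim.
string result = "  hello  ".Trim();
Console.WriteLine(result);   // prints "hello"
```

This is essentially a per-method version of what Ngen.exe does for a whole assembly.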