Many years ago, I was admonished to release resources, whenever possible, in the reverse order of their allocation. That is:
void *block1 = malloc( ... );
void *block2 = malloc( ... );

/* ... do stuff ... */

free( block2 );   /* release in reverse order of allocation */
free( block1 );
I imagine that on a 640K MS-DOS machine this could minimize heap fragmentation. Is there any practical advantage to doing this in a C#/.NET application, or is this a habit that has outlived its relevance?
If your resources are created well, this shouldn't matter (much).
However, many poorly written libraries don't do proper checking. Disposing of resources in reverse order of their allocation typically means you dispose of dependent resources before the resources they depend on, which can keep poorly written libraries from causing problems: you never dispose of a resource and then use another one that depends on its existence.
It's also good practice in general, since you won't accidentally dispose of a resource that some other object still needs.
Here's an example: consider a database operation. You don't want to close/dispose your connection before closing/disposing your command (which uses that connection).
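For instance, with nested usings (a minimal sketch assuming the System.Data.SqlClient provider; the connection string is a placeholder), the command is disposed before the connection it depends on:

using System.Data.SqlClient;

using (var connection = new SqlConnection("<your connection string>"))
using (var command = new SqlCommand("SELECT 1", connection))
{
    connection.Open();
    command.ExecuteScalar();
}   // command is disposed here first, then connection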
Don't bother. The garbage collector reserves the right to compact the heap and move objects around, so there's no telling what order things are in.
In addition, if you're disposing of A and B, and A references B, it shouldn't matter whether A disposes of B when you dispose of A, since Dispose should be callable more than once without throwing an exception.
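Here is a minimal sketch of that idempotent Dispose (the guard-flag pattern; the class and field names are illustrative):

using System;

public sealed class Owner : IDisposable   // "A", which owns "B"
{
    private readonly IDisposable _owned;  // "B"
    private bool _disposed;

    public Owner(IDisposable owned) => _owned = owned;

    public void Dispose()
    {
        if (_disposed) return;  // second and later calls are harmless no-ops
        _disposed = true;
        _owned.Dispose();       // safe even if the caller also disposes B,
                                // as long as B follows the same pattern
    }
}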
If you are referring to the time the destructor (finalizer) on the objects gets called, that's up to the garbage collector; the programmer has very little influence over it, and it is explicitly non-deterministic according to the language definition.
If you are referring to calling IDisposable.Dispose(), then that depends on the behavior of the objects that implement the IDisposable interface.
In general, the order doesn't matter for most Framework objects, except to the extent that it matters to the calling code. But if object A maintains a dependency on object B, and object B is disposed, then it may well be important not to do certain things with object A afterwards.
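To make that split concrete, here is a sketch of the standard dispose pattern (the inner member and native handle are illustrative assumptions): the deterministic Dispose() path may touch managed members, while the finalizer path must not, because finalization order is not guaranteed.

using System;
using System.Runtime.InteropServices;

public class Resource : IDisposable
{
    private readonly IDisposable _inner;   // a managed member (illustrative)
    private IntPtr _nativeHandle;          // an unmanaged handle (illustrative)
    private bool _disposed;

    public Resource(IDisposable inner)
    {
        _inner = inner;
        _nativeHandle = Marshal.AllocHGlobal(64);
    }

    public void Dispose()
    {
        Dispose(true);
        GC.SuppressFinalize(this);  // deterministic cleanup done; skip the finalizer
    }

    protected virtual void Dispose(bool disposing)
    {
        if (_disposed) return;
        if (disposing)
        {
            // Called via Dispose(): managed members are still safe to touch.
            _inner?.Dispose();
        }
        // Reached from either path: release unmanaged resources only.
        // On the finalizer path, _inner may already have been finalized,
        // so we deliberately leave it alone.
        if (_nativeHandle != IntPtr.Zero)
        {
            Marshal.FreeHGlobal(_nativeHandle);
            _nativeHandle = IntPtr.Zero;
        }
        _disposed = true;
    }

    ~Resource()
    {
        Dispose(false);
    }
}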
In most cases, Dispose() is not called directly; rather, it is called implicitly as part of a using or foreach statement, in which case the reverse-order pattern emerges naturally from the statement nesting.
using (Foo foo = new Foo())
using (FooDoodler fooDoodler = new FooDoodler(foo))
{
    // do stuff
    // ...
    // fooDoodler automatically gets disposed before foo at the end
    // of the using statement.
}
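The foreach case works the same way: the compiler wraps the loop in a try/finally and disposes the enumerator for you. A rough sketch of the expansion (not the compiler's exact output):

using System;
using System.Collections.Generic;

IEnumerable<int> numbers = new List<int> { 1, 2, 3 };

// foreach (int n in numbers) { Console.WriteLine(n); } expands to roughly:
IEnumerator<int> e = numbers.GetEnumerator();
try
{
    while (e.MoveNext())
    {
        int n = e.Current;
        Console.WriteLine(n);
    }
}
finally
{
    e.Dispose();  // the enumerator is disposed even if the loop body throws
}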
Nested 'usings' show that the 'outlived its relevance' idea doesn't really hold, and it rarely does (I won't go so far as to say never, even after 40 years of evidence). And that includes a stack-based VM running on, say, CMOS.
[ Despite some attempts by MSDN.com and Duffius to make it vanish with "we manage it all for you", the difference between the heap and the stack doesn't go away. What a smart idea... in space ]
"The runtime doesn't make any guarantees as to the order in which Finalize methods are called. For example, let's say there is an object that contains a pointer to an inner object. The garbage collector has detected that both objects are garbage. Furthermore, say that the inner object's Finalize method gets called first. Now, the outer object's Finalize method is allowed to access the inner object and call methods on it, but the inner object has been finalized and the results may be unpredictable. For this reason, it is strongly recommended that Finalize methods not access any inner, member objects."
http://msdn.microsoft.com/en-us/magazine/bb985010.aspx
So you can worry about your LIFO dispose semantics as much as you like, but if you leak one, the finalizers are going to run in whatever order the CLR fancies.
(This is more or less what Will said, above)