Is there any difference in building or running an application that targets .NET 4 depending on whether only the .NET 4 Framework or the .NET 4.5 Framework is installed?
A colleague of mine said that installing 4.5 makes a difference even if the application targets .NET 4, and I'm unsure.
.NET 4.5 is an "in-place upgrade" of .NET 4, so even if you target .NET 4 you will actually be running on the .NET 4.5 runtime and libraries; see http://www.hanselman.com/blog/NETVersioningAndMultiTargetingNET45IsAnInplaceUpgradeToNET40.aspx.
Summarising this blog, there are three major versions of .NET that can be installed side by side:
- .NET 1 (1.1)
- .NET 2 (2/3/3.5)
- .NET 4 (4/4.5)
The "minor" versions are in-place upgrades.
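Because 4.5 replaces 4.0 in place, you can't ask the CLR which one you're on directly; the documented way to tell is the "Release" DWORD in the registry, which only exists once 4.5 or later is present. A minimal C# sketch (378389 is the documented value for 4.5 RTM):

```csharp
using System;
using Microsoft.Win32;

internal static class NetReleaseCheck
{
    private static void Main()
    {
        // The "Release" value appears only when 4.5 (or later) has been
        // installed over 4.0; values of 378389 or higher mean 4.5+.
        using (RegistryKey key = Registry.LocalMachine.OpenSubKey(
            @"SOFTWARE\Microsoft\NET Framework Setup\NDP\v4\Full"))
        {
            object release = (key != null) ? key.GetValue("Release") : null;
            if (release != null && (int)release >= 378389)
                Console.WriteLine(".NET 4.5 or later is installed (in place of 4.0).");
            else
                Console.WriteLine("Only .NET 4.0 (or earlier) is installed.");
        }
    }
}
```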
I've had massive problems with HgLab (which is an on-premise ASP.NET application) targeting .NET 4.0 and being built on a server with .NET 4.5 installed.
I don't use anything fancy, only stable public APIs, and I explicitly target .NET 4.0. Yet it kept on crashing deep inside kernelbase.dll
on 64-bit systems, producing indecipherable crash dumps and generally behaving very weirdly.
What I ended up doing was enabling 32-bit application support in the 64-bit version of IIS - this seems to have solved the problem. But generally, if you want to really target .NET 4.0, do your builds on a build server with only .NET 4.0 installed.
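For reference, the 32-bit switch can be flipped from the command line with appcmd (assuming the default inetsrv location and an application pool named "DefaultAppPool" - substitute your own pool name):

```shell
%windir%\system32\inetsrv\appcmd set apppool "DefaultAppPool" /enable32BitAppOnWin64:true
```

The same setting is available in IIS Manager under the application pool's Advanced Settings as "Enable 32-Bit Applications".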
Yes, here's an example:
The following console app should crash with an UnobservedTaskException
on a system without .NET 4.5 or later installed, but it will loop forever if .NET 4.5 or later has been installed:
using System;
using System.Threading;
using System.Threading.Tasks;

internal static class Program
{
    private static void Main()
    {
        // Start a task that faults immediately; its exception is never observed.
        Task.Factory.StartNew(() => { throw new InvalidOperationException("Erk"); });
        while (true)
        {
            Thread.Sleep(100);
            // Force finalization of the faulted Task. On .NET 4.0 the
            // finalizer escalates the unobserved exception and crashes the
            // process; on .NET 4.5+ it is swallowed and the loop continues.
            GC.Collect();
            GC.WaitForPendingFinalizers();
        }
    }
}
You're supposed to be able to configure this behaviour in App.config
according to Microsoft, but it doesn't work for me at all.
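For completeness, the setting Microsoft documents for this is the `ThrowUnobservedTaskExceptions` runtime element, which is supposed to restore the .NET 4.0 crash-on-unobserved-exception behaviour on 4.5+ (though, as noted, it didn't work in my case):

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <runtime>
    <!-- Unobserved task exceptions terminate the process during
         finalization, as they did on .NET 4.0. -->
    <ThrowUnobservedTaskExceptions enabled="true"/>
  </runtime>
</configuration>
```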