Our application is built using WinForms on .NET 3.5.
This was tested specifically on Windows XP, although I am not sure whether the problem is OS-related.
Whenever I run it on a machine with only .NET 4 installed, it crashes, saying that it cannot find the version 3.5 .NET assemblies.
I am wondering why this fallback does not happen automatically. For example, some third-party libraries install a publisher policy of sorts: when a newer version of the library is installed, it is used even if your application was compiled against an older version.
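(For illustration, a binding redirect like the following in app.config has that effect for a single library; the assembly name, public key token, and version numbers here are made up:)

    <configuration>
      <runtime>
        <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
          <dependentAssembly>
            <!-- Hypothetical third-party assembly. -->
            <assemblyIdentity name="SomeVendor.Library" publicKeyToken="32ab4ba45e0a69a1" culture="neutral" />
            <!-- Redirect all older versions to the newly installed one. -->
            <bindingRedirect oldVersion="1.0.0.0-1.9.9.9" newVersion="2.0.0.0" />
          </dependentAssembly>
        </assemblyBinding>
      </runtime>
    </configuration>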
UPDATE: The exact error message is:
"Unable to find a version of the runtime to run this application".
My questions are:
- Why is this not the case with the .NET Framework itself?
- Is the solution to add the <supportedRuntime> element to the configuration file? Are there any other solutions?
.NET 4.0 introduces a new CLR, so simply having 4.0 installed doesn't help much when running 3.5 apps. This has already been mentioned here.
If your computer has no .NET 3.5 installed, there is no CLR to start for your app. .NET 4.0 is not used automatically because of potential compatibility issues. First test that your app does run under .NET 4.0, and then add the following section to your app.config to tell the CLR to prefer .NET 4.0 if it is present.
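Something along these lines should work; the CLR probes the supportedRuntime entries in order, and v2.0.50727 is the runtime version that .NET 3.5 applications target:

    <configuration>
      <startup>
        <!-- Prefer the .NET 4.0 CLR when it is installed... -->
        <supportedRuntime version="v4.0" />
        <!-- ...otherwise fall back to the 2.0 CLR, which .NET 3.5 runs on. -->
        <supportedRuntime version="v2.0.50727" />
      </startup>
    </configuration>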
If .NET 4 is not present, the CLR version against which your application was compiled is used as the fallback. If all of that fails, you get the message "Unable to find a version of the runtime to run this application".
In general, that should not be the case.
SUGGESTIONS:
1) Read these links:
http://msdn.microsoft.com/en-us/library/ff602939.aspx
http://msdn.microsoft.com/en-us/library/dd889541.aspx
http://msdn.microsoft.com/en-us/library/ee461502.aspx
2) See if you can reproduce the problem with a minimal .NET 3.5 WinForms app.
3) Cut/paste the exact error message, and any relevant code (e.g. from your standalone app).
4) Failing all else, consider adding useLegacyV2RuntimeActivationPolicy="true" to the <startup> element of your configuration file, as sketched below. I would not recommend this as anything but a temporary workaround.
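A minimal sketch of that configuration (the supportedRuntime entry here assumes you want the 4.0 runtime; the legacy activation policy makes the 4.0 CLR activate 2.0-era assemblies):

    <configuration>
      <startup useLegacyV2RuntimeActivationPolicy="true">
        <!-- Run on the .NET 4.0 CLR, loading legacy 2.0-era assemblies under the old policy. -->
        <supportedRuntime version="v4.0" />
      </startup>
    </configuration>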