In Visual Studio you can set the "Target framework" for your project.
It is more or less common knowledge that if you set the "Target framework" to (for example) .NET 4.5.2, the application won't run on a machine that has only .NET 4.5.1 installed.
First question: Is that really true? Second question: Are there any other effects of that setting?
In my company we are currently setting the minimum requirement for an application to .NET 4.5.2, and thus we are of course setting the "Target framework" accordingly. An internal library we are using is set to a "Target framework" of .NET 4.5. We were wondering whether that even makes a difference, or whether the library should also be set to .NET 4.5.2.
In my opinion it should not matter, but I didn't find any resources on that topic. What do you think?
The frameworks are designed to be backwards-compatible; if you have a program written in .NET 2.0, you can run it in the 4.0 runtime, because none of the frameworks ever remove functionality that a prior version had (which is why we still have the non-generic collections like ArrayList, even though they're deprecated in favor of generic collections). However, the reverse is not necessarily true; a 4.0 app is not guaranteed to run in 2.0, because it MAY take advantage of new features of the new runtime that are not available in prior versions. In any case, if you want your app to attempt to run on runtime versions it does not specifically target, you must specify that in the app.config using SupportedRuntime elements.
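As a rough illustration of what that looks like, here is a minimal app.config sketch; the sku string should match whichever framework version your project actually targets (4.5.2 is used here purely as an example):

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <startup>
    <!-- Declares that the app wants the CLR 4 runtime and targets .NET Framework 4.5.2.
         Additional supportedRuntime elements can list other runtimes the app is willing to run on. -->
    <supportedRuntime version="v4.0" sku=".NETFramework,Version=v4.5.2" />
  </startup>
</configuration>
```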
First question:
Short answer - yes.
Whenever the framework is updated, there is no guarantee what additions, bug fixes, or changes might have been made; it is the same principle as with any other software update. A good example of what I mean: imagine an application that changes its config settings by adding or removing some, and uses those settings in its code. If you then replace the updated config file with the previous version's, the application would break, right? It is the same principle here. If you try to run an application that requires a newer version of the .NET Framework than the one installed, some features that the application relies on may simply not be there. Hence .NET applications check for their minimum required .NET version and will not run on anything below it.
Second question:
If you target an earlier version of the framework, such as .NET 2.0, you won't get newer features such as LINQ, but if you want your application to run on almost any computer with .NET installed, this is the safest option.
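To make the LINQ point concrete, here is a hypothetical snippet that only compiles when the project targets .NET 3.5 or later, because System.Linq and its extension-method query operators do not exist under a .NET 2.0 target:

```csharp
using System;
using System.Linq;

class LinqDemo
{
    static void Main()
    {
        int[] numbers = { 5, 2, 8, 3, 9 };

        // System.Linq was introduced with .NET 3.5; a project targeting
        // .NET 2.0 cannot compile this query at all.
        var evens = numbers.Where(n => n % 2 == 0).OrderBy(n => n);

        foreach (int n in evens)
            Console.WriteLine(n);   // prints 2, then 8
    }
}
```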
Hope this helps!
First question: Is that really true?
It depends. If your application targets 4.5.2 but doesn't use anything that is in 4.5.2 and not in 4.5.1, then theoretically it will run fine on a machine with only 4.5.1 installed. (There may be a check that the program or its installer performs to verify that 4.5.2 is present, but you can disable this.)
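If you ever need to perform such a check yourself, the documented way from .NET 4.5 onwards is to read the Release DWORD under the NDP registry key. The sketch below is a hypothetical helper, assuming Microsoft's published minimum Release value for 4.5.2 (379893):

```csharp
using Microsoft.Win32;

static class FrameworkCheck
{
    // Minimum "Release" value corresponding to .NET Framework 4.5.2
    // (assumption based on Microsoft's version-to-Release table).
    const int Net452Release = 379893;

    // Returns true if .NET Framework 4.5.2 or later is installed.
    public static bool IsNet452OrLaterInstalled()
    {
        using (RegistryKey key = Registry.LocalMachine.OpenSubKey(
            @"SOFTWARE\Microsoft\NET Framework Setup\NDP\v4\Full"))
        {
            object release = key != null ? key.GetValue("Release") : null;
            return release is int && (int)release >= Net452Release;
        }
    }
}
```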
However, if your application uses APIs that are in 4.5.2 but NOT in 4.5.1, then it will not run on the 4.5.1 machine.
As a general rule, you should only set the target to the minimum version that is actually required. That is, if you don't use any 4.5.2-specific functions, don't target 4.5.2, as there is no need. It is better to target 4.5.1 if you can, since the application will then run fine on both 4.5.1 and 4.5.2 machines. The lower the target, the more machines it is likely to run on.
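For completeness, in a classic (non-SDK-style) .csproj the "Target framework" that Visual Studio shows is stored as the TargetFrameworkVersion MSBuild property, so lowering the target is a one-line edit (a sketch of the relevant fragment, not a full project file):

```xml
<PropertyGroup>
  <!-- Target the lowest framework version the code actually needs. -->
  <TargetFrameworkVersion>v4.5.1</TargetFrameworkVersion>
</PropertyGroup>
```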
The same rule goes for other frameworks as well: if your Android app only needs the level 18 API, then set the target (or at least the minimum target) to 18 rather than anything higher.