It is unclear to me how the compiler automatically knows to compile for 64-bit when it needs to. How does it know when it can safely target 32-bit?
I am mainly curious about how the compiler knows which architecture to target when compiling. Does it analyze the code and make a decision based on what it finds?
Microsoft has a blog entry What AnyCPU Really Means As Of .NET 4.5 and Visual Studio 11:
In .NET 4.5 and Visual Studio 11 the cheese has been moved. The
default for most .NET projects is again AnyCPU, but there is more than
one meaning to AnyCPU now. There is an additional sub-type of AnyCPU,
“Any CPU 32-bit preferred”, which is the new default (overall, there
are now five options for the /platform C# compiler switch: x86,
Itanium, x64, anycpu, and anycpu32bitpreferred). When using the "Prefer 32-Bit"
flavor of AnyCPU, the semantics are as follows:
- If the process runs on a 32-bit Windows system, it runs as a 32-bit process. IL is compiled to x86 machine code.
- If the process runs on a 64-bit Windows system, it runs as a 32-bit process. IL is compiled to x86 machine code.
- If the process runs on an ARM Windows system, it runs as a 32-bit process. IL is compiled to ARM machine code.
The difference, then, between “Any CPU 32-bit preferred” and “x86” is
only this: a .NET application compiled to x86 will fail to run on an
ARM Windows system, but an “Any CPU 32-bit preferred” application will
run successfully.
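In other words, the compiler does not inspect your code to pick an architecture; the choice is a build setting made up front, and the IL is only translated to native code at run time. As a minimal sketch, assuming a standard Visual Studio-generated project file, "Any CPU 32-bit preferred" corresponds to passing /platform:anycpu32bitpreferred to csc.exe, or to project properties along these lines:

```xml
<!-- Illustrative fragment only; check your own .csproj for the exact layout. -->
<PropertyGroup>
  <!-- "Any CPU" plus the "Prefer 32-bit" checkbox in the project's Build page -->
  <PlatformTarget>AnyCPU</PlatformTarget>
  <Prefer32Bit>true</Prefer32Bit>
</PropertyGroup>
```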
The reason for this default is memory: a 64-bit process generally uses more memory than a 32-bit one, so if your application is AnyCPU and does not need the larger address space, running it as a 32-bit process is the cheaper choice.
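If you want to see which way the runtime actually went, a small check at startup makes it visible. This is just a sketch using APIs available since .NET 4.0 (Environment.Is64BitProcess and Environment.Is64BitOperatingSystem): built as "Any CPU 32-bit preferred" and run on 64-bit Windows, it should report a 32-bit process on a 64-bit OS, whereas plain AnyCPU would report a 64-bit process.

```csharp
using System;

class BitnessCheck
{
    static void Main()
    {
        // True only if the JIT produced 64-bit code for this process.
        Console.WriteLine("64-bit process: {0}", Environment.Is64BitProcess);

        // True on any 64-bit Windows installation, even when this
        // process itself is running as 32-bit under WOW64.
        Console.WriteLine("64-bit OS:      {0}", Environment.Is64BitOperatingSystem);

        // Pointer size: 4 bytes in a 32-bit process, 8 in a 64-bit one.
        Console.WriteLine("IntPtr.Size:    {0}", IntPtr.Size);
    }
}
```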
In addition, the platform setting in Visual Studio determines which version of the CLR the application runs under:
Visual Studio installs the 32-bit version of the CLR on an x86 computer, and both the 32-bit version and the appropriate 64-bit version of the CLR on a 64-bit Windows computer. (Because Visual Studio is a 32-bit application, when it is installed on a 64-bit system, it runs under WOW64.)
Please refer to the article 64-bit Applications (MSDN).