I'm using Entity Framework in .NET Framework 4.0 in an application.
After upgrading the operating system to a 64-bit version, I noticed higher CPU usage in the application.
Compiling the application specifically for x86 (instead of Any CPU, as before) brought CPU usage back to roughly the same level as before the operating system upgrade.
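As a side note, the bitness the process actually ends up running under can be confirmed at runtime; a minimal check (using Environment.Is64BitProcess, available from .NET Framework 4.0) looks roughly like this:

    // Quick runtime check of the bitness the process actually runs under.
    // Environment.Is64BitProcess / Is64BitOperatingSystem exist from .NET 4.0.
    Console.WriteLine("64-bit process: " + Environment.Is64BitProcess);
    Console.WriteLine("64-bit OS: " + Environment.Is64BitOperatingSystem);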
I did some more detailed measuring using the code below.
Some warm-up code is run first so that the overhead of creating the context for the first time and running the query for the first time is not measured; those costs are not very interesting for my application, since it is a long-running application. (A rough sketch of the warm-up is shown after the measured loop.)
    var s = Stopwatch.StartNew();
    var name = "name";
    for (var i = 0; i < 100000; ++i) {
        // Create a fresh context and run the same simple query on every iteration.
        using (var context = new MyDatabaseEntities()) {
            var entity = context.MyEntities.FirstOrDefault(e => e.Name == name);
        }
    }
    s.Stop();
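The warm-up itself is not shown; it is roughly a single pass of the same pattern (a sketch, not the exact code used):

    // Rough sketch of the warm-up: create the context and run the query once
    // so that model/metadata loading and first-time query compilation are not measured.
    using (var context = new MyDatabaseEntities()) {
        var entity = context.MyEntities.FirstOrDefault(e => e.Name == "name");
    }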
The code above, compiled either for x86 or x64 (Any CPU gives the same results as x64), was run on a 64-bit Windows 7 machine, with the database running on a different machine.
The result is a performance difference of about 12% in x86's favor, i.e. if the x86 version runs 100 queries per second, the x64 version runs about 88 queries per second.
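The queries-per-second figures are derived from the stopwatch s above, roughly like this:

    // Throughput over the 100000 measured queries.
    var queriesPerSecond = 100000 / s.Elapsed.TotalSeconds;
    Console.WriteLine("Queries per second: " + queriesPerSecond);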
Is this an expected/normal performance difference between 32-bit and 64-bit on .NET?
Is there something I can do to get the same performance in the 64-bit version as in the 32-bit version?
In the above example, MyEntity is a very simple entity with just an Id and a Name.
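For context, the entity is roughly equivalent to the following (a sketch; the actual class is generated by the EF designer from the model):

    // Sketch of the entity used above; the real class is generated from the model.
    public class MyEntity {
        public int Id { get; set; }
        public string Name { get; set; }
    }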
Normally the application runs as a Windows service, but for these measurements it was run as a normal Windows (WPF) application.