I have 4GB of physical memory, but I get an OutOfMemoryException even though I am only creating about 1.5GB of objects. Any idea why? (At the same time, the Performance tab of Task Manager shows that memory is not fully occupied, and I can still type here, so the machine is not actually low on memory. I suspect I am hitting some other memory limitation.)
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;

namespace TestBigMemoryv1
{
    class MemoryHolderFoo
    {
        static Random seed = new Random();
        public Int32 holder1;
        public Int32 holder2;
        public Int64 holder3;

        public MemoryHolderFoo()
        {
            // Assign varying values so the fields are not trivially optimized away.
            // (Note: casting Random.NextDouble(), which returns a value in [0, 1),
            // to an integer always yields 0, so Next() is used instead.)
            holder1 = seed.Next();
            holder2 = seed.Next();
            holder3 = seed.Next();
        }
    }

    class Program
    {
        static int MemoryThreshold = 1500; // MB

        static void Main(string[] args)
        {
            int persize = 16; // assumed payload size per MemoryHolderFoo, in bytes
            int number = MemoryThreshold * 1000 * 1000 / persize;

            MemoryHolderFoo[] pool = new MemoryHolderFoo[number];
            for (int i = 0; i < number; i++)
            {
                pool[i] = new MemoryHolderFoo();
                if (i % 10000 == 0)
                {
                    Console.Write(".");
                }
            }
            return;
        }
    }
}
In a normal 32-bit Windows app, the process only has 2GB of addressable memory. This is independent of the amount of physical memory that is installed.
So 2GB is the ceiling, but roughly 1.5GB is the most you can actually allocate, because your code is not the only thing running in the process: the other ~0.5GB is taken by the CLR itself plus fragmentation of the address space.
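If you are not sure what kind of process you are actually running, a minimal sketch like the following (assuming .NET 4.0 or later for Environment.Is64BitProcess) will tell you the bitness and how much of the address space is already spoken for before your allocation even starts:

using System;
using System.Diagnostics;

class AddressSpaceCheck
{
    static void Main()
    {
        Console.WriteLine("64-bit process: {0}", Environment.Is64BitProcess);
        Console.WriteLine("Pointer size:   {0} bytes", IntPtr.Size);

        using (Process p = Process.GetCurrentProcess())
        {
            // Address space already reserved by the CLR, JIT, loaded DLLs, etc.
            Console.WriteLine("Virtual size:   {0:N0} MB", p.VirtualMemorySize64 / (1024 * 1024));
        }
    }
}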
Update: in .NET 4.5, in a 64-bit process, you can have large arrays if the gcAllowVeryLargeObjects setting is enabled:
On 64-bit platforms, enables arrays that are greater than 2 gigabytes (GB) in total size.
The maximum number of elements in an array is UInt32.MaxValue.
<configuration>
  <runtime>
    <gcAllowVeryLargeObjects enabled="true" />
  </runtime>
</configuration>
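With that setting in place, and the process actually running as 64-bit, a single allocation bigger than 2GB is expected to work; a minimal sketch (the 300 million figure is just an illustration):

// Assumes a 64-bit process on .NET 4.5+ with gcAllowVeryLargeObjects enabled.
// 300 million doubles is about 2.4 GB in one array, beyond the default 2 GB limit.
double[] big = new double[300000000];
double gb = (long)big.Length * sizeof(double) / (1024.0 * 1024 * 1024);
Console.WriteLine("Allocated {0:N0} elements (~{1:N1} GB)", big.Length, gb);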
Just in addition to the other points: if you want access to a dirty amount of memory, consider x64 - but be aware that the maximum size of a single object is still 2GB. And because references are larger on x64, you actually get a smaller maximum array/list length for reference types (see the rough calculation below). Of course, by the time you hit that limit you are probably doing things wrong anyway!
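That back-of-the-envelope calculation (the 2GB figure and per-element sizes here are approximations, not exact runtime limits):

// An array of reference types stores one reference per element, so under a
// ~2 GB per-object cap the maximum length depends on the pointer size.
const long maxObjectBytes = 2L * 1024 * 1024 * 1024;

long maxElementsX86 = maxObjectBytes / 4;  // 4-byte references -> roughly 536 million
long maxElementsX64 = maxObjectBytes / 8;  // 8-byte references -> roughly 268 million

Console.WriteLine("Reference-type array length, x86: ~{0:N0}", maxElementsX86);
Console.WriteLine("Reference-type array length, x64: ~{0:N0}", maxElementsX64);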
Other options:
(obviously both have a performance cost compared to in-process memory)
Update: In versions of .NET prior to 4.5, the maximum object size is 2GB. From 4.5 onwards you can allocate larger objects if gcAllowVeryLargeObjects is enabled. Note that the limit for string
is not affected, but "arrays" should cover "lists" too, since lists are backed by arrays.
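A minimal sketch of what that means in practice (the exact element count at which this fails depends on the runtime and platform, so treat the numbers as approximate):

// Fragment for a console app; needs using System; and using System.Collections.Generic;
// List<T> keeps its items in a single backing array, so without gcAllowVeryLargeObjects
// it hits the same per-object limit an array does: a List<double> is expected to throw
// OutOfMemoryException somewhere short of ~268 million elements, once the next
// (doubled) backing array would exceed 2 GB.
var list = new List<double>();
try
{
    while (true)
    {
        list.Add(1.0);
    }
}
catch (OutOfMemoryException)
{
    Console.WriteLine("Backing array limit hit at about {0:N0} elements", list.Count);
}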
Just to add to the previous replies: you can go beyond the 2GB limit on 32-bit systems booted with the /3GB (and optionally /USERVA) boot options, provided the executable is also marked as large-address-aware; that gives the process up to 3GB of user-mode address space.
Check that you are building a 64-bit process, and not a 32-bit one, which is the default compilation mode in Visual Studio. To do this, right-click on your project, then Properties -> Build -> Platform target: x64. Like any other 32-bit process, an application compiled as 32-bit has a virtual memory limit of 2GB.
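If you prefer editing the project file rather than the Properties dialog, the same setting corresponds to a PlatformTarget entry in the relevant PropertyGroup of the .csproj (shown here as an assumed snippet; adjust it per build configuration if you have separate Debug/Release groups):

<PropertyGroup>
  <PlatformTarget>x64</PlatformTarget>
</PropertyGroup>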
Each process has its own virtual memory, called an address space, into which it maps the code that it executes and the data it manipulates. A 32-bit process uses 32-bit virtual memory address pointers, which creates an absolute upper limit of 4GB (2^32) on the amount of virtual memory that a 32-bit process can address. However, the operating system reserves half of it for its own code and data, leaving a limit of 2GB for each process. If your 32-bit application tries to consume more than that 2GB of address space, it will throw a System.OutOfMemoryException, even though the physical memory of your computer is not full.
64-bit processes do not have this limitation, as they use 64-bit pointers, so their theoretical maximum address space is 16 exabytes (2^64). In reality, Windows x64 limits the virtual memory of processes to 8TB. The solution to the memory limit problem is then to compile in 64-bit.
However, the size of a single object in .NET is still limited to 2GB by default. You will be able to create several arrays whose combined size is greater than 2GB, but you cannot, by default, create a single array bigger than 2GB. If you still want to create arrays bigger than 2GB, you can do so by adding the following code to your app.config file:
<configuration>
  <runtime>
    <gcAllowVeryLargeObjects enabled="true" />
  </runtime>
</configuration>
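And if you would rather not enable the flag, you can still keep more than 2GB of data in total by splitting it across several arrays, since only the size of each individual object is capped. A minimal sketch with a hypothetical ChunkedArray helper (not a framework class):

using System;

// Hypothetical helper: spreads the data over multiple arrays so no single object
// exceeds the default 2 GB limit, while the combined size can be much larger.
class ChunkedArray
{
    const int ChunkSize = 64 * 1024 * 1024;      // 64M doubles per chunk (~512 MB each)
    readonly double[][] chunks;
    public readonly long Length;

    public ChunkedArray(long length)
    {
        Length = length;
        int chunkCount = (int)((length + ChunkSize - 1) / ChunkSize);
        chunks = new double[chunkCount][];
        for (int i = 0; i < chunkCount; i++)
        {
            long remaining = length - (long)i * ChunkSize;
            chunks[i] = new double[Math.Min(ChunkSize, remaining)];
        }
    }

    public double this[long index]
    {
        get { return chunks[index / ChunkSize][index % ChunkSize]; }
        set { chunks[index / ChunkSize][index % ChunkSize] = value; }
    }
}

// Usage: var data = new ChunkedArray(400000000);  // ~3.2 GB of doubles in total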
You've got a maximum of 2GB of addressable memory as a 32-bit app, as the other posters mentioned. Don't forget about overhead. You're creating an array of almost 94 million objects; even at just 4 bytes of overhead per object that's an extra ~375MB of memory, and the real per-object cost is higher than that (see the estimate below).
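For a rough idea of the real footprint of the code in the question, assuming a typical 32-bit CLR layout (an 8-byte object header plus a 4-byte reference per array slot; these are approximations, not guaranteed numbers):

// Back-of-the-envelope estimate only; exact header sizes are an implementation detail.
long count       = 1500L * 1000 * 1000 / 16;        // 93,750,000 objects
long fieldBytes  = count * 16;                      // 2 x Int32 + 1 x Int64 per object
long headerBytes = count * 8;                       // sync block + method table pointer
long refBytes    = count * 4;                       // one reference per array element
long totalBytes  = fieldBytes + headerBytes + refBytes;

Console.WriteLine("{0:N1} GB", totalBytes / (1024.0 * 1024 * 1024));  // roughly 2.4 GB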
One more thing to be aware of: some .NET objects require contiguous memory. That is, if you are trying to allocate a large array, the system needs not only sufficient free memory in your process but also for all of that free memory to be in one big chunk... and unfortunately the process address space gets fragmented over time, so such a chunk may not be available.
Some objects/data types have this requirement and some don't; for example, MemoryStream is backed by a single contiguous byte array, whereas StringBuilder (from .NET 4 onwards) stores its data in a chain of chunks, so they behave differently under fragmentation.
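One mitigation worth knowing about: from .NET 4.5.1 onwards you can ask the GC to compact the Large Object Heap once, which can recover a contiguous region before a big allocation (this is only a sketch and does nothing on older runtimes):

// Requires .NET 4.5.1 or later. Objects of 85,000 bytes or more live on the
// Large Object Heap, which is not compacted by default, so its free space fragments.
System.Runtime.GCSettings.LargeObjectHeapCompactionMode =
    System.Runtime.GCLargeObjectHeapCompactionMode.CompactOnce;
GC.Collect();   // the next blocking full collection performs the one-off compaction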
On a 32-bit Windows operating system, the maximum user-mode memory that a single application can access is 2GB by default, regardless of how much physical memory is in the box.
Unmanaged VC++ Application's memory consumption on windows server
http://blogs.technet.com/markrussinovich/archive/2008/07/21/3092070.aspx
(It's funny you asked this because I asked almost the same thing yesterday...)