I'm trying to find out how much memory my objects take to see how many of them are ending up on the Large Object Heap (which is anything of 85,000 bytes or more).
Is it as simple as adding 4 bytes for an int, 8 for a long, and 4 (or 8 if you're on 64-bit) for any reference type for each object, or are there overheads for methods, properties, etc.?
Don't forget that the size of an actual object doesn't include the size of any objects it references.
The only things which are likely to end up on the large object heap are arrays and strings - other objects tend to be relatively small in themselves. Even an object with (say) 10 reference-type fields (4 bytes each on x86) and 10 GUIDs (16 bytes each) is only going to take up about 208 bytes (there's a bit of overhead for the type reference and sync block).
Likewise when thinking about the size of an array, don't forget that if the element type is a reference type, then it's only the size of the references that count for the array itself. In other words, even if you've got an array with 20,000 elements, the size of the array object itself will only be just over 80K (on x86) even if it references a lot more data.
Unless it's a huge value type or instance type (i.e. one with many thousands of fields), the only types you need to worry about are large arrays and strings. Of course, to figure out the size of an array, you need to know the element size.
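To make that arithmetic concrete, here is a minimal C# sketch of the estimate; the per-array overhead constant is an assumption of mine (the exact header size varies by runtime and platform), so treat the numbers as ballpark figures:

    using System;

    class ArraySizeEstimate
    {
        // Rough estimate of the memory the array instance itself occupies:
        // object header + length field + (length * element size). For reference-type
        // elements the element size is just the pointer size; the referenced objects
        // live elsewhere and are not counted here.
        static long EstimateArrayBytes(int length, int elementSize)
        {
            int overhead = 3 * IntPtr.Size;   // assumed header/length overhead, approximate
            return overhead + (long)length * elementSize;
        }

        static void Main()
        {
            const int LohThreshold = 85_000;  // objects at or above this size go on the LOH

            long refs = EstimateArrayBytes(20_000, IntPtr.Size);    // e.g. string[20000]
            long bytes = EstimateArrayBytes(90_000, sizeof(byte));  // e.g. byte[90000]

            Console.WriteLine($"string[20000] ~ {refs} bytes, LOH candidate: {refs >= LohThreshold}");
            Console.WriteLine($"byte[90000] ~ {bytes} bytes, LOH candidate: {bytes >= LohThreshold}");
        }
    }

On x86 the first line matches the "just over 80K" figure above; on x64 the same array crosses the threshold, because each reference is 8 bytes.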
.NET (currently) aligns types in much the same way that native compilers align types. Fundamental types have natural alignments, usually the smallest power of two that is at least their size (for example, 4 bytes for an Int32, 8 for an Int64 or Double).
When assembling a type, the compiler will make sure that all fields of any given type have their starting offset within the instance aligned to a boundary that matches that type - assuming that explicit layout isn't being used.
User-defined types themselves have an alignment, which is calculated as the highest alignment of any of their field types. The type's size is extended if necessary to make the size of the type aligned too.
But of course, all reference types are still only IntPtr.Size in size and alignment, so the size of a reference type's instances will not affect the size of arrays of that type.
Note that the CLR may choose, at its discretion, to lay out types differently than described above, for example to increase cache locality or reduce the padding required by alignment.
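A small sketch of the padding behaviour described above; it assumes System.Runtime.CompilerServices.Unsafe is available (built into recent .NET versions, otherwise a NuGet package), and the exact size is a runtime decision, so treat the printed value as illustrative:

    using System;
    using System.Runtime.CompilerServices;

    struct PackedExample
    {
        public byte Flag;   // 1 byte
        public int Value;   // 4 bytes, placed on a 4-byte boundary, so padding precedes it
    }

    class AlignmentDemo
    {
        static void Main()
        {
            // Typically prints 8 rather than 5: the struct is padded so its largest
            // field stays aligned and the overall size is a multiple of that alignment.
            Console.WriteLine(Unsafe.SizeOf<PackedExample>());
        }
    }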
Gomes's method simplified:
1. Go to Visual Studio (2010) Project Properties -> Debug tab -> Enable unmanaged code debugging.
2. Set a breakpoint in your code and start debugging (F5).
3. Open Debug -> Windows -> Immediate Window.
4. Enter .load sos
5. Enter !DumpHeap -type MyObject (replace MyObject with the name of your object's type).
6. Use the resulting address as the parameter of !ObjSize.
See: SOS.DLL, object Address and Visual Studio debugger Introduction
Example (we are looking for an object named tbl):

If you can - Serialize it!
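One way to act on that suggestion is to serialize the object to a memory buffer and look at the byte count; this is only a rough proxy (the serialized form and the managed heap footprint can differ a lot), and it assumes the type is serializable with System.Text.Json:

    using System;
    using System.Collections.Generic;
    using System.Text.Json;

    class SerializedSizeEstimate
    {
        static void Main()
        {
            var tbl = new Dictionary<string, int> { ["a"] = 1, ["b"] = 2 };

            // Serialize to UTF-8 bytes and use the length as a crude size estimate.
            // It measures the serialized representation, not the heap size, so use it
            // only as a ballpark figure.
            byte[] payload = JsonSerializer.SerializeToUtf8Bytes(tbl);
            Console.WriteLine($"Serialized size: {payload.Length} bytes");
        }
    }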
As an estimate (in 2017) you can debug into your application, set a breakpoint before your dictionary comes to life, take a "Memory Usage Snapshot" (tab: Memory Usage under Diagnostic Tools), fill your dictionary, and take another snapshot - not exact, but a good guesstimate.
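A rough programmatic counterpart of the two-snapshot idea (my own variation, not what the answer above describes) is to compare GC.GetTotalMemory before and after populating the dictionary:

    using System;
    using System.Collections.Generic;

    class SnapshotEstimate
    {
        static void Main()
        {
            long before = GC.GetTotalMemory(forceFullCollection: true);

            var dictionary = new Dictionary<int, string>();
            for (int i = 0; i < 100_000; i++)
                dictionary[i] = "value " + i;

            long after = GC.GetTotalMemory(forceFullCollection: true);

            // Any other allocation in the process skews the number, so this is a
            // guesstimate in the same spirit as the two snapshots, not an exact size.
            Console.WriteLine($"Dictionary cost roughly {after - before} bytes");
            GC.KeepAlive(dictionary);   // keep the dictionary alive until after the second reading
        }
    }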
Please follow these steps to get the size of the object.
1) Go to Visual Studio (2010) Project Properties -> Debug tab -> Enable unmanaged code debugging.
2) Go to the Visual Studio Debug menu -> Options and Settings -> Debugging -> Symbols.
3) Enable Microsoft Symbol Server there and leave the defaults (symbols may start downloading).
4) Set a breakpoint in your code and start debugging (F5).
5) Open Debug -> Windows -> Immediate Window.
6) Enter .load sos.dll (Son of Strike).
7) Enter !DumpHeap -type MyClass (the object whose size you want to find).
8) From the output, locate the address of the object, e.g. (00a8197c):
    Address  MT       Size
    00a8197c 00955124 36
9) Next, enter !ObjSize 00a8197c
10) There you go -> sizeof(00a8197c) = 72 (0x48) bytes (MyClass)
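The two numbers differ because !DumpHeap reports the shallow size while !ObjSize follows references. A hypothetical class (my own illustration, not the actual MyClass from the output above) makes the distinction clear:

    using System;

    // !DumpHeap counts the object header plus the inline fields below, including the
    // 4- or 8-byte reference to Buffer. !ObjSize additionally walks that reference and
    // counts the byte[] instance itself, which is why it reports a larger number.
    class MyClass
    {
        public int Id;
        public byte[] Buffer = new byte[16];
    }

    class Program
    {
        static void Main()
        {
            var obj = new MyClass();   // set a breakpoint here, then run the SOS commands above
            Console.ReadLine();        // keep the process alive while inspecting in the debugger
        }
    }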