I was looking for an easy way to find the size in bytes of array and dictionary objects, like
[ [1,2,3], [4,5,6] ] or { 1: {2:2} }
Many topics say to use pylab, for example:

    from pylab import *
    A = array([[1, 2, 3], [4, 5, 6]])
    A.nbytes
    24
But what about dictionaries? I saw a lot of answers proposing to use pysize or heapy. An easy answer is given by Torsten Marek in this link: Which Python memory profiler is recommended?, but I don't have a clear interpretation of the output, because the numbers of bytes didn't match.
Pysize seems to be more complicated, and I don't have a clear idea of how to use it yet.
Given the simplicity of the size calculation I want to perform (no classes or complex structures), any ideas about an easy way to get an approximate estimate of the memory usage of these kinds of objects?
Kind regards.
There's sys.getsizeof:
But I wouldn't say it's that reliable, as Python has overhead for each object, and there are objects that contain nothing but references to other objects, so it's not quite the same as in C and other languages.
Have a read of the docs on sys.getsizeof and go from there, I guess.
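For instance, a minimal sketch of what sys.getsizeof reports (exact byte counts vary by Python version and platform, so no specific numbers are shown):

```python
import sys

# sys.getsizeof reports only the container's own footprint,
# not the objects it references.
nested = [[1, 2, 3], [4, 5, 6]]
print(sys.getsizeof(nested))  # size of the outer list object only

d = {1: {2: 2}}
print(sys.getsizeof(d))       # outer dict only
print(sys.getsizeof(d[1]))    # the inner dict is counted separately
```

Note that the outer list's reported size is the same no matter how large the inner lists are, which is exactly why it is unreliable as a "deep" measurement.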
A bit late to the party, but an easy way to get the approximate size of a dict is to pickle it first.
Using sys.getsizeof on a Python object (including a dictionary) may not be exact, since it does not count referenced objects.
One way to handle this is to serialize the object into a string and use sys.getsizeof on the string. The result will be much closer to what you want.
Doing sys.getsizeof(mydict) is not exact, so pickle it first:
Now we can see how much space it takes:
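A minimal sketch of that approach (note the pickled size reflects the serialized form, not the exact in-memory layout, so it is only an approximation):

```python
import pickle
import sys

mydict = {1: {2: 2}, "a": [1, 2, 3]}

# Shallow size: counts only the dict object itself.
shallow = sys.getsizeof(mydict)

# Serialize, then measure the resulting bytes object.
pickled = pickle.dumps(mydict)
approx = sys.getsizeof(pickled)  # closer to the full footprint

print(shallow, approx)
```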
Use this recipe, taken from here:
http://code.activestate.com/recipes/577504-compute-memory-footprint-of-an-object-and-its-cont/
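The core of that recipe looks roughly like this (a paraphrased sketch, not the exact code from the link): it applies sys.getsizeof recursively over built-in containers while tracking already-seen objects to handle shared references and cycles.

```python
import sys
from itertools import chain
from collections import deque

def total_size(o, handlers={}):
    """Approximate memory footprint of an object and its contents.

    Follows built-in containers (tuple, list, deque, dict, set, frozenset);
    pass `handlers` to supply iterators for other container types.
    """
    dict_handler = lambda d: chain.from_iterable(d.items())
    all_handlers = {
        tuple: iter, list: iter, deque: iter,
        dict: dict_handler, set: iter, frozenset: iter,
    }
    all_handlers.update(handlers)
    seen = set()                      # ids of objects already counted
    default_size = sys.getsizeof(0)   # rough fallback for exotic objects

    def sizeof(obj):
        if id(obj) in seen:           # count shared objects only once
            return 0
        seen.add(id(obj))
        s = sys.getsizeof(obj, default_size)
        for typ, handler in all_handlers.items():
            if isinstance(obj, typ):
                s += sum(map(sizeof, handler(obj)))
                break
        return s

    return sizeof(o)

print(total_size({1: {2: 2}}))
print(total_size([[1, 2, 3], [4, 5, 6]]))
```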
None of the answers here are truly generic. The following solution traverses any type of object recursively, without needing an expensive recursive implementation (it uses iterative traversal instead):
For example:
See my repository for more information, or simply install my package (objsize):
Then: