I have a few (almost ten) GB of memory taken by the IPython kernel. I think this is coming from large objects (matrices, lists, NumPy arrays, ...) that I produced during some operation and no longer need.

I would like to list all of the objects I have defined and sort them by their memory footprint. Is there a simple way to do that? For certain types there is an `nbytes` attribute, but not for all of them, so I am looking for a general way to list all objects I have made together with their memory usage.
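For example, `nbytes` is there for a NumPy array but not for a plain list:

```python
import numpy as np

a = np.ones((1000, 1000))
print(a.nbytes)    # 8000000 -- bytes of element data

b = [1, 2, 3]
# print(b.nbytes)  # AttributeError: 'list' object has no attribute 'nbytes'
```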
Assuming that you are using `ipython` or `jupyter`, you will need to do a little bit of work to get a list of all the objects you have defined. That means taking everything available in `globals()` and filtering out objects that are modules, builtins, IPython objects, etc. Once you are sure you are left with only your own objects, you can proceed to grabbing their sizes with `sys.getsizeof`. This can be summed up as follows:
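Here is a minimal sketch (the `ipython_vars` exclusion list below is illustrative; it covers the usual names IPython injects into the session, and you may need to extend it for your setup):

```python
import sys
import types

# The usual names IPython injects into the session, plus the list we are
# creating right now; extend this if your environment adds others.
ipython_vars = ['In', 'Out', 'exit', 'quit', 'get_ipython', 'ipython_vars']

# (name, size-in-bytes) pairs for everything you defined, largest first
sorted(
    [
        (name, sys.getsizeof(obj))
        for name, obj in globals().items()
        if not name.startswith('_')                # skip dunders and IPython's cache names
        and name not in ipython_vars               # skip IPython's own objects
        and not isinstance(obj, types.ModuleType)  # skip imported modules
    ],
    key=lambda pair: pair[1],
    reverse=True,
)
```

The expression evaluates to a list of `(name, bytes)` tuples, largest first, so the top entries are the candidates to `del`.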
Please keep in mind that for Python objects (those created with Python's built-in functions), `sys.getsizeof` will be very accurate, but it can be a bit inaccurate for objects created using third-party libraries. Furthermore, be mindful that `sys.getsizeof` adds an additional garbage-collector overhead if the object is managed by the garbage collector, so some things may look a bit heavier than they actually are.
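One pitfall worth illustrating: `sys.getsizeof` reports only an object's own (shallow) size and does not follow references, so a container of large arrays can look tiny (the exact numbers below will vary by CPython build):

```python
import sys
import numpy as np

big = [np.ones((1000, 1000)) for _ in range(10)]  # ~80 MB of array data
print(sys.getsizeof(big))  # ~184 -- only the outer list object is measured,
                           # not the ten arrays it references
```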
As a side note, NumPy's `nbytes` attribute can be somewhat misleading in that it does not include memory consumed by non-element attributes of the array object.
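For instance (exact sizes vary by platform and NumPy version):

```python
import sys
import numpy as np

a = np.zeros(0)          # empty array: no element data at all
print(a.nbytes)          # 0 -- nbytes counts element data only
print(sys.getsizeof(a))  # ~112 -- the array object's own overhead is still there
```

I hope this helps.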