So I was playing with list objects and found a strange thing: if a list is created with list(), it uses more memory than a list comprehension. I'm using Python 3.5.2:
In [1]: import sys
In [2]: a = list(range(100))
In [3]: sys.getsizeof(a)
Out[3]: 1008
In [4]: b = [i for i in range(100)]
In [5]: sys.getsizeof(b)
Out[5]: 912
In [6]: type(a) == type(b)
Out[6]: True
In [7]: a == b
Out[7]: True
In [8]: sys.getsizeof(list(b))
Out[8]: 1008
From the docs:
Lists may be constructed in several ways:
- Using a pair of square brackets to denote the empty list:
[]
- Using square brackets, separating items with commas:
[a], [a, b, c]
- Using a list comprehension:
[x for x in iterable]
- Using the type constructor:
list() or list(iterable)
But it seems that using list() uses more memory. And the bigger the list is, the bigger the gap. Why does this happen?
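A quick sketch to reproduce the gap for several sizes (the exact numbers, and even which form allocates more, vary across CPython versions; these are just illustrative measurements, not guarantees):

```python
import sys

# Compare the allocated size of a list built by the constructor with
# one built by a comprehension, for a few lengths. On Python 3.5 the
# constructor version tends to be larger, and the gap grows with n.
results = []
for n in (10, 100, 1000, 10000):
    from_ctor = sys.getsizeof(list(range(n)))
    from_comp = sys.getsizeof([i for i in range(n)])
    results.append((n, from_ctor, from_comp))
    print(n, from_ctor, from_comp, from_ctor - from_comp)
```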
UPDATE #1
Test with Python 3.6.0b2:
Python 3.6.0b2 (default, Oct 11 2016, 11:52:53)
[GCC 5.4.0 20160609] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import sys
>>> sys.getsizeof(list(range(100)))
1008
>>> sys.getsizeof([i for i in range(100)])
912
UPDATE #2
Test with Python 2.7.12:
Python 2.7.12 (default, Jul 1 2016, 15:12:24)
[GCC 5.4.0 20160609] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import sys
>>> sys.getsizeof(list(xrange(100)))
1016
>>> sys.getsizeof([i for i in xrange(100)])
920
Thanks everyone for helping me understand this awesome Python.
I don't want to make the question massive (that's why I'm posting an answer); I just want to show and share my thoughts.
As @ReutSharabani noted correctly: "list() deterministically determines list size". You can see it from that graph.
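One way to see how list() can "deterministically determine list size": the constructor can ask the iterable for a length hint (PEP 424) before allocating. A small sketch using the stdlib operator.length_hint:

```python
import operator

# A range iterator reports how many items it will yield, so
# list() can size its internal buffer before consuming it.
it = iter(range(100))
print(operator.length_hint(it))  # 100

# A generator cannot report a useful hint, which is the situation
# a list comprehension is in while it builds its result.
gen = (i for i in range(100))
print(operator.length_hint(gen, -1))  # -1 (no hint available)
```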
When you append or use a list comprehension, you always have some sort of boundary that extends when you reach some point. With list() you have almost the same boundaries, but they are floating.

UPDATE
So thanks to @ReutSharabani, @tavo, @SvenFestersen.
To sum up: list() preallocates memory depending on the list size; a list comprehension cannot do that (it requests more memory when it is needed, like .append()). That's why list() stores more memory.
One more graph showing that list() preallocates memory: the green line shows list(range(830)) appending element by element, and for a while the memory does not change.

UPDATE 2
As @Barmar noted in the comments below, list() must be faster than a list comprehension, so I ran timeit() with number=1000 for list lengths from 4**0 to 4**10, and the results are:

I think you're seeing over-allocation patterns; this is a sample from the source:
Printing the sizes of list comprehensions of lengths 0-88, you can see the pattern matches.

Results (format is (list length, (old total size, new total size))):

The over-allocation is done for performance reasons, allowing lists to grow without allocating more memory with every growth (better amortized performance).
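The over-allocation plateaus are easy to observe directly: sys.getsizeof stays flat across several appends, then jumps when the spare buffer runs out. A minimal sketch:

```python
import sys

# Grow a list one append at a time and record its reported size.
lst = []
sizes = []
for i in range(20):
    lst.append(i)
    sizes.append(sys.getsizeof(lst))

# Sizes are non-decreasing and show plateaus: many consecutive
# appends fit in the buffer allocated by an earlier resize.
print(sizes)
```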
A probable reason for the difference with list comprehensions is that a list comprehension cannot deterministically calculate the size of the generated list, but list() can. This means comprehensions will continuously grow the list as they fill it, using over-allocation until finally filling it.

It is possible that the comprehension will not grow the over-allocation buffer with unused allocated nodes once it's done (in fact, in most cases it won't; that would defeat the purpose of over-allocation). list(), however, can add some buffer no matter the list size, since it knows the final list size in advance.

Another piece of backing evidence, also from the source, is that we see list comprehensions invoking LIST_APPEND, which indicates usage of list_resize, which in turn indicates consuming the pre-allocation buffer without knowing how much of it will be filled. This is consistent with the behavior you're seeing.

To conclude: list() will pre-allocate more nodes as a function of the list size, while a list comprehension does not know the list size, so it uses append operations as it grows, depleting the pre-allocation buffer.
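The LIST_APPEND opcode mentioned above can be seen by disassembling a comprehension with the stdlib dis module (bytecode details shift between CPython versions, but LIST_APPEND has been stable for this case):

```python
import dis
import io

# Disassemble a list comprehension; its loop body uses LIST_APPEND
# rather than a presized store, matching the append-style growth.
buf = io.StringIO()
dis.dis("[i for i in range(10)]", file=buf)
print("LIST_APPEND" in buf.getvalue())  # True
```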