Say I have a function func(i) that creates an object for an integer i, and N is some nonnegative integer. Then what's the fastest way to create a list (not a range) equal to this list
mylist = [func(i) for i in range(N)]
without resorting to advanced methods like creating a function in C? My main concern with the above list comprehension is that I'm not sure whether Python knows the length of range(N) beforehand so it can preallocate mylist, or whether it has to incrementally reallocate the list. Is that the case, or is Python clever enough to allocate mylist to length N first and then compute its elements? If not, what's the best way to create mylist? Maybe this?
mylist = [None]*N
for i in range(N): mylist[i] = func(i)
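For reference, a quick sanity check (with a stand-in func, since the real one isn't shown) confirms that the two approaches produce the same list:

```python
def func(i):          # stand-in for the real object-creating function
    return i * i

N = 5

# list comprehension
mylist = [func(i) for i in range(N)]

# pseudo-preallocation
prealloc = [None] * N
for i in range(N):
    prealloc[i] = func(i)

print(mylist, prealloc)  # both are [0, 1, 4, 9, 16]
```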
Using a list comprehension to accomplish what you're trying to do is the more Pythonic way, despite the performance penalty. :)
Interesting question. According to the following test, it seems that preallocation does not improve performance in the current CPython implementation (this is Python 2 code, but the result ranking is the same in Python 3, except that Python 3 has no xrange):
Results (ranking in parentheses):
but with psyco.full():

As you can see, pseudo-preallocation is faster with psyco. In any case, there's not much of a difference between the xrange solution (which I'd recommend) and the other solutions. If you don't process all elements of the list later, you could also use the lazy method (shown in the code above), which creates a generator that produces elements as you iterate over it.

Somebody wrote: """Python is smart enough. As long as the object you're iterating over has a __len__ or __length_hint__ method, Python will call it to determine the size and preallocate the array."""

As far as I can tell, there is no preallocation in a list comprehension. Python has no way of telling from the size of the INPUT what the size of the OUTPUT will be.
Look at this Python 2.6 code:
It just builds an empty list, and appends whatever the iteration delivers.
Now look at this code, which has an 'if' in the list comprehension:
The same code, plus some code to avoid the LIST_APPEND.
In Python 3.X, you need to dig a little deeper:
It's the same old schtick: start off with building an empty list, then iterate over the iterable, appending to the list as required. I see no preallocation here.
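You can check this yourself by disassembling a comprehension with the dis module and looking for LIST_APPEND (a sketch; the exact opcode sequence varies between CPython versions, but the build-empty-then-append pattern is the same):

```python
import dis

# Disassemble a list comprehension: the bytecode builds an empty list
# (BUILD_LIST 0) and appends one element per iteration (LIST_APPEND);
# there is no opcode that preallocates N slots up front.
code = compile("[x * 2 for x in range(n)]", "<listcomp>", "eval")
dis.dis(code)
```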
The optimisation that you are thinking about is used inside a single opcode, e.g. the implementation of list.extend(iterable) can preallocate if iterable can accurately report its length. list.append(object) is given a single object, not an iterable.

If you use the timeit module, you may come to the opposite conclusion: list comprehension is faster than preallocation:

Warning: These are the results on my computer. You should try these tests yourself, as your results may be different depending on your version of Python and your hardware. (See the comments.)
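Something along these lines reproduces the comparison (a sketch with a stand-in func; note that mylist = [None] * N is deliberately part of the timed statement, not the setup):

```python
import timeit

setup = """
def func(i):            # stand-in for the real function
    return i * i
N = 10_000
"""

t_comp = timeit.timeit("[func(i) for i in range(N)]",
                       setup=setup, number=100)
t_pre = timeit.timeit(
    "mylist = [None] * N\n"
    "for i in range(N):\n"
    "    mylist[i] = func(i)",
    setup=setup, number=100)

print(f"comprehension:  {t_comp:.4f}s")
print(f"preallocation:  {t_pre:.4f}s")
```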
Note that, unlike Javier's answer, I include mylist = [None]*N as part of the code timeit is to time when using the "pre-allocation" method. (It's not just part of the setup, since it is code one would need only if using pre-allocation.)

PS. The time module (and the time (unix) command) can give unreliable results. If you wish to time Python code, I'd suggest sticking with the timeit module.

Going to have to disagree with Javier here...
With the following code:
I get the following table:
From this table it appears that the list comprehension is faster than pre-allocation at every one of these lengths.
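The original code and table were not reproduced above, but a comparable test across varying lengths (a sketch using timeit with a stand-in func) would look like this:

```python
import timeit

setup = """
def func(i):            # stand-in for the real function
    return i
"""

# Compare the two approaches at several list lengths.
for n in (100, 1_000, 10_000, 100_000):
    comp = timeit.timeit(f"[func(i) for i in range({n})]",
                         setup=setup, number=50)
    pre = timeit.timeit(
        f"mylist = [None] * {n}\n"
        f"for i in range({n}):\n"
        f"    mylist[i] = func(i)",
        setup=setup, number=50)
    print(f"N={n:>7}  comprehension={comp:.4f}s  preallocation={pre:.4f}s")
```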
There is no difference in computational complexity between using an auto-resizing array and preallocating an array. At worst, resizing costs about 2N element copies in total, which is still O(N). See here:
Constant Amortized Time
The cost of the function calls, plus whatever happens inside your function, will make this extra N insignificant. This isn't something you should worry about. Just use the list comprehension.
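You can observe the amortized growth directly: CPython over-allocates on append, so the list's allocated size jumps in chunks rather than growing on every insert (a sketch; the exact growth pattern is a CPython implementation detail):

```python
import sys

lst = []
sizes = []
for i in range(64):
    lst.append(i)
    sizes.append(sys.getsizeof(lst))

# The allocated size stays flat between occasional jumps: far fewer
# distinct sizes than appends, which is why append is O(1) amortized.
print(sorted(set(sizes)))
```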