Is there a simple way to flatten a list of iterables with a list comprehension, or failing that, what would you all consider to be the best way to flatten a shallow list like this, balancing performance and readability?
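For concreteness, assume list_of_menuitems is something like this (illustrative values; the original sample data isn't shown here):

list_of_menuitems = [['image00', 'image01'], ['image10'], []]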
I tried to flatten such a list with a nested list comprehension, like this:
[image for image in menuitem for menuitem in list_of_menuitems]
But that fails with a NameError, because the name 'menuitem' is not defined. After googling and looking around on Stack Overflow, I got the desired results with a reduce statement:
reduce(list.__add__, map(lambda x: list(x), list_of_menuitems))
But this method is fairly unreadable, and I need that list(x) call because x is a Django QuerySet object.
Conclusion:
Thanks to everyone who contributed to this question. Here is a summary of what I learned. I'm also making this a community wiki in case others want to add to or correct these observations.
My original reduce statement is redundant and is better written this way:
>>> reduce(list.__add__, (list(mi) for mi in list_of_menuitems))
This is the correct syntax for a nested list comprehension (Brilliant summary dF!):
>>> [image for mi in list_of_menuitems for image in mi]
But neither of these methods is as efficient as using itertools.chain:
>>> from itertools import chain
>>> list(chain(*list_of_menuitems))
And as @cdleary notes, it's probably better style to avoid * operator magic by using chain.from_iterable like so:
>>> from itertools import chain
>>> flattened = chain.from_iterable([[1, 2], [3], [5, 89], [], [6]])
>>> print(list(flattened))
[1, 2, 3, 5, 89, 6]
This solution works for arbitrary nesting depths - not just the "list of lists" depth that some (all?) of the other solutions are limited to:
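A minimal sketch of such a recursive flatten (the function name and the special-casing of strings are illustrative, not necessarily the original answer's exact code):

def flatten(items):
    # Recurse into nested iterables; treat strings and bytes as atomic values
    for item in items:
        if hasattr(item, '__iter__') and not isinstance(item, (str, bytes)):
            for sub in flatten(item):
                yield sub
        else:
            yield item

>>> list(flatten([1, [2, [3, [4]], 5], 6]))
[1, 2, 3, 4, 5, 6]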
It's the recursion which allows for arbitrary depth nesting - until you hit the maximum recursion depth, of course...
There seems to be some confusion with operator.add! When you add two lists together, the correct term for that is concat, not add. operator.concat is what you need to use. If you're thinking functional, it is as easy as this:
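For example, with illustrative values:

>>> import operator
>>> from functools import reduce
>>> tuple2d = ((1, 2, 3), (4, 5, 6), (7,), (8, 9))
>>> reduce(operator.concat, tuple2d)
(1, 2, 3, 4, 5, 6, 7, 8, 9)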
You see, reduce respects the sequence type, so when you supply a tuple, you get back a tuple. Let's try with a list:
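Again with illustrative values:

>>> list2d = [[1, 2, 3], [4, 5, 6], [7], [8, 9]]
>>> reduce(operator.concat, list2d)
[1, 2, 3, 4, 5, 6, 7, 8, 9]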
Aha, you get back a list.
How about performance:
from_iterable is pretty fast! But it's no comparison to reduce with concat.
From my experience, the most efficient way to flatten a list of lists is:
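The original snippet isn't reproduced above; one candidate that matches the description (accepts any iterable and scales well with longer sublists) is a thin wrapper around itertools.chain.from_iterable, with an illustrative name:

from itertools import chain

def flatten_shallow(list_of_lists):
    # Lazily chain the sublists, then realize the result as one flat list
    return list(chain.from_iterable(list_of_lists))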
Some timeit comparisons with the other proposed methods:
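The original timing numbers aren't reproduced here, but a comparison along these lines can be run with timeit (data sizes are illustrative):

import timeit

setup = (
    "from functools import reduce\n"
    "import operator, itertools\n"
    "list_of_lists = [[1, 2, 3]] * 1000"
)
print(timeit.timeit("list(itertools.chain.from_iterable(list_of_lists))", setup=setup, number=1000))
print(timeit.timeit("[item for sublist in list_of_lists for item in sublist]", setup=setup, number=1000))
print(timeit.timeit("reduce(operator.concat, list_of_lists)", setup=setup, number=1000))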
Now, the efficiency gain appears better when processing longer sublists:
And this method also works with any iterable object:
Test:
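For instance, fed a generator of tuples (illustrative data):

>>> from itertools import chain
>>> list(chain.from_iterable((n, n + 1) for n in range(0, 6, 2)))
[0, 1, 2, 3, 4, 5]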
Performance Results. Revised.
I flattened a 2-level list of 30 items 1000 times.
Reduce is always a poor choice.
If you're looking for a built-in, simple, one-liner you can use:
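The snippet itself isn't reproduced above; one built-in one-liner that fits the description is sum() with an empty list as the start value (values illustrative):

a = [[1, 2, 3], [4, 5, 6]]
sum(a, [])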
returns
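[1, 2, 3, 4, 5, 6]

(Note that sum() copies the accumulated list on every step, so it is only a good fit for short, shallow lists.)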