itertools.combinations in Python is a powerful tool for finding all combinations of r terms; however, I want to know its computational complexity.
Let's say I want to know the complexity in terms of n and r, given that it yields all the r-term combinations from a list of n terms.
According to the official documentation, this is the rough equivalent implementation:
def combinations(iterable, r):
    # combinations('ABCD', 2) --> AB AC AD BC BD CD
    # combinations(range(4), 3) --> 012 013 023 123
    pool = tuple(iterable)
    n = len(pool)
    if r > n:
        return
    indices = list(range(r))
    yield tuple(pool[i] for i in indices)
    while True:
        # Find the rightmost index that can still be incremented.
        for i in reversed(range(r)):
            if indices[i] != i + n - r:
                break
        else:
            # Every index is already at its maximum value: we are done.
            return
        # Increment it and reset all indices to its right.
        indices[i] += 1
        for j in range(i + 1, r):
            indices[j] = indices[j - 1] + 1
        yield tuple(pool[i] for i in indices)
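For what it's worth, here is a quick sanity check (my own, assuming the generator above has been defined and Python 3.8+ for math.comb) that the pure-Python equivalent matches the built-in and produces exactly (n choose r) tuples:

from itertools import combinations as itertools_combinations
from math import comb

n, r = 6, 3
pool = range(n)

# The documented equivalent and the real itertools.combinations agree...
assert list(combinations(pool, r)) == list(itertools_combinations(pool, r))
# ...and the number of outputs is C(n, r) = C(6, 3) = 20.
assert len(list(itertools_combinations(pool, r))) == comb(n, r)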
I had this same question too and had a hard time tracing the complexity. This led me to visualize the timings using matplotlib.pyplot.
The code snippet is shown below.
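Something along these lines (a minimal sketch; the exact sizes, repeat counts, and labels are illustrative, and I time itertools.permutations here to match the graph below):

import timeit
from itertools import permutations

import matplotlib.pyplot as plt

# Time how long it takes to exhaust permutations(range(n)) for growing n
# and plot the result.
sizes = list(range(2, 9))
timings = [
    timeit.timeit(lambda n=n: list(permutations(range(n))), number=20)
    for n in sizes
]

plt.plot(sizes, timings, marker="o")
plt.xlabel("Input size n")
plt.ylabel("Time (s) for 20 runs")
plt.title("itertools.permutations timing")
plt.show()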
[Time complexity graph for itertools.permutations]
From the graph, it is observed that the time complexity is O(n!), where n is the input size.
I would say it is Θ[r (n choose r)].
The (n choose r) part is the number of times the generator has to yield, and also the number of times the outer while loop iterates. In each iteration at least the output tuple of length r needs to be generated, which gives the additional factor r. The other inner loops will be O(r) per outer iteration as well.
This is assuming that the tuple generation is actually O(r) and that the list get/set operations are indeed O(1), at least on average, given the particular access pattern in the algorithm. If this is not the case, then it is still Ω[r (n choose r)].
As usual in this kind of analysis, I assumed all integer operations to be O(1), even if their size is not bounded.
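As a rough empirical cross-check (my own sketch, not part of the argument above; it assumes Python 3.8+ for math.comb and that wall-clock time is a fair proxy for operation count), the time to exhaust the generator divided by r * (n choose r) should stay roughly constant across different (n, r) pairs:

import timeit
from itertools import combinations
from math import comb

# Time a full iteration and normalise by r * C(n, r); the printed values
# should be of the same order of magnitude if the cost is Theta(r * C(n, r)).
for n, r in [(14, 3), (16, 5), (18, 7), (20, 9)]:
    t = timeit.timeit(
        lambda n=n, r=r: sum(1 for _ in combinations(range(n), r)), number=5
    )
    print(f"n={n:2d} r={r:2d}  time / (r * C(n, r)) = {t / (r * comb(n, r)):.3e}")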