The following example was taken from the book Cracking the Coding Interview (6th edition). According to the book, the time complexity of the following code is O(n^2 * n!). (Please refer to example 12, pages 32-33.)
public class PermutationsTest {

    static int methodCallCount = 0;

    public static void main(String[] args) {
        new PermutationsTest().permutations("ABCD");
    }

    void permutations(String string) {
        permutations(string, "");
    }

    void permutations(String string, String prefix) {
        methodCallCount += 1;
        if (string.length() == 0) {
            System.out.println(prefix);
        } else {
            for (int i = 0; i < string.length(); i++) {
                String rem = string.substring(0, i) + string.substring(i + 1);
                permutations(rem, prefix + string.charAt(i));
            }
        }
        System.out.format("Method call count %s \n", methodCallCount);
    }
}
I am finding it difficult to understand how this was calculated. Here are my thoughts about it:

There can be n! arrangements, so there should be at least n! calls. However, for each call, roughly n units of work happen (since it needs to loop through the passed string). So shouldn't the answer be O(n * n!)?

But what really happens is that each level of recursion loops over strings that are one character shorter, so can we say it should rather be n! * n(n+1)/2?

Please explain.
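To make my counting concrete, here is a small instrumented variant of the code (the callsAtLen array and class name are my additions, not from the book) that tallies how many calls happen at each remaining string length:

```java
// Instrumented variant (callsAtLen is my addition): counts calls grouped
// by the length of the not-yet-fixed part of the string.
public class PermLevelCount {
    static long[] callsAtLen;

    static void permutations(String string, String prefix) {
        callsAtLen[string.length()]++;
        for (int i = 0; i < string.length(); i++) {
            String rem = string.substring(0, i) + string.substring(i + 1);
            permutations(rem, prefix + string.charAt(i));
        }
    }

    public static void main(String[] args) {
        int n = 4;
        callsAtLen = new long[n + 1];
        permutations("ABCD", "");
        // For "ABCD": len=4 -> 1 call, len=3 -> 4, len=2 -> 12,
        //             len=1 -> 24, len=0 -> 24.
        for (int len = n; len >= 0; len--) {
            System.out.println("len=" + len + " calls=" + callsAtLen[len]);
        }
    }
}
```

So the call counts per level are 1, n, n(n-1), ..., n!, not n! at every level, which is where my n! * n(n+1)/2 guess breaks down.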
There are n! possible strings, but each character that's added to a string requires:

String rem = string.substring(0, i) + string.substring(i + 1);
permutations(rem, prefix + string.charAt(i));

The substring calls and the string concatenation are O(n). For each character in a string that would be O(n^2), and for all strings it would be O(n^2 * n!).
To get the asymptotic time complexity, you need to count how many times the permutations function is called and what its asymptotic cost per call is. The answer is the product of these.

The string.length() = len always decreases by 1 with each recursive call, so there is 1 call for len = n, n calls for len = n-1, n*(n-1) calls for len = n-2, ..., n! calls for len = 1. Hence, the total number of calls is:

n!/1! + n!/2! + n!/3! + n!/4! + ... + n!/n! = sum(k=1..n) n!/k!

(The Java version also recurses down to len = 0, which adds another n! calls and does not change the asymptotics.) In the asymptotic limit this can be calculated:

sum(k=1..n) n!/k! = n! * (-1 + sum(k=0..n) 1/k! * 1^k) -> n! * (e^1 - 1) = (e-1)*n!,

which is O((e-1)*n!) = O(n!). Here e is Napier's constant 2.718281828... To calculate the sum I used the Taylor series e^x = sum(k=0..infinity) x^k/k! at x = 1.
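The partial sums above can be checked numerically. Here is a small Java sketch (class and method names are my own) that computes sum(k=1..n) n!/k! with exact long arithmetic and its ratio to n!:

```java
// Numeric check of the call-count formula (names are my own; small n only,
// so that long arithmetic does not overflow).
public class CallCountCheck {
    static long factorial(int n) {
        long f = 1;
        for (int k = 2; k <= n; k++) f *= k;
        return f;
    }

    // sum(k=1..n) n!/k! -- exact, since k! divides n! for every k <= n
    static long totalCalls(int n) {
        long sum = 0;
        for (int k = 1; k <= n; k++) sum += factorial(n) / factorial(k);
        return sum;
    }

    public static void main(String[] args) {
        for (int n = 1; n <= 13; n++) {
            System.out.println(n + " " + totalCalls(n) + " "
                    + (double) totalCalls(n) / factorial(n));
        }
    }
}
```

The ratio column approaches e - 1 = 1.7182818..., so the total number of calls is about (e-1)*n! = O(n!).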
Now, for each call of the function there are the substring and concatenation operations:

String rem = string.substring(0, i) + string.substring(i + 1);

This operation requires on the order of string.length() operations, as under the hood the String class needs to copy each character to a new String (string.length() - 1 operations). Therefore, the total complexity is the product of the two: O(n * n!).
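The string-copying work can also be measured instead of bounded. The sketch below (the calls and charsCopied counters and the class name are my own instrumentation) adds up the characters copied by the two substring calls and the two concatenations: each loop iteration copies (len - 1) characters for rem plus prefix.length() + 1 characters for the new prefix, which is exactly n characters per iteration.

```java
// Count the characters copied by the substring/concatenation work
// (calls and charsCopied are my instrumentation, not part of the original).
public class PermWorkCount {
    static long calls = 0;
    static long charsCopied = 0;

    static void permutations(String string, String prefix) {
        calls++;
        if (string.length() == 0) {
            return; // the original prints prefix here
        }
        for (int i = 0; i < string.length(); i++) {
            // rem copies string.length()-1 chars, the new prefix copies
            // prefix.length()+1 chars: exactly n chars per iteration.
            charsCopied += (string.length() - 1) + (prefix.length() + 1);
            String rem = string.substring(0, i) + string.substring(i + 1);
            permutations(rem, prefix + string.charAt(i));
        }
    }

    public static void main(String[] args) {
        String alpha = "abcdefghij";
        for (int n = 1; n <= alpha.length(); n++) {
            calls = 0;
            charsCopied = 0;
            permutations(alpha.substring(0, n), "");
            long fact = 1;
            for (int k = 2; k <= n; k++) fact *= k;
            System.out.println(n + " calls=" + calls + " chars=" + charsCopied
                    + " chars/(n*n!)=" + (double) charsCopied / (n * fact));
        }
    }
}
```

The last column converges to e = 2.718..., i.e. the total string work is about e * n * n!, consistent with O(n * n!).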
To check that the calls to perm behave as I said, I wrote a small C++ program for permutations (without the string operations, so it should be O(n!)):
#include <iostream>
#include <string>
#include <iomanip>

unsigned long long int permutations = 0;
unsigned long long int calls = 0;

void perm(std::string& str, size_t index) {
    ++calls;
    if (index == str.size() - 1) {
        ++permutations;
        // std::cout << permutations << " " << str << std::endl;
    } else {
        for (size_t i = index; i < str.size(); ++i) {
            std::swap(str[index], str[i]);
            perm(str, index + 1);
            std::swap(str[index], str[i]);
        }
    }
}

int main() {
    std::string alpha = "abcdefghijklmnopqrstuvwxyz";
    std::cout << std::setprecision(10);
    for (size_t i = 1; i < alpha.size() + 1; ++i) {
        permutations = calls = 0;
        std::string str(alpha.begin(), alpha.begin() + i);
        perm(str, 0);
        std::cout << i << " " << permutations << " " << calls << " "
                  << static_cast<double>(calls) / permutations << std::endl;
    }
}
Output:
1 1 1 1
2 2 3 1.5
3 6 10 1.666666667
4 24 41 1.708333333
5 120 206 1.716666667
6 720 1237 1.718055556
7 5040 8660 1.718253968
8 40320 69281 1.71827877
9 362880 623530 1.718281526
10 3628800 6235301 1.718281801
11 39916800 68588312 1.718281826
12 479001600 823059745 1.718281828
13 6227020800 10699776686 1.718281828
14 took too long
The columns are: the length of the string n, then n!, then sum(k=1..n) n!/k!, and finally the ratio of the third to the second column, which should converge to (e-1) = 1.71828182845905. So it seems to converge rather fast to the asymptotic limit.