I'm fairly familiar with algorithm analysis and can usually tell the Big-O of the algorithms I work with. But I've been stuck for hours, unable to come up with the Big-O for this code I wrote.
Basically, it's a method to generate the permutations of a string. It works by making each character in the string the first character and combining it with the permutations of the substring with that character removed (computed recursively).
If I add code to count the number of iterations, I get something between O(N!) and O(N^N), but I can't figure out how to analyse it mentally. Any suggestion is much appreciated! (My attempt at writing the count as a recurrence is below, after the code.)
import java.util.ArrayList;
import java.util.List;

public class Permutation {
    int count = 0;

    List<String> findPermutations(String str) {
        List<String> permutations = new ArrayList<String>();
        // Base case: the empty string or a single character is its own permutation.
        if (str.length() <= 1) {
            count++;
            permutations.add(str);
            return permutations;
        }
        // Make each character the first, then prepend it to every
        // permutation of the rest of the string.
        for (int i = 0; i < str.length(); i++) {
            String sub = str.substring(0, i) + str.substring(i + 1);
            for (String permOfSub : findPermutations(sub)) {
                count++;
                permutations.add(str.charAt(i) + permOfSub);
            }
        }
        return permutations;
    }

    public static void main(String[] args) {
        for (String s : new String[] {"a", "ab", "abc", "abcd", "abcde", "abcdef", "abcdefg", "abcdefgh"}) {
            Permutation p = new Permutation();
            p.findPermutations(s);
            System.out.printf("Count %d vs N! %d%n", p.count, fact(s.length()));
        }
    }

    private static int fact(int i) {
        // 0! and 1! are both 1.
        return i <= 1 ? 1 : i * fact(i - 1);
    }
}
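In case it helps, here is my attempt at turning the count into a recurrence (assuming I'm reading my own code correctly): for a string of length N, the outer loop runs N times; each iteration makes one recursive call on an (N-1)-character string and then runs the inner loop once per returned permutation, i.e. (N-1)! times. That gives C(N) = N * C(N-1) + N * (N-1)! = N * C(N-1) + N!, with C(1) = 1 from the count++ in the base case. A quick standalone sketch that tabulates this recurrence next to N! (the class name is just for illustration):

public class CountRecurrence {
    public static void main(String[] args) {
        long c = 1;         // C(1) = 1: the single count++ in the base case
        long factorial = 1; // 1! = 1
        for (int n = 2; n <= 8; n++) {
            factorial *= n;        // N!
            c = n * c + factorial; // C(N) = N * C(N-1) + N!
            System.out.printf("N=%d  C(N)=%d  N!=%d%n", n, c, factorial);
        }
    }
}

If I've set it up right, the C(N) column matches the Count column printed by the test program above, so the question boils down to how fast C(N) = N * C(N-1) + N! grows.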
Edit 1: add test program
Edit 2: add count++ in base case