I am pretty familiar with algorithm analysis and can usually work out the Big-O of the algorithms I work with. But I have been stuck for hours, unable to come up with the Big-O of this code I am writing.
It is basically a method for generating all permutations of a string. It works by making each character of the string in turn the first character and combining it with the permutations of the remaining substring (computed recursively).
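For a concrete picture, this is how I think of the top level of the recursion for "abc" (each character is pulled to the front and prefixed onto the permutations of what is left):

    findPermutations("abc")
      = 'a' + each of findPermutations("bc")  ->  "abc", "acb"
      + 'b' + each of findPermutations("ac")  ->  "bac", "bca"
      + 'c' + each of findPermutations("ab")  ->  "cab", "cba"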
If I add code to count the number of iterations, I get something between O(N!) and O(N^N), but I cannot work out how to analyze it mentally. Any suggestions are much appreciated!
import java.util.ArrayList;
import java.util.List;

public class Permutation {

    int count = 0;

    List<String> findPermutations(String str) {
        List<String> permutations = new ArrayList<String>();
        if (str.length() <= 1) {
            count++;
            permutations.add(str);
            return permutations;
        }
        for (int i = 0; i < str.length(); i++) {
            // Remove the i-th character and permute the rest recursively.
            String sub = str.substring(0, i) + str.substring(i + 1);
            for (String permOfSub : findPermutations(sub)) {
                count++;
                permutations.add(str.charAt(i) + permOfSub);
            }
        }
        return permutations;
    }

    public static void main(String[] args) {
        for (String s : new String[] {"a", "ab", "abc", "abcd", "abcde",
                                      "abcdef", "abcdefg", "abcdefgh"}) {
            Permutation p = new Permutation();
            p.findPermutations(s);
            System.out.printf("Count %d vs N! %d%n", p.count, fact(s.length()));
        }
    }

    private static int fact(int i) {
        return i <= 1 ? 1 : i * fact(i - 1);
    }
}
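From stepping through the code, I think count satisfies a recurrence roughly like the one below, assuming one count++ per string appended: the n * C(n-1) term covers the recursive calls, and the n! term covers the strings appended by the loops at the top level (n outer iterations, (n-1)! appends each). But I have not managed to turn this into a clean Big-O bound in my head.

    C(1) = 1
    C(n) = n * C(n-1) + n!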
Edit 1: add test program
Edit 2: add count++