Here is some code that converts a decimal number to a given base. I am not sure of its time complexity.
Thanks,
public static String convertToBase(int num, int base) {
    if (base > 36) {
        throw new IllegalArgumentException("The base should be less than or equal to 36.");
    }
    final StringBuilder sb = new StringBuilder();
    while (num > 0) {
        final int rem = num % base; // remainder is the next digit, least significant first
        if (rem > 9) {
            sb.append((char) (rem + 55)); // 55 = 'A' - 10, so 10 -> 'A', ..., 35 -> 'Z'
        } else {
            sb.append(rem);
        }
        num = num / base;
    }
    return sb.reverse().toString(); // digits were collected in reverse order
}
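A quick usage check of the method above (the wrapper class name `BaseConvert` is just for this sketch):

```java
public class BaseConvert {
    public static String convertToBase(int num, int base) {
        if (base > 36) {
            throw new IllegalArgumentException("The base should be less than or equal to 36.");
        }
        final StringBuilder sb = new StringBuilder();
        while (num > 0) {
            final int rem = num % base;
            if (rem > 9) {
                sb.append((char) (rem + 55)); // 55 = 'A' - 10
            } else {
                sb.append(rem);
            }
            num = num / base;
        }
        return sb.reverse().toString();
    }

    public static void main(String[] args) {
        System.out.println(convertToBase(35, 36));  // prints "Z"
        System.out.println(convertToBase(35, 2));   // prints "100011"
        System.out.println(convertToBase(255, 16)); // prints "FF"
    }
}
```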
The complexity of the algorithm:
If base = 36 and num = 35, the number of iterations is 1.
If base = 2 and num = 35, the number of iterations is 6.
The number of iterations is the logarithm of num taken to base `base`, i.e. log_base(num).
Hence the time complexity is O(log n).
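The iteration counts quoted above can be checked by stripping the loop down to a counter (the helper `iterations` is my addition, not part of the original code):

```java
public class IterationCount {
    // Same loop structure as convertToBase, but returns how many
    // times the loop body runs instead of building the digit string.
    static int iterations(int num, int base) {
        int count = 0;
        while (num > 0) {
            num /= base;
            count++;
        }
        return count;
    }

    public static void main(String[] args) {
        System.out.println(iterations(35, 36)); // 1 iteration
        System.out.println(iterations(35, 2));  // 6 iterations
        // In general, count == floor(log_base(num)) + 1 for num > 0:
        System.out.println((int) (Math.log(35) / Math.log(2)) + 1); // also 6
    }
}
```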
Strictly speaking, the answer is O(1).

If int were an integer type that supported arbitrary precision, then clearly the answer would be O(log N). But it is not! An int can get no larger than Integer.MAX_VALUE, which is 2^31 - 1 ... or roughly 2 billion.

So, if we let N (the unbounded integer) tend to infinity, the value of num wraps around so that it never exceeds Integer.MAX_VALUE. That means that if (for example) base is 10, the while loop can execute at most log10(2^31) times (i.e. 10 times) ... and convertToBase is O(1).

However, if you are prepared to abuse the terminology / notation, you could say that it is O(log N) for small enough N.
As Stephen C said, the complexity is O(1), since n is bounded.
If n were an arbitrary-precision number, the complexity would be
O(M(n) log(n))
where M(n) is the complexity of multiplication/division (since there is a "num / base" operation on each iteration).
See
http://en.wikipedia.org/wiki/Computational_complexity_of_mathematical_operations#Arithmetic_functions
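A sketch of what an arbitrary-precision variant might look like, using java.math.BigInteger (this class and wrapper are my additions, not part of the original code). Here each divideAndRemainder call costs on the order of M(n), and the loop runs O(log n) times, which is where the O(M(n) log(n)) bound comes from:

```java
import java.math.BigInteger;

public class BigBaseConvert {
    static String convertToBase(BigInteger num, int base) {
        if (base < 2 || base > 36) {
            throw new IllegalArgumentException("The base should be in the range 2..36.");
        }
        final BigInteger b = BigInteger.valueOf(base);
        final StringBuilder sb = new StringBuilder();
        while (num.signum() > 0) {
            // one big-integer division per iteration, costing O(M(n))
            BigInteger[] qr = num.divideAndRemainder(b);
            int rem = qr[1].intValue();
            sb.append((char) (rem > 9 ? rem + 55 : rem + '0'));
            num = qr[0];
        }
        return sb.reverse().toString();
    }

    public static void main(String[] args) {
        System.out.println(convertToBase(BigInteger.valueOf(35), 36)); // prints "Z"
        // 2^70 = 4 * 16^17, so base 16 gives "4" followed by 17 zeros:
        System.out.println(convertToBase(BigInteger.ONE.shiftLeft(70), 16));
    }
}
```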