I recently got into an argument/debate and I am trying to reach a clear verdict on the correct answer.
It is well known that n! grows very quickly, but exactly how quickly? Quickly enough to "hide" any additional factors that might be multiplied onto it?
Let's assume I have this silly & simple program (no particular language):
```
for i from 0 to n! do:
    ;  // nothing
```
Given that the input is n, the complexity of this is obviously O(n!) (or even Θ(n!), but that isn't relevant here).
Now let's assume this program:
```
for i from 0 to n do:
    for j from 0 to n do:
        for k from 0 to n! do:
            ;  // nothing
```
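(If it helps to see it in a real language, here is one possible rendering in Python; the concrete language does not matter for the analysis:)

```python
import math

def program(n):
    # (n + 1) * (n + 1) * (n! + 1) empty iterations in total,
    # which is Theta(n! * n^2).
    for i in range(n + 1):
        for j in range(n + 1):
            for k in range(math.factorial(n) + 1):
                pass  # nothing
```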
Bob claims: "This program's complexity is obviously O(n) * O(n) * O(n!) = O(n! * n^2) = O((n+2)!)."
Alice responds: "I agree with you, Bob, but actually it would be sufficient to say that the complexity is O(n!), since O(n! * n^k) = O(n!) for any constant k >= 1."
Is Alice right in her note on Bob's analysis?
Amit's solution is perfect; I would only add a more "human" explanation, because understanding the formal definition can be difficult for beginners.
The definition basically says: if you keep increasing the value n and the functions f(n) and g(n) differ "only" k times, where k is a constant that does not change (for example, g(n) is always ~100 times higher, no matter whether n = 10000 or n = 1000000), then these functions have the same complexity.

If g(n) is 100 times higher for n = 10000 but only 80 times higher for n = 1000000, and the ratio keeps shrinking like this, then f(n) has the higher complexity! Because as n grows and grows, f(n) will eventually, at some point, catch up with g(n), and from then on it will grow more and more compared to g(n). In complexity theory you are interested in how things end up at "infinity" (or, more imaginably, at extremely HIGH values of n).

If you compare n! and n! * n^2, you can see that for n = 10 the second function has a 10^2 = 100 times higher value. For n = 1000 it has a 1000^2 = 1000000 times higher value. And as you can imagine, the difference will only grow, as the small check below illustrates.
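To make this concrete, a quick Python check (just an illustration) prints the ratio (n! * n^2) / n! for a few values of n:

```python
import math

# Ratio between g(n) = n! * n^2 and f(n) = n!.
# If the two shared a big-O class, this ratio would stay below some constant;
# instead it equals n^2 and grows without bound.
for n in [10, 100, 1000]:
    f = math.factorial(n)
    g = math.factorial(n) * n ** 2
    print(n, g // f)  # -> 100, 10000, 1000000
```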
Alice is wrong, and Bob is right.
Recall an equivalent definition of big O notation using limits:

$$f(n) \in O(g(n)) \iff \limsup_{n \to \infty} \frac{f(n)}{g(n)} < \infty.$$

For any k > 0:

$$\lim_{n \to \infty} \frac{n! \cdot n^k}{n!} = \lim_{n \to \infty} n^k = \infty,$$

and thus n! * n^k is NOT in O(n!).
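For completeness, Bob's final rewriting O(n! * n^2) = O((n+2)!) also checks out:

$$n! \cdot n^2 \le n!\,(n+1)(n+2) = (n+2)! \le 6 \cdot n! \cdot n^2 \quad \text{for } n \ge 1,$$

so n! * n^2 = Θ((n+2)!), and O((n+2)!) is a legitimate way to write the bound.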