When we say a method has a time complexity of O(n^2), is it meant in the same way as in 10^2 = 100, or does it mean that the method is at most (or close to) that bound? I am really confused about how to understand Big-O. I remember something called an upper bound; would that mean "at most"?
Explanation

If a method f is inside O(g), with g being another function, it means that from some point on (there exists some n_0 such that for all n > n_0) the function f will always output a smaller value than g for those inputs. However, g is allowed to carry an arbitrary constant factor k. So f(n) <= k * g(n) for all n above some n_0. This means f is allowed to be bigger at first, as long as it then becomes smaller and stays smaller.

We say f is asymptotically bounded by g. "Asymptotically" means that we do not care how f behaves in the beginning, only what it does when approaching infinity. So we discard all inputs below n_0.

Illustration
An illustration would be this:

The blue function is k * g with some constant k; the red one is f. We see that f is greater at first but, starting at x_0, it is always smaller than k * g. Thus f is in O(g).

Definition
Mathematically, this can be expressed by

    f is in O(g) if and only if there exist k > 0 and n_0 such that f(n) <= k * g(n) for all n > n_0,

which is the usual definition of Big-O. From the explanation above, the definition should be clear: from a certain n_0 on, the function f must be smaller than k * g for all inputs, where k is allowed to be some constant.

Both images are taken from Wikipedia.
Examples

Here are a couple of examples to get familiar with the definition:

- n is in O(n) (trivially)
- n is in O(n^2) (trivially)
- 5n is in O(n^2) (starting from n_0 = 5)
- 25n^2 is in O(n^2) (taking k = 25 or greater)
- 2n^2 + 4n + 6 is in O(n^2) (take k = 3, starting from n_0 = 5)

Notes
Actually, O(g) is a set in the mathematical sense. It contains all functions with the above-mentioned property (i.e. all functions that are asymptotically bounded by g).

So, although some authors write f = O(g), it is formally an abuse of notation and should be f in O(g).

There are also other, similar sets, which only differ in the direction of the bound:

- O (Big-O): <=
- o (little-o): <
- Omega (Big-Omega): >=
- omega (little-omega): >
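A rough intuition for these bound directions is to look at the ratio f(n) / g(n) for large n: tending to 0 suggests little-o, tending to a constant suggests Big-O together with Big-Omega, and tending to infinity suggests little-omega. The helper below is a heuristic sketch of my own (with example functions taken from the answer), not a proof technique:

```python
# Heuristic: sample the ratio f(n) / g(n) at one large n.
#   ratio near 0        -> suggests f in o(g)  (strictly smaller order)
#   ratio near constant -> suggests f in O(g) and f in Omega(g)
#   ratio very large    -> suggests f in omega(g) (strictly larger order)

def ratio(f, g, n=10**6):
    return f(n) / g(n)

f = lambda n: 5 * n                  # linear
g = lambda n: n**2                   # quadratic
h = lambda n: 2 * n**2 + 4 * n + 6   # quadratic with lower-order terms

print(ratio(f, g))  # tiny  -> 5n is in o(n^2), hence also in O(n^2)
print(ratio(h, g))  # ~2    -> 2n^2 + 4n + 6 is in O(n^2) and Omega(n^2)
print(ratio(g, f))  # huge  -> n^2 is in omega(n), hence also in Omega(n)
```

This only samples one point, so it can mislead for functions with slow-growing factors (e.g. logarithms); the formal definitions above are what actually decide membership.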
It means that the running time is bounded above by N².

More precisely, T(N) < C·N², where C is some constant, and the inequality holds from a certain N* on.

For example, 2N² + 4N + 6 = O(N²), because 2N² + 4N + 6 < 3N² for all N > 5.