Let's suppose that a recursive formula is O(n²) and, at the same time, Ω(n²). Does this imply that the recursion is Θ(n²)?
To make a long story short: yes, it does. See the proof below.
Though everybody has heard of big-O notation, let's recall exactly what these notations mean, with the help of Introduction to Algorithms. The general definitions are stated for Ο(g(n)), Ω(g(n)), and Θ(g(n)), but here we will consider your case, g(n) = n².
Ο(n²)
The Ο(n²) notation defines a set of functions, for each of which the following holds: there exist positive constants c and n0 such that 0 ≤ f(n) ≤ cn² for all n ≥ n0.
So f(n) is simply a function from Ο(n²). Examples: 13n, 5, 4n² + 5. All of these belong to Ο(n²).
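As a quick sanity check (my own illustration, not part of the original answer), we can verify the definition numerically for f(n) = 4n² + 5 with hand-picked witness constants c = 5 and n0 = 3 (since 4n² + 5 ≤ 5n² exactly when n² ≥ 5):

```python
# Numeric check of the O(n^2) definition for f(n) = 4n^2 + 5.
# Witness constants chosen by hand: c = 5, n0 = 3.
def f(n):
    return 4 * n**2 + 5

c, n0 = 5, 3

# 0 <= f(n) <= c * n^2 must hold for every n >= n0.
assert all(0 <= f(n) <= c * n**2 for n in range(n0, 10_000))
```

Note that the check fails for n = 2 (f(2) = 21 > 5·4 = 20), which is why the definition only demands the bound from some n0 onward.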
Ω(n²)
The Ω(n²) notation defines a set of functions, for each of which the following holds: there exist positive constants c and n0 such that 0 ≤ cn² ≤ f(n) for all n ≥ n0.
So f(n) is simply a function from Ω(n²). Examples: n⁴ + n − 1, 3ⁿ, n² − 12. All of these belong to Ω(n²).
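The same kind of check works for the lower bound (again my own illustration): f(n) = n² − 12 dominates c·n² for c = 1/2 once n ≥ 5, because n² − 12 ≥ n²/2 exactly when n² ≥ 24:

```python
# Numeric check of the Omega(n^2) definition for f(n) = n^2 - 12.
# Witness constants chosen by hand: c = 0.5, n0 = 5.
def f(n):
    return n**2 - 12

c, n0 = 0.5, 5

# 0 <= c * n^2 <= f(n) must hold for every n >= n0.
assert all(0 <= c * n**2 <= f(n) for n in range(n0, 10_000))
```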
Θ(n²)
The Θ(n²) notation defines a set of functions, for each of which the following holds: there exist positive constants c1, c2, and n0 such that 0 ≤ c1n² ≤ f(n) ≤ c2n² for all n ≥ n0.
Again, f(n) is simply a function from Θ(n²). Representatives include n²/2 + 3 and 5n².
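For f(n) = n²/2 + 3, one valid set of witnesses (my choice for illustration) is c1 = 1/2, c2 = 1, n0 = 3: the lower bound holds for every n, and n²/2 + 3 ≤ n² as soon as n² ≥ 6:

```python
# Numeric check of the Theta(n^2) definition for f(n) = n^2/2 + 3.
# Witness constants chosen by hand: c1 = 0.5, c2 = 1, n0 = 3.
def f(n):
    return n**2 / 2 + 3

c1, c2, n0 = 0.5, 1, 3

# 0 <= c1 * n^2 <= f(n) <= c2 * n^2 must hold for every n >= n0.
assert all(0 <= c1 * n**2 <= f(n) <= c2 * n**2 for n in range(n0, 10_000))
```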
Proof
I assume that by saying a recursive formula is O(n²) and, at the same time, Ω(n²), you mean there is a function (let's call it f(n)) belonging to both Ω(n²) and Ο(n²).
From Ω(n²) we get constants c1 and n0' such that c1n² ≤ f(n) holds for all n ≥ n0'. From Ο(n²) we get constants c2 and n0'' such that f(n) ≤ c2n² holds for all n ≥ n0''. Taking n0 = max(n0', n0''), both inequalities hold simultaneously, so c1n² ≤ f(n) ≤ c2n² for all n ≥ n0, which is exactly the definition of Θ(n²).
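The whole argument can be sketched numerically (with a hypothetical example function and hand-picked witnesses of my own): once a function satisfies both bounds, the two inequalities sandwich it from the larger of the two thresholds onward.

```python
# Sketch of the proof: f(n) = n^2 + 10n is in Omega(n^2) with
# c1 = 1 (since n^2 + 10n >= n^2 for n >= 0) and in O(n^2) with
# c2 = 11 (since n^2 + 10n <= 11n^2 for n >= 1).
def f(n):
    return n**2 + 10 * n

c1, n0_omega = 1, 1    # witnesses for the Omega(n^2) bound
c2, n0_big_o = 11, 1   # witnesses for the O(n^2) bound

# Past max of the two thresholds, both bounds hold at once,
# which is exactly the Theta(n^2) sandwich.
n0 = max(n0_omega, n0_big_o)
assert all(c1 * n**2 <= f(n) <= c2 * n**2 for n in range(n0, 10_000))
```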