Assuming some algorithm has a polynomial time complexity T(n), is it possible for any of the terms to have a negative coefficient? Intuitively, the answer seems like an obvious "No", since no part of an algorithm can reduce the time already spent on previous steps, but I want to be certain.
It is possible for an algorithm to have a negative coefficient in its time complexity, but overall the algorithm will still have some positive time complexity. As an example from Wikipedia, take the function f(x) = 6x^4 - 2x^3 + 5. They show that f(x) = O(x^4) as follows: for all x >= 1,

|6x^4 - 2x^3 + 5| <= 6x^4 + 2x^3 + 5 <= 6x^4 + 2x^4 + 5x^4 = 13x^4

so f(x) <= 13x^4 for all x >= 1, and therefore f(x) = O(x^4). That is, even if there are negative coefficients in the original function, there is still some positive overall time complexity, determined by the term with the highest power.
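As a quick numeric sanity check of that bound, the sketch below plugs in the witness constants from the derivation (c = 13, x0 = 1) and verifies the inequality over a range of sample points:

```python
# Check that f(x) = 6x^4 - 2x^3 + 5 is bounded above by 13*x^4 for x >= 1,
# i.e. f(x) = O(x^4) with witness constants c = 13, x0 = 1.

def f(x):
    return 6 * x**4 - 2 * x**3 + 5

def g(x):
    return x**4

c, x0 = 13, 1
# A finite sample cannot prove the asymptotic claim, but it catches
# a wrongly chosen constant immediately.
assert all(f(x) <= c * g(x) for x in range(x0, 10_000))
print("f(x) <= 13*x^4 holds for all sampled x >= 1")
```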
What about lower bounds? By definition, f(x) is Omega(g(x)) if there exist a constant k > 0 and a point x0 such that for all x > x0:

f(x) >= k * g(x)

Let's guess that the above function f(x) is also Omega(x^4). This means that:

6x^4 - 2x^3 + 5 >= k * x^4

Solving for k by dividing through by x^4:

k <= 6 - 2/x + 5/x^4

The term 2/x approaches 0 as x grows, as does 5/x^4, so the right-hand side approaches 6 and we can safely choose k = 2 with, say, x0 = 30. To show that this holds, we check that for all x > 30:

6x^4 - 2x^3 + 5 >= 2x^4

which holds, since it is equivalent to 4x^4 - 2x^3 + 5 >= 0, true for all x >= 1. So f(x) is Omega(x^4), and since it is also O(x^4), we have found a tight bound: f(x) is Theta(x^4).

Why does this work, even though one coefficient was negative? For both Big O and Big Omega notation, we are looking for a bound such that, after some point, one function dominates the other.
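The lower-bound side can be sanity-checked the same way, using the constants chosen above (k = 2, x0 = 30):

```python
# Check that f(x) = 6x^4 - 2x^3 + 5 stays above k*x^4 with k = 2
# beyond x0 = 30, the witness constants from the Omega derivation.

def f(x):
    return 6 * x**4 - 2 * x**3 + 5

k, x0 = 2, 30
assert all(f(x) >= k * x**4 for x in range(x0, 10_000))
print("f(x) >= 2*x^4 holds for all sampled x >= 30")
```

(In fact the inequality already holds for x >= 1, so x0 = 30 is a generous choice.)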
Thinking about our original f(x), 6x^4 grows faster than 2x^4 (our k*g(x) function). After some point, the 6x^4 term outstrips 2x^4 in such a way that f(x) is always greater than 2x^4. (The original answer linked a Wolfram Alpha plot of the two curves, which is no longer available.) Despite the negative coefficient, k*g(x) is clearly a lower bound of f(x).

Now, is this always true for any polynomial with any negative coefficient? That is, is any such f(x) bounded below by its highest-degree term? No. If the term with the highest degree has the negative coefficient, then the bounds aren't quite the same. Take f(x) = -2x^2. We can show that f(x) = O(x^2):

-2x^2 <= c * x^2

which is satisfied by any c > 0 (as c is, by definition, a positive constant). However, if we try to do the same for the lower bound:

-2x^2 >= c * x^2

then we can't find a suitable c, because again c must be positive, while the left-hand side is never positive for x > 0.

When talking about polynomial complexity, only the term with the highest degree counts.
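A small numeric sketch of the -2x^2 case, under the "no absolute value" definition of Big O used above (f(x) <= c*g(x)), checking a handful of candidate constants:

```python
# f(x) = -2x^2 is O(x^2) under the f(x) <= c*g(x) definition,
# but it is not Omega(x^2): no positive c gives a lower bound.

def f(x):
    return -2 * x**2

# Upper bound: even a tiny c works, since f(x) is never positive.
assert all(f(x) <= 0.001 * x**2 for x in range(1, 1000))

# Lower bound: for every positive c tried, f(x) >= c*x^2 already
# fails at x = 1 (and everywhere else), since f(x) < 0 < c*x^2.
for c in (0.001, 1, 2, 100):
    assert not all(f(x) >= c * x**2 for x in range(1, 1000))
print("no positive c gives a lower bound")
```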
But note that you can have T(n) = n*n - n = n*(n-1), where the n-1 factor represents some work that is skipped on the first or last iteration. Either way, the complexity is still O(n^2).
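To make that concrete, here is a hypothetical loop whose exact operation count comes out to T(n) = n*(n-1) = n^2 - n, because one iteration of the inner loop is skipped each pass:

```python
# A loop whose exact operation count is T(n) = n*(n-1) = n^2 - n:
# compare every ordered pair of distinct indices, skipping i == j.

def count_ops(n):
    ops = 0
    for i in range(n):
        for j in range(n):
            if i != j:
                ops += 1  # the "real work" runs n*(n-1) times
    return ops

assert count_ops(10) == 10 * 9  # 90 operations, still O(n^2)
```

The -n term in T(n) has a negative coefficient, yet the count is always non-negative and the asymptotic class is unchanged.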