Here is pseudocode for Horner's method for computing the value of a polynomial at x (where a[i] denotes the coefficient of x^i):
y = a[0]
for i = n to 1
    y = a[i] + x*y
Many articles on the internet state that the running time of Horner's method is proportional to n.
But since the number of terms in y is proportional to (n-i) (when we have already completed i iterations), shouldn't the total time taken be (n-1) + (n-2) + ... + 1, which is proportional to n^2?
Or do we always consider that any multiplication (irrespective of the number of terms) takes constant time?
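For illustration, here is a rough Python sketch of what I mean (my own toy model, keeping y as a list of coefficients rather than a number):

def symbolic_run(a):
    """Run the pseudocode above with y kept as a list of coefficients
    rather than a number, counting how many coefficient slots each
    iteration touches."""
    n = len(a) - 1
    y = [a[0]]                 # y = a[0]
    ops = 0
    for i in range(n, 0, -1):  # for i = n to 1
        y = [a[i]] + y         # x*y shifts every coefficient up one power,
        ops += len(y)          # so this step's cost grows with len(y)
    return ops                 # 2 + 3 + ... + (n+1), i.e. Theta(n^2)

# symbolic_run([5, -3, 0, 2, 7]) returns 14, growing quadratically in n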
You're trying to compute the value of the polynomial, so y in your code is a number, not another polynomial. You must have defined x somewhere earlier as the value at which you want the polynomial evaluated.
Yes, we assume that multiplying two numbers is a constant-time operation. So in each iteration you do one multiplication and one addition, both between two numbers (integers, or floating-point numbers, depending on the field you're working over), and one assignment, all of which take constant time. You do these three operations n times, yielding a total of O(n).
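To make that concrete, here is a small Python sketch (my own illustration; the naive comparison and the operation counters are not part of the question's pseudocode) that counts the multiplications done by Horner's method versus recomputing each power of x from scratch:

def horner_count(a, x):
    """Evaluate a[0] + a[1]*x + ... + a[n]*x^n with Horner's method,
    returning the value and the number of multiplications performed."""
    mults = 0
    y = a[-1]                     # start with the leading coefficient a[n]
    for coeff in reversed(a[:-1]):
        y = coeff + x * y         # one multiplication and one addition per step
        mults += 1
    return y, mults

def naive_count(a, x):
    """Same polynomial, but build each power of x by repeated multiplication."""
    mults = 0
    y = 0
    for i, coeff in enumerate(a):
        power = 1
        for _ in range(i):        # i multiplications to form x^i
            power *= x
            mults += 1
        y += coeff * power
        mults += 1
    return y, mults

a = [5, -3, 0, 2, 7]              # 5 - 3x + 2x^3 + 7x^4, so n = 4
print(horner_count(a, 2))         # (127, 4)  -> n multiplications
print(naive_count(a, 2))          # (127, 15) -> roughly n^2/2 multiplications

Both return the same value; only the number of multiplications differs, which is where the n versus n^2 gap comes from.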
Also, I believe you should either put a[n] in y in the first step and iterate from n-1 down to 0, or reverse the iteration of the for loop, depending on your representation of the polynomial.
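For example, a corrected version along those lines, as a minimal Python sketch (assuming a[i] holds the coefficient of x^i, as in the question):

def horner(a, x):
    """Evaluate a[0] + a[1]*x + ... + a[n]*x^n at x."""
    n = len(a) - 1
    y = a[n]                        # y = a[n]
    for i in range(n - 1, -1, -1):  # for i = n-1 down to 0
        y = a[i] + x * y
    return y

# horner([5, -3, 0, 2, 7], 2) == 127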