How would I fit a straight line and a quadratic to the data set below using the leastsq function from scipy.optimize? I know how to do it with polyfit, but I need to use the leastsq function.
Here are the x and y data sets:
x: 1.0,2.5,3.5,4.0,1.1,1.8,2.2,3.7
y: 6.008,15.722,27.130,33.772,5.257,9.549,11.098,28.828
Can someone help me out please?
The leastsq() method finds the set of parameters that minimizes the error function (the difference between yExperimental and yFit).
I used a tuple to pass the parameters, and lambda functions for the linear and quadratic fits.
leastsq starts from a first guess (the initial tuple of parameters) and tries to minimize the error function. If leastsq succeeds, it returns the list of parameters that best fit the data (I print it to check).
I hope it works.
Best regards
from scipy.optimize import leastsq
import numpy as np
import matplotlib.pyplot as plt

def main():
    # data provided
    x = np.array([1.0, 2.5, 3.5, 4.0, 1.1, 1.8, 2.2, 3.7])
    y = np.array([6.008, 15.722, 27.130, 33.772, 5.257, 9.549, 11.098, 28.828])

    # here, create lambda functions for the line and quadratic fits
    # tpl is a tuple that contains the parameters of the fit
    funcLine = lambda tpl, x: tpl[0]*x + tpl[1]
    funcQuad = lambda tpl, x: tpl[0]*x**2 + tpl[1]*x + tpl[2]

    # func is going to be a placeholder for funcLine, funcQuad or whatever
    # function we would like to fit
    func = funcLine

    # ErrorFunc is the difference between func and the y "experimental" data
    ErrorFunc = lambda tpl, x, y: func(tpl, x) - y

    # tplInitial contains the "first guess" of the parameters
    tplInitial1 = (1.0, 2.0)

    # leastsq finds the set of parameters in the tuple tpl that minimizes
    # ErrorFunc = yFit - yExperimental
    tplFinal1, success = leastsq(ErrorFunc, tplInitial1[:], args=(x, y))
    print("linear fit", tplFinal1)
    xx1 = np.linspace(x.min(), x.max(), 50)
    yy1 = func(tplFinal1, xx1)

    # ------------------------------------------------
    # now the quadratic fit
    # ------------------------------------------------
    func = funcQuad
    tplInitial2 = (1.0, 2.0, 3.0)
    tplFinal2, success = leastsq(ErrorFunc, tplInitial2[:], args=(x, y))
    print("quadratic fit", tplFinal2)
    xx2 = xx1
    yy2 = func(tplFinal2, xx2)

    plt.plot(xx1, yy1, 'r-', x, y, 'bo', xx2, yy2, 'g-')
    plt.show()

if __name__ == "__main__":
    main()
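A quick way to see which model matches the data more closely (my addition, not part of the answer above) is to compare the residual sums of squares of the two fits. A minimal sketch, meant to go inside main() after both fits and using the names already defined there:

    # compare how well each fitted model reproduces the data;
    # a smaller sum of squared residuals means a closer fit
    ssLine = np.sum((funcLine(tplFinal1, x) - y)**2)
    ssQuad = np.sum((funcQuad(tplFinal2, x) - y)**2)
    print("sum of squared residuals, line:", ssLine)
    print("sum of squared residuals, quadratic:", ssQuad)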
With optimize.curve_fit the code is simpler: there is no need to define the residual (error) function.

import numpy as np
from scipy import optimize
import matplotlib.pyplot as plt

fig, ax = plt.subplots()
# data
x=np.array([1.0,2.5,3.5,4.0,1.1,1.8,2.2,3.7])
y=np.array([6.008,15.722,27.130,33.772,5.257,9.549,11.098,28.828])
# modeling functions
def funcLine(x, a, b):
    return a*x + b

def funcQuad(x, a, b, c):
    return a*x**2 + b*x + c
# optimize constants for the linear function
constantsLine, _ = optimize.curve_fit(funcLine, x, y)
X=np.linspace(x.min(),x.max(),50)
Y1=funcLine(X, *constantsLine)
# optimize constants for the quadratic function
constantsQuad, _ = optimize.curve_fit(funcQuad, x, y)
Y2=funcQuad(X,*constantsQuad)
plt.plot(X,Y1,'r-',label='linear approximation')
plt.plot(x,y,'bo',label='data points')
plt.plot(X,Y2,'g-', label='quadratic approximation')
plt.legend()
ax.set_title("Nonlinear Least Square Problems", fontsize=18)
plt.show()
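Side note (my addition, not part of the answer above): curve_fit also returns the covariance matrix of the fitted parameters, which the code discards with _. A minimal sketch of using it for rough one-sigma uncertainties, reusing funcQuad, x and y from above:

constantsQuad, covQuad = optimize.curve_fit(funcQuad, x, y)
# square roots of the diagonal give approximate one-sigma uncertainties
# for the parameters a, b and c
paramErrors = np.sqrt(np.diag(covQuad))
print("quadratic parameters:", constantsQuad)
print("approximate uncertainties:", paramErrors)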
Here's a super simple example. Picture a paraboloid, like a bowl with sides growing like a parabola. If we put the bottom at coordinates (x, y) = (a, b) and then minimize the height of the paraboloid over all values of x and y, we would expect the minimum to be at x=a and y=b. Here's code that would do this.
import random
from scipy.optimize import least_squares

a, b = random.randint(1, 1000), random.randint(1, 1000)
print("Expect", [a, b])

# least_squares minimizes the sum of squares of the returned residuals,
# so the minimum of this single non-negative residual is still at (a, b)
def f(args):
    x, y = args
    return (x - a)**2 + (y - b)**2

x0 = [-1, -3]
result = least_squares(fun=f, x0=x0)
print(result.x)
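To connect this back to the original question, here is a sketch of my own (the residual functions and starting guesses below are just illustrative) showing how least_squares would handle the linear and quadratic fits on the posted data; it expects a function that returns the vector of residuals:

import numpy as np
from scipy.optimize import least_squares

x = np.array([1.0, 2.5, 3.5, 4.0, 1.1, 1.8, 2.2, 3.7])
y = np.array([6.008, 15.722, 27.130, 33.772, 5.257, 9.549, 11.098, 28.828])

# residual vectors (model minus data) for the two models
def residLine(p):
    return p[0]*x + p[1] - y

def residQuad(p):
    return p[0]*x**2 + p[1]*x + p[2] - y

lineFit = least_squares(residLine, x0=[1.0, 1.0])
quadFit = least_squares(residQuad, x0=[1.0, 1.0, 1.0])
print("linear fit", lineFit.x)
print("quadratic fit", quadFit.x)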