I am trying to understand how the "dogleg" method works in Python's scipy.optimize.minimize
function. I am adapting the example at the bottom of the help page.
The dogleg method requires a Jacobian and Hessian argument according to the notes. For this I use the numdifftools
package:
import numpy as np
from scipy.optimize import minimize
from numdifftools import Jacobian, Hessian
def fun(x, a):
    return (x[0] - 1)**2 + (x[1] - a)**2
x0 = np.array([2,0]) # initial guess
a = 2.5
res = minimize(fun, x0, args=(a), method='dogleg',
               jac=Jacobian(fun)([2,0]), hess=Hessian(fun)([2,0]))
print(res)
Edit:
If I make a change as suggested by a post below,
res = minimize(fun, x0, args=a, method='dogleg',
               jac=Jacobian(lambda x: fun(x, a)),
               hess=Hessian(lambda x: fun(x, a)))
I get an error: TypeError: <lambda>() takes 1 positional argument but 2 were given. What am I doing wrong?
Also, is it correct to calculate the Jacobian and Hessian at the initial guess x0?
That error is coming from the calls to Jacobian and Hessian, not from minimize. Replacing Jacobian(fun) with Jacobian(lambda x: fun(x, a)), and similarly for Hessian, should do the trick (since now the function being differentiated only has a single vector argument).

One other thing: (a) is just a; if you want it to be a tuple, use (a,).
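To make that concrete, here is a minimal sketch of the corrected call (fun_jac and fun_hess are hypothetical wrapper names). Note that minimize forwards args to jac and hess as well, which is why the bare Jacobian(lambda x: fun(x, a)) object in your edit ended up being handed two arguments and raised the TypeError. And to your second question: jac and hess should be callables that the optimizer evaluates at every iterate, not arrays computed once at x0.

import numpy as np
from scipy.optimize import minimize
from numdifftools import Jacobian, Hessian

def fun(x, a):
    return (x[0] - 1)**2 + (x[1] - a)**2

def fun_jac(x, a):
    # numdifftools differentiates a function of x alone, so close over a;
    # ravel() flattens the (1, 2) Jacobian of a scalar function into a gradient
    return Jacobian(lambda x: fun(x, a))(x).ravel()

def fun_hess(x, a):
    return Hessian(lambda x: fun(x, a))(x)

x0 = np.array([2.0, 0.0])  # initial guess
a = 2.5
res = minimize(fun, x0, args=(a,), method='dogleg',
               jac=fun_jac, hess=fun_hess)
print(res)  # should converge to x = [1.0, 2.5]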
I get that this is a toy example, but I would like to point out that using a tool like Jacobian or Hessian to calculate the derivatives, instead of deriving the function itself, is fairly costly. For example, with your method:
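A sketch of that comparison, timed with the standard timeit module (reusing the hypothetical fun_jac and fun_hess wrappers from above):

import timeit
import numpy as np
from scipy.optimize import minimize
from numdifftools import Jacobian, Hessian

def fun(x, a):
    return (x[0] - 1)**2 + (x[1] - a)**2

def fun_jac(x, a):
    # numerical gradient: re-differentiates fun on every call
    return Jacobian(lambda x: fun(x, a))(x).ravel()

def fun_hess(x, a):
    # numerical Hessian
    return Hessian(lambda x: fun(x, a))(x)

x0 = np.array([2.0, 0.0])
a = 2.5

# time the run that differentiates numerically at every iteration
print(timeit.timeit(
    lambda: minimize(fun, x0, args=(a,), method='dogleg',
                     jac=fun_jac, hess=fun_hess),
    number=100))

But you could calculate the derivative functions as such:

def fun_der(x, a):
    # gradient derived by hand: partial derivatives of (x0 - 1)^2 + (x1 - a)^2
    return np.array([2 * (x[0] - 1), 2 * (x[1] - a)])

def fun_hess_analytic(x, a):
    # Hessian derived by hand; constant for this quadratic
    return np.array([[2.0, 0.0], [0.0, 2.0]])

# time the run that uses the closed-form derivatives
print(timeit.timeit(
    lambda: minimize(fun, x0, args=(a,), method='dogleg',
                     jac=fun_der, hess=fun_hess_analytic),
    number=100))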
As you can see, that is almost 50x faster. It really starts to add up with complex functions. As such, I always try to derive the functions explicitly myself, regardless of how difficult that may be. One fun example is the kernel-based implementation of Inductive Matrix Completion.
Calculating the gradient and Hessian of that model's objective numerically is extremely unreasonable in comparison to explicitly deriving and utilizing those functions. So, as @bnaul pointed out, if your function does have closed-form derivatives, you really do want to calculate and use them.