Is there an alternative to the fminunc function (from Octave/MATLAB) in Python?

I have a cost function for a binary classifier. Now I want to run gradient descent to find the value of theta that minimizes it. The Octave/MATLAB implementation looks like this:
% Set options for fminunc
options = optimset('GradObj', 'on', 'MaxIter', 400);
% Run fminunc to obtain the optimal theta
% This function will return theta and the cost
[theta, cost] = ...
fminunc(@(t)(costFunction(t, X, y)), initial_theta, options);
I have converted my costFunction to Python using the NumPy library, and I am looking for fminunc or any other gradient descent algorithm implementation in NumPy.
I was also trying to implement logistic regression as discussed in the Coursera ML course, but in Python, and I found SciPy helpful. After trying different algorithm implementations in the minimize function, I found Newton Conjugate Gradient most useful. After examining its return value, it seems to be equivalent to that of fminunc in Octave. I have included my implementation in Python below to find the optimal theta.
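A minimal sketch of that approach, assuming the standard logistic-regression cost and gradient from the course; the helper names and the synthetic data here are just illustrative stand-ins so the script runs end to end:

import numpy as np
from scipy.optimize import minimize

def sigmoid(z):
    # Logistic function
    return 1.0 / (1.0 + np.exp(-z))

def cost_function(theta, X, y):
    # Cross-entropy cost for logistic regression
    m = y.size
    h = sigmoid(X.dot(theta))
    return -(y.dot(np.log(h)) + (1 - y).dot(np.log(1 - h))) / m

def gradient(theta, X, y):
    # Gradient of the cost with respect to theta
    return X.T.dot(sigmoid(X.dot(theta)) - y) / y.size

# Tiny synthetic dataset (intercept column + 2 noisy features)
rng = np.random.default_rng(0)
m = 100
X = np.column_stack([np.ones(m), rng.normal(size=(m, 2))])
y = (X[:, 1] + X[:, 2] + rng.normal(scale=0.5, size=m) > 0).astype(float)

initial_theta = np.zeros(X.shape[1])
result = minimize(cost_function, initial_theta, args=(X, y),
                  method='Newton-CG', jac=gradient, options={'maxiter': 400})
theta, cost = result.x, result.fun
print(theta, cost)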
Looks like you have to switch to scipy. There you will find all the basic optimization algorithms readily implemented.
http://docs.scipy.org/doc/scipy/reference/optimize.html
There is more information about the functions of interest here: http://docs.scipy.org/doc/scipy-0.10.0/reference/tutorial/optimize.html
Also, it looks like you are doing the Coursera Machine Learning course, but in Python. You might check out http://aimotion.blogspot.com/2011/11/machine-learning-with-python-logistic.html; this guy's doing the same thing.
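For instance, here is a minimal sketch of how scipy.optimize.minimize is called; the quadratic below is just a placeholder for your own cost function:

import numpy as np
from scipy.optimize import minimize

# Placeholder cost with a known minimum at (1.0, 2.5)
def cost(theta):
    return (theta[0] - 1.0) ** 2 + (theta[1] - 2.5) ** 2

result = minimize(cost, x0=np.zeros(2), method='BFGS')
print(result.x)  # roughly [1.0, 2.5]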
Implemented as below and getting a similar result to Octave:
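A sketch of that implementation using scipy.optimize.fmin_tnc; the data-loading lines are an assumption based on the course's ex2data1.txt file (two exam scores and an admission label per row), not part of the original:

import numpy as np
from scipy.optimize import fmin_tnc

def sigmoid(z):
    # Logistic function
    return 1.0 / (1.0 + np.exp(-z))

def cost_function(theta, X, y):
    # Cross-entropy cost for logistic regression
    m = y.size
    h = sigmoid(X.dot(theta))
    return -(y.dot(np.log(h)) + (1 - y).dot(np.log(1 - h))) / m

def gradient(theta, X, y):
    # Gradient of the cost with respect to theta
    return X.T.dot(sigmoid(X.dot(theta)) - y) / y.size

# Assumed data loading: the course's ex2data1.txt (exam1, exam2, label per row)
data = np.loadtxt('ex2data1.txt', delimiter=',')
X = np.column_stack([np.ones(data.shape[0]), data[:, :2]])  # add intercept column
y = data[:, 2]
initial_theta = np.zeros(X.shape[1])

print('Executing minimize function...')
# fmin_tnc returns (optimal theta, number of function evaluations, return code)
result = fmin_tnc(func=cost_function, x0=initial_theta, fprime=gradient, args=(X, y))
print('Thetas found by fmin_tnc function:', result)
theta = result[0]
print('Cost at theta found :', cost_function(theta, X, y))
print('theta:', theta)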
output:

Executing minimize function...
Thetas found by fmin_tnc function: (array([-25.16131854,   0.20623159,   0.20147149]), 36, 0)
Cost at theta found : 0.218330193827
Expected cost (approx): 0.203
theta: [-25.16131854   0.20623159   0.20147149]
Expected theta (approx): -25.161  0.206  0.201