My initial question can be found here: Optimization in R with arbitrary constraints. It led to another question: how to pass arguments into nloptr.
I need to minimize a function F(x, y, A), where x and y are vectors and A is a matrix, subject to the constraints sum(x*y) >= sum(y/3) and sum(x) = 1.
I have tried to use nloptr:
F <- function(x,y,A){
...
}
Gc <- function(x,y){
return(sum(y/3) - sum(x*y))
}
Hc <- function(x){
return(1 - sum(x))
}
nloptr(x0=rep(1/3,3), eval_f=F, lb = 0.05, ub = 1, eval_g_ineq = Gc, eval_g_eq = Hc, opts = list(), y=y, A=A)
And I receive an error:
'A' passed to (...) in 'nloptr' but this is not required in the eval_g_ineq function.
If I say nloptr( ... , y, A), I get:
eval_f requires argument 'cov.mat' but this has not been passed to the 'nloptr' function.
Any advice would be great. Thanks.
So there are several things going on here:

First, the objective function, F, the equality constraint function, Hc, and the inequality constraint function, Gc, all have to take the same arguments. So pass x, y, A to all three and just ignore them where they are not needed.

Second, you have to define y and A somewhere.

Third, you must specify which algorithm to use. Do this with opts=list(algorithm=...). It turns out that if you are (a) using constraints, and (b) not providing functions to calculate the Jacobian matrices, then only some of the algorithms are appropriate. In your case opts=list(algorithm="NLOPT_GN_ISRES") seems to work.

Finally, the default is maxeval = 100, which turns out to be not nearly enough. I had to set it to 100,000 to get convergence.

Putting this all together, albeit with a made-up objective function:
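Here is one way that could look. The quadratic objective and the particular values of y and A below are made up purely for illustration; only the constraints, bounds, algorithm, and maxeval follow from the discussion above:

library(nloptr)

# Made-up data for the example: y is a vector, A a (here trivial) matrix
y <- c(1, 2, 3)
A <- diag(3)

# Made-up objective: a simple quadratic form in x.
# All three functions take the same arguments (x, y, A), even if some are ignored.
F <- function(x, y, A) {
  as.numeric(t(x) %*% A %*% x)
}

# Inequality constraint in the form g(x) <= 0: sum(y/3) - sum(x*y) <= 0
Gc <- function(x, y, A) {
  sum(y / 3) - sum(x * y)
}

# Equality constraint in the form h(x) = 0: 1 - sum(x) = 0
Hc <- function(x, y, A) {
  1 - sum(x)
}

res <- nloptr(x0          = rep(1/3, 3),
              eval_f      = F,
              lb          = rep(0.05, 3),
              ub          = rep(1, 3),
              eval_g_ineq = Gc,
              eval_g_eq   = Hc,
              opts        = list(algorithm = "NLOPT_GN_ISRES",
                                 maxeval   = 100000),
              y = y, A = A)

res$solution    # the estimated x
res$objective   # value of F at the solution

Note that nloptr forwards the extra named arguments y and A to the objective and to both constraint functions, which is exactly why all three must share the same signature.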