I have a problem specifying the learning rate when using the caret package with the method "mlpWeightDecay" from the RSNNS package. The tuning parameters of "mlpWeightDecay" are size and decay.
Here is an example that holds size constant at 4 and tunes decay over c(0, 0.0001, 0.001, 0.002):
library(caret)

data(iris)
TrainData <- iris[, 1:4]
TrainClasses <- iris[, 5]

# Tune decay over four values while holding size fixed at 4
fit1 <- train(TrainData, TrainClasses,
              method = "mlpWeightDecay",
              preProcess = c("center", "scale"),
              tuneGrid = expand.grid(.size = 4,
                                     .decay = c(0, 0.0001, 0.001, 0.002)),
              trControl = trainControl(method = "cv")
)
But I also want to set the learning rate of the model rather than just using the default of 0.2.
I know that I can pass further arguments through to RSNNS via caret's "..." parameter. "learnFuncParams" is the RSNNS argument I would need; it takes four values (learning rate, weight decay, dmin, dmax).
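For reference, this is how those four values are passed when calling RSNNS directly, a minimal sketch where the concrete numbers (0.4 for the learning rate, 0.001 for decay) are just placeholders:

library(RSNNS)

# Direct RSNNS call, for comparison: with learnFunc = "BackpropWeightDecay",
# learnFuncParams is c(learning rate, weight decay, dmin, dmax)
irisValues <- normalizeData(iris[, 1:4], type = "norm")
irisTargets <- decodeClassLabels(iris[, 5])
mlpFit <- mlp(irisValues, irisTargets,
              size = 4,
              learnFunc = "BackpropWeightDecay",
              learnFuncParams = c(0.4, 0.001, 0, 0))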
Going on with the caret example, it looks like this:
fit1 <- train(TrainData, TrainClasses,
              method = "mlpWeightDecay",
              preProcess = c("center", "scale"),
              tuneGrid = expand.grid(.size = 4,
                                     .decay = c(0, 0.0001, 0.001, 0.002)),
              trControl = trainControl(method = "cv"),
              # passed through "..." to RSNNS: c(learning rate, decay, dmin, dmax)
              learnFuncParams = c(0.4, 0, 0, 0)
)
BUT the documentation of caret's train function says this about the "..." parameter:
arguments passed to the classification or regression routine (such as randomForest). Errors will occur if values for tuning parameters are passed here.
The problem is that one of the four "learnFuncParams" values (weight decay) IS a tuning parameter.
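Judging by the warning below, caret builds the learnFuncParams vector itself and overrides the weight-decay slot with the decay tuning value. The fit code caret uses for this model can be inspected like this (assuming a caret version that provides getModelInfo()):

library(caret)

# Print the fit function caret uses for this model, to see how it
# constructs the learnFuncParams vector around the decay parameter
getModelInfo("mlpWeightDecay")$mlpWeightDecay$fit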
Consequently I get an error and warnings:

Error in train.default(TrainData, TrainClasses, method = "mlpWeightDecay", :
  final tuning parameters could not be determined
In addition: There were 50 or more warnings (use warnings() to see the first 50)

Warning messages:
1: In method$fit(x = if (!is.data.frame(x)) as.data.frame(x) else x, ... :
  Over-riding weight decay value in the 'learnFuncParams' argument you passed in. Other values are retained
2: In eval(expr, envir, enclos) : model fit failed for Fold01: size=4, decay=0e+00
  Error in mlp.default(x = structure(list(Sepal.Length = c(-0.891390168709482, :
  formal argument "learnFuncParams" matched by multiple actual arguments
How can I set the learning rate without conflicting with the tuning parameter "decay" when both are set through the same "learnFuncParams" argument?
Thanks!