caret::train: specify further non-tuning parameter

Posted 2019-03-22 06:18

I have a problem with specifying the learning rate using the caret package with the method "mlpWeightDecay" from RSNNS package. The tuning parameters of "mlpWeightDecay" are size and decay.

An example leaving size constant at 4 and tuning decay over c(0,0.0001, 0.001, 0.002):

data(iris)
TrainData <- iris[,1:4]
TrainClasses <- iris[,5]

fit1 <- train(TrainData, TrainClasses,
            method = "mlpWeightDecay",
            preProcess = c("center", "scale"),
            tuneGrid=expand.grid(.size = 4, .decay = c(0,0.0001, 0.001, 0.002)),
            trControl = trainControl(method = "cv")
)

But I also want to set the model's learning rate rather than just use the default of 0.2.

I know that I can pass further arguments to the RSNNS mlpWeightDecay method via caret's "..." parameter. The RSNNS parameter I need is "learnFuncParams", which takes four values: learning rate, weight decay, dmin, and dmax.

Going on with the example it looks like this:

fit1 <- train(TrainData, TrainClasses,
    method = "mlpWeightDecay",
    preProcess = c("center", "scale"),
    tuneGrid=expand.grid(.size = 4, .decay = c(0,0.0001, 0.001, 0.002)),
    trControl = trainControl(method = "cv"),
    learnFuncParams=c(0.4,0,0,0)
)

BUT the documentation of the caret train function tells me for the "..." parameter:
arguments passed to the classification or regression routine (such as randomForest). Errors will occur if values for tuning parameters are passed here.

The problem is that one of the four "learnFuncParams" values (weight decay) IS a tuning parameter.

Consequently I get an error and warnings:

Error in train.default(TrainData, TrainClasses, method = "mlpWeightDecay", : final tuning parameters could not be determined In addition: There were 50 or more warnings (use warnings() to see the first 50)

Warning messages:

1: In method$fit(x = if (!is.data.frame(x)) as.data.frame(x) else x, ... : Over-riding weight decay value in the 'learnFuncParams' argument you passed in. Other values are retained

2: In eval(expr, envir, enclos) : model fit failed for Fold01: size=4, decay=0e+00 Error in mlp.default(x = structure(list(Sepal.Length = c(-0.891390168709482, : formal argument "learnFuncParams" matched by multiple actual arguments

How can I set the learning rate without conflicting with the tuning parameter "decay", given that both are set via the same argument "learnFuncParams"?

Thanks!

1 Answer
迷人小祖宗
Answered 2019-03-22 06:58

It looks like you can specify your own learnFuncParams in "...". caret checks whether you have provided your own set of parameters and will only override learnFuncParams[3] (the decay); the values you provided for learnFuncParams[1], [2], and [4] are retained.

A very convenient way to find out what caret does is to run getModelInfo("mlpWeightDecay") and scroll up to the $mlpWeightDecay$fit part. It shows how caret calls the actual training function:

$mlpWeightDecay$fit
    if (any(names(theDots) == "learnFuncParams")) {
        prms <- theDots$learnFuncParams
        prms[3] <- param$decay
        warning("Over-riding weight decay value in the 'learnFuncParams' argument you passed in. Other values are retained")
    }

It checks whether you have provided your own learnFuncParams. If you did, it uses them but inserts its own decay value. You can safely ignore the warning.
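Based on that fit code, a sketch of a call that should work is to pass learnFuncParams once through "..." and let caret overwrite the decay slot from the tuning grid. This is an illustrative example, not tested against your data; the warning about the decay slot being overridden is expected and harmless:

    library(caret)    # assumes caret and RSNNS are installed
    data(iris)
    TrainData    <- iris[, 1:4]
    TrainClasses <- iris[, 5]

    # Learning rate 0.4 in slot 1; the decay slot is filled in by caret
    # from tuneGrid on every resample, so its value here does not matter.
    fit1 <- train(TrainData, TrainClasses,
                  method     = "mlpWeightDecay",
                  preProcess = c("center", "scale"),
                  tuneGrid   = expand.grid(size = 4,
                                           decay = c(0, 0.0001, 0.001, 0.002)),
                  trControl  = trainControl(method = "cv"),
                  learnFuncParams = c(0.4, 0, 0, 0))

If this still fails with "final tuning parameters could not be determined", the fit itself is likely diverging for some folds, which brings us to the point below.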

I think the error you got ("final tuning parameters could not be determined") has a different cause. Have you tried a lower learning rate?
