Selecting SVM parameters using cross-validation and F1-scores

Posted 2019-02-15 06:50

I need to keep track of the F1-score while tuning C and sigma in an SVM. For example, the following code keeps track of accuracy; I need to change that to the F1-score, but I have not been able to do it.

%# read some training data
[labels,data] = libsvmread('./heart_scale');

%# grid of parameters
folds = 5;
[C,gamma] = meshgrid(-5:2:15, -15:2:3);

%# grid search, and cross-validation
cv_acc = zeros(numel(C),1);
for i=1:numel(C)
    cv_acc(i) = svmtrain(labels, data, ...
                sprintf('-c %f -g %f -v %d', 2^C(i), 2^gamma(i), folds));
end

%# pair (C,gamma) with best accuracy
[~,idx] = max(cv_acc);

%# now you can train your model using best_C and best_gamma
best_C = 2^C(idx);
best_gamma = 2^gamma(idx);
%# ...
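
For reference, the -v option in libsvm only ever returns cross-validation accuracy, so tracking the F1-score instead means running the folds manually with svmtrain/svmpredict. A minimal sketch of that idea (my own, assuming the +/-1 labels of heart_scale and crossvalind from the Bioinformatics Toolbox):

%# sketch: manual 5-fold CV so a per-fold F1-score can be computed
folds = 5;
cv_f1 = zeros(numel(C),1);
fold_id = crossvalind('Kfold', labels, folds);
for i=1:numel(C)
    f1 = zeros(folds,1);
    for k=1:folds
        te = (fold_id == k);  tr = ~te;
        model = svmtrain(labels(tr), data(tr,:), ...
                sprintf('-c %f -g %f', 2^C(i), 2^gamma(i)));
        pred = svmpredict(labels(te), data(te,:), model);
        tp = sum(pred ==  1 & labels(te) ==  1);
        fp = sum(pred ==  1 & labels(te) == -1);
        fn = sum(pred == -1 & labels(te) ==  1);
        prec = tp/(tp + fp);             %# NaN if no positives predicted
        rec  = tp/(tp + fn);
        f1(k) = 2*prec*rec/(prec + rec);
    end
    cv_f1(i) = mean(f1);
end
[~,idx_f1] = max(cv_f1);                 %# best (C,gamma) pair by mean F1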

I have seen the following two links:

Retraining after Cross Validation with libsvm

10 fold cross-validation in one-against-all SVM (using LibSVM)

I understand that I first have to find the best C and gamma/sigma parameters over the training data, and then use those two values in a leave-one-out cross-validation classification experiment. So what I want now is to do a grid search for tuning C and sigma. Please note that I would prefer to use MATLAB's SVM functions and not LIBSVM. Below is my code for the leave-one-out cross-validation classification.

clc
clear all
close all
a = load('V1.csv');
X = double(a(:,1:12));   % 12 feature columns
Y = double(a(:,13));     % class labels in column 13
% shuffle the training data
datall = [X,Y];
A = datall;
n = 40;                  % number of samples
ordering = randperm(n);
B = A(ordering, :);
good = B;
input = good(:,1:12);
target = good(:,13);
CVO = cvpartition(target,'leaveout',1);  
cp = classperf(target);                      %# init performance tracker
svmModel=[];
for i = 1:CVO.NumTestSets                    %# for each fold
    trIdx = CVO.training(i);
    teIdx = CVO.test(i);
    %# train an SVM model over training instances
    svmModel = svmtrain(input(trIdx,:), target(trIdx), ...
        'Autoscale',true, 'Showplot',false, 'Method','ls', ...
        'BoxConstraint',0.1, 'Kernel_Function','rbf', 'RBF_Sigma',0.1);
    %# test using test instances
    pred = svmclassify(svmModel, input(teIdx,:), 'Showplot',false);
    %# evaluate and update performance object
    cp = classperf(cp, pred, teIdx);
end
%# get accuracy
accuracy=cp.CorrectRate*100
sensitivity=cp.Sensitivity*100
specificity=cp.Specificity*100
PPV=cp.PositivePredictiveValue*100
NPV=cp.NegativePredictiveValue*100
%# get confusion matrix
%# columns:actual, rows:predicted, last-row: unclassified instances
cp.CountingMatrix
recallP = sensitivity;
recallN = specificity;
precisionP = PPV;
precisionN = NPV;
f1P = 2*((precisionP*recallP)/(precisionP + recallP));
f1N = 2*((precisionN*recallN)/(precisionN + recallN));
aF1 = ((f1P+f1N)/2);
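
As an aside, those last few lines can be factored into a small helper (a sketch of mine, not part of the original code; it would live in its own macroF1.m) that turns a classperf object into a macro-averaged F1-score:

%# hypothetical helper: macro-averaged F1 from a classperf object,
%# mirroring the per-class computation above
function f1 = macroF1(cp)
    recP = cp.Sensitivity;               %# positive-class recall
    recN = cp.Specificity;               %# negative-class recall
    preP = cp.PositivePredictiveValue;   %# positive-class precision
    preN = cp.NegativePredictiveValue;   %# negative-class precision
    f1P = 2*(preP*recP)/(preP + recP);
    f1N = 2*(preN*recN)/(preN + recN);
    f1 = (f1P + f1N)/2;                  %# macro average of the two F1s
end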

I have changed the code, but I am making some mistakes and I am getting errors:

a = load('V1.csv');
X = double(a(:,1:12));
Y = double(a(:,13));
% train data
datall=[X,Y];
A=datall;
n = 40;
ordering = randperm(n);
B = A(ordering, :);  
good=B; 
inpt=good(:,1:12);
target=good(:,13);
k=10;
cvFolds = crossvalind('Kfold', target, k);   %# get indices of 10-fold CV
cp = classperf(target);                      %# init performance tracker
svmModel=[];
for i = 1:k
    testIdx = (cvFolds == i);            %# get indices of test instances
    trainIdx = ~testIdx;
    C = 0.1:0.1:1;
    S = 0.1:0.1:1;
    fscores = zeros(numel(C), numel(S)); %// Pre-allocation
    for c = 1:numel(C)
        for s = 1:numel(S)
            vals = crossval(@(XTRAIN, YTRAIN, XVAL, YVAL)(fun(XTRAIN, YTRAIN, XVAL, YVAL, C(c), S(c))), inpt(trainIdx,:), target(trainIdx));
            fscores(c,s) = mean(vals);
        end
    end
end

[cbest, sbest] = find(fscores == max(fscores(:)));
C_final = C(cbest);
S_final = S(sbest);    


and the function:

function fscore = fun(XTRAIN, YTRAIN, XVAL, YVAL, C, S)
    svmModel = svmtrain(XTRAIN, YTRAIN, ...
        'Autoscale',true, 'Showplot',false, 'Method','ls', ...
        'BoxConstraint', C, 'Kernel_Function','rbf', 'RBF_Sigma', S);

    pred = svmclassify(svmModel, XVAL, 'Showplot',false);

    cp = classperf(YVAL, pred);
    %# get performance metrics
    accuracy = cp.CorrectRate*100
    sensitivity = cp.Sensitivity*100
    specificity = cp.Specificity*100
    PPV = cp.PositivePredictiveValue*100
    NPV = cp.NegativePredictiveValue*100
    %# get confusion matrix
    %# columns:actual, rows:predicted, last-row: unclassified instances
    cp.CountingMatrix
    recallP = sensitivity;
    recallN = specificity;
    precisionP = PPV;
    precisionN = NPV;
    f1P = 2*((precisionP*recallP)/(precisionP + recallP));
    f1N = 2*((precisionN*recallN)/(precisionN + recallN));
    fscore = ((f1P + f1N)/2);
end

2 Answers
爷的心禁止访问 · 2019-02-15 07:11

The only problem I found was with target(trainIdx): it is a row vector, so I just replaced target(trainIdx) with target(trainIdx)', which is a column vector.
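
Applied to the crossval call from the question, that fix would look like this (my reading of the answer; note the trailing transpose, and that the sigma grid is indexed with s rather than c):

vals = crossval(@(XTRAIN, YTRAIN, XVAL, YVAL) fun(XTRAIN, YTRAIN, XVAL, YVAL, C(c), S(s)), ...
                inpt(trainIdx,:), target(trainIdx)');   %# ' forces a column vector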

孤傲高冷的网名 · 2019-02-15 07:20

So basically you want to take this line of yours:

svmModel = svmtrain(input(trIdx,:), target(trIdx), ...
    'Autoscale',true, 'Showplot',false, 'Method','ls', ...
    'BoxConstraint',0.1, 'Kernel_Function','rbf', 'RBF_Sigma',0.1);

put it in a loop that varies your 'BoxConstraint' and 'RBF_Sigma' parameters, and then use crossval to output the f1-score for that iteration's combination of parameters.

You can use a single for-loop exactly as in your libsvm code example (i.e. using meshgrid and 1:numel(); this is probably faster) or a nested for-loop. I'll use a nested loop so that you have both approaches:

C = [0.001, 0.003, 0.01, 0.03, 0.1, 0.3, 1, 3, 10, 30, 100, 300]; %// choose your own set of values for the parameters you want to test, either by explicitly typing out a list...
S = 0:0.1:1;                                                      %// ...or by using the : operator
fscores = zeros(numel(C), numel(S)); %// Pre-allocation
for c = 1:numel(C)   
    for s = 1:numel(S)
        vals = crossval(@(XTRAIN, YTRAIN, XVAL, YVAL) fun(XTRAIN, YTRAIN, XVAL, YVAL, C(c), S(s)), ...
                        input(trIdx,:), target(trIdx));
        fscores(c,s) = mean(vals);
    end
end

%// Then establish the C and S that gave you the best f-score. Don't forget that cbest and sbest are just indexes though!
[cbest, sbest] = find(fscores == max(fscores(:)));
C_final = C(cbest);
S_final = S(sbest);
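
One caveat worth adding (my note, not part of the original answer): if several (C,S) pairs tie for the maximum, find returns index vectors and C_final/S_final stop being scalars; asking find for just the first match avoids that:

[cbest, sbest] = find(fscores == max(fscores(:)), 1);  %// keep only the first maximum
C_final = C(cbest);
S_final = S(sbest);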

Now we just have to define the function fun. The docs have this to say about fun:

fun is a function handle to a function with two inputs, the training subset of X, XTRAIN, and the test subset of X, XTEST, as follows:

testval = fun(XTRAIN,XTEST) Each time it is called, fun should use XTRAIN to fit a model, then return some criterion testval computed on XTEST using that fitted model.

So fun needs to:

  • output a single f-score
  • take as input a training and testing set for X and Y. Note that these are both subsets of your actual training set! Think of them more like a training and validation SUBSET of your training set. Also note that crossval will split these sets up for you!
  • train a classifier on the training subset (using your current C and S parameters from your loop)
  • run your new classifier on the test (or rather validation) subset
  • compute and output a performance metric (in your case you want the f1-score)

You'll notice that fun can't take any extra parameters, which is why I've wrapped it in an anonymous function so that we can pass the current C and S values in (i.e. all that @(...)(fun(...)) stuff above). That's just a trick to "convert" our six-parameter fun into the four-parameter one required by crossval.
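
To make that trick concrete, here is a tiny standalone illustration (my example, with made-up values): the two extra arguments are captured when the anonymous function is created, so crossval only ever sees a four-argument handle.

cval = 0.3;  sval = 0.5;                               %// hypothetical current grid values
h = @(xt, yt, xv, yv) fun(xt, yt, xv, yv, cval, sval); %// 6-arg fun behind a 4-arg handle

With that in place, fun itself looks like: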

function fscore = fun(XTRAIN, YTRAIN, XVAL, YVAL, C, S)

   svmModel = svmtrain(XTRAIN, YTRAIN, ...
       'Autoscale',true, 'Showplot',false, 'Method','ls', ...
      'BoxConstraint', C, 'Kernel_Function','rbf', 'RBF_Sigma', S);

   pred = svmclassify(svmModel, XVAL, 'Showplot',false);

   CP = classperf(YVAL, pred);

   fscore = ... %// You can do this bit the same way you did earlier
end
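
Spelling out that last line the same way as in the earlier code (one possible completion; the answer deliberately leaves it as an exercise):

recallP = CP.Sensitivity;
recallN = CP.Specificity;
precisionP = CP.PositivePredictiveValue;
precisionN = CP.NegativePredictiveValue;
f1P = 2*((precisionP*recallP)/(precisionP + recallP));
f1N = 2*((precisionN*recallN)/(precisionN + recallN));
fscore = (f1P + f1N)/2;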