
matlab - libsvm: Evaluating an SVM with leave-one-out cross-validation


I am trying to use libsvm with MATLAB to evaluate a one-vs-all SVM. The only problem is that my dataset is not big enough to warrant setting aside a dedicated test set, so I would like to evaluate my classifier using leave-one-out.

I am not particularly experienced with SVMs, so forgive me if I am a little confused about what to do. I need to generate precision-recall curves and a confusion matrix for my classifier, but I don't know where to start.

I have given it a try and come up with the following as a rough start for the leave-one-out training, but I am not sure how to do the evaluation.

function model = do_leave_one_out(labels, data)
% data: one sample per row; labels: column vector of class labels
acc = [];
bestC = [];
bestG = [];
for ii = 1:size(data, 1)
    % Training data for this iteration: every row except the ii-th
    trainData = data;
    trainData(ii, :) = [];
    looLabel = labels(ii);
    trainLabels = labels;
    trainLabels(ii) = [];

    % Do grid search to find the best parameters?

    acc(ii) = bestReportedAccuracy;
    bestC(ii) = bestValueForC;
    bestG(ii) = bestValueForG;
end
% After this I am not sure how to train and evaluate the final model
end

Best answer

I will try to provide a few building blocks that may interest you and that you can fold into your own function. I hope they help.

Leave-one-out:

scrambledList = randperm(totalNumberOfData);
trainingData = Data(scrambledList(1:end-1),:);
trainingLabel = Label(scrambledList(1:end-1));
testData = Data(scrambledList(end),:);
testLabel = Label(scrambledList(end));
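
The snippet above holds out one randomly chosen sample. For a proper leave-one-out evaluation you repeat this N times so that every sample is held out exactly once; a minimal sketch of that outer loop, assuming Data holds one sample per row and Label is a column vector:

N = size(Data, 1);
predictedLabels = zeros(N, 1);          % one prediction per held-out sample
for ii = 1:N
    trainIdx = [1:ii-1, ii+1:N];        % every index except the ii-th
    trainingData  = Data(trainIdx, :);
    trainingLabel = Label(trainIdx);
    testData  = Data(ii, :);
    testLabel = Label(ii);
    % ... grid search, training and prediction for this fold go here,
    % storing the fold's prediction in predictedLabels(ii) ...
end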

Grid search (for the two-class case):

acc = 0;
for log2c = -1:3,
    for log2g = -4:1,
        % with -v, libsvm's svmtrain returns the 5-fold cross-validation accuracy
        cmd = ['-v 5 -c ', num2str(2^log2c), ' -g ', num2str(2^log2g)];
        cv = svmtrain(trainingLabel, trainingData, cmd);
        if (cv >= acc),
            acc = cv; bestC = 2^log2c; bestG = 2^log2g;
        end
    end
end
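
Once bestC and bestG have been found, you can train on the full training fold with those parameters and predict the held-out sample, which covers the "how to train and evaluate the final model" part for the two-class case; a minimal sketch using the libsvm MATLAB interface:

% retrain on the whole training fold with the parameters from the grid search
model = svmtrain(trainingLabel, trainingData, ...
    ['-c ', num2str(bestC), ' -g ', num2str(bestG)]);

% predict the single held-out sample of this leave-one-out fold
[predictedLabel, ~, ~] = svmpredict(testLabel, testData, model);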

One-against-all (for the multi-class case):

model = cell(NumofClass,1);
for k = 1:NumofClass
    model{k} = svmtrain(double(trainingLabel==k), trainingData, '-c 1 -g 0.2 -b 1');
end

%% calculate the probability of different labels

pr = zeros(1,NumofClass);
for k = 1:NumofClass
    [~,~,p] = svmpredict(double(testLabel==k), testData, model{k}, '-b 1');
    pr(:,k) = p(:,model{k}.Label==1); % probability of class==k
end

%% your label prediction will be the one with highest probability:

[~,predictedLabel] = max(pr,[],2);
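
To get the confusion matrix and precision/recall figures asked about in the question, collect the predicted label from every leave-one-out fold together with the true labels (predictedLabels and trueLabels below are accumulator vectors I am assuming, one entry per fold) and compare them once all folds are done; a minimal sketch using confusionmat from the Statistics and Machine Learning Toolbox:

% rows of C are true classes, columns are predicted classes
C = confusionmat(trueLabels, predictedLabels);

% per-class precision and recall read straight off the confusion matrix
precision = diag(C) ./ sum(C, 1)';   % TP ./ (TP + FP), per predicted class
recall    = diag(C) ./ sum(C, 2);    % TP ./ (TP + FN), per true class

For a precision-recall curve for class k, you would additionally keep the probability pr(k) from every fold and sweep a decision threshold over those scores (perfcurve with 'XCrit','reca' and 'YCrit','prec' can do this if the toolbox is available).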

Regarding matlab - libsvm: evaluating an SVM with leave-one-out, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/21599448/
