📝 Personal homepage: 研学社's Blog
💥💥💞💞 Welcome to this blog ❤️❤️💥💥
🏆 Blogger's strength: 🌞🌞🌞 the posts strive to be rigorous and clearly organized, for the reader's convenience.
⛳️ Motto: on a hundred-li journey, ninety li is only the halfway mark.
In recent years, Suykens et al. proposed a new SVM variant, the least-squares support vector machine (LSSVM). It recasts the SVM learning problem as the solution of a system of linear equations, greatly reducing the computational cost of solving the constrained convex quadratic program in a standard SVM. Moreover, LSSVM's numerical stability and capacity-control strategy allow it to perform well even when the kernel matrix is not positive definite.
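The linear system that LSSVM solves in place of the QP can be written down in a few lines. Below is a minimal NumPy sketch for binary classification; the RBF kernel and the `gamma`/`sigma` values are illustrative choices, not the tuned parameters that `cross_lssvm` produces in the code further down.

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    # Gaussian kernel matrix from pairwise squared distances
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-d2 / (2 * sigma**2))

def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    # LSSVM classification (Suykens): instead of a constrained QP, solve
    #   [ 0        y^T           ] [ b     ]   [ 0 ]
    #   [ y   Omega + I / gamma  ] [ alpha ] = [ 1 ]
    # with Omega_ij = y_i * y_j * K(x_i, x_j).
    n = len(y)
    Omega = (y[:, None] * y[None, :]) * rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return {'b': sol[0], 'alpha': sol[1:], 'X': X, 'y': y, 'sigma': sigma}

def lssvm_predict(model, Xq):
    # decision function: sign( sum_k alpha_k y_k K(x, x_k) + b )
    K = rbf_kernel(Xq, model['X'], model['sigma'])
    return np.sign(K @ (model['alpha'] * model['y']) + model['b'])
```

Because the whole fit is one `np.linalg.solve` call on an (n+1)×(n+1) system, there is no iterative QP solver involved, which is exactly the computational saving the paragraph above describes.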
Partial code:
%% dataset loop
for i = 1:length(dataset_names)
    dataset_name = dataset_names{i};
    load(sprintf('../dataset/classification/%s.mat', dataset_name))
    %% experiment loop
    for j = 1:n_repetition
        %% load/shuffle/divide/normalize dataset
        data = dataset(i);
        % all pairs of attributes, so each 2-D projection can be plotted
        combinations = combnk(1:size(data.x_train, 2), 2);
        for k = 1:size(combinations, 1)
            % get patterns with two attributes
            x_train = data.x_train(:, combinations(k, :));
            %% train
            % convert one-hot targets to {-1, +1} labels
            [~, y_train_n] = max(data.y_train, [], 2);
            y_train_n(y_train_n == 2) = -1;
            params = cross_lssvm(x_train, y_train_n, params); % tune hyperparameters
            model = lssvm_train(x_train, y_train_n, params);
            %% plot decision surface
            x_min = min(x_train);
            x_max = max(x_train);
            [x, y] = meshgrid(linspace(x_min(1), x_max(1)), linspace(x_min(2), x_max(2)));
            image_size = size(x);
            xy = [x(:) y(:)];
            y_hat = lssvm_predict(xy, model);
            decisionmap = reshape(y_hat, image_size);
            figure,
            imagesc([x_min(1) x_max(1)], [x_min(2) x_max(2)], decisionmap);
            hold on;
            set(gca, 'ydir', 'normal');
            cmap = [1 0.8 0.8; 0.95 1 0.95; 0.9 0.9 1];
            colormap(cmap);
            % overlay the training patterns, colored by class
            [~, y_class] = max(data.y_train, [], 2);
            h2 = plot(x_train(y_class == 2, 1), x_train(y_class == 2, 2), 'r*');
            h1 = plot(x_train(y_class == 1, 1), x_train(y_class == 1, 2), 'b*');
            legend([h1 h2], {'class 1', 'class 2'});
            title(upper(dataset_names{i}));
            xlabel(sprintf('feature %d', combinations(k, 1)));
            ylabel(sprintf('feature %d', combinations(k, 2)));
            hold off;
        end
    end
end
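For readers without MATLAB, the decision-surface step above (meshgrid over the bounding box, predict on the flattened grid, reshape back to an image) translates directly to NumPy. A minimal sketch; `predict_fn` stands in for any trained model's prediction function, such as the LSSVM predictor:

```python
import numpy as np

def decision_grid(x_train, predict_fn, resolution=100):
    # Build an evaluation grid over the bounding box of the 2-D training
    # data, exactly as the MATLAB meshgrid/linspace code does.
    x_min, x_max = x_train.min(0), x_train.max(0)
    xs = np.linspace(x_min[0], x_max[0], resolution)
    ys = np.linspace(x_min[1], x_max[1], resolution)
    gx, gy = np.meshgrid(xs, ys)
    # flatten the grid into an (resolution^2, 2) matrix of query points
    xy = np.column_stack([gx.ravel(), gy.ravel()])
    y_hat = predict_fn(xy)
    # reshape predictions back into an image for imshow/contourf
    return gx, gy, y_hat.reshape(gx.shape)
```

With matplotlib, `plt.contourf(gx, gy, decisionmap)` plays the role of `imagesc` plus `colormap` in the MATLAB listing.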
Some of the theory is drawn from online sources; in case of infringement, please contact me for removal.