MATLAB code for support vector machines: does anyone have a MATLAB program for a fuzzy support vector machine (FSVM)? Urgently needed.

How can I implement a support vector machine (SVM) in MATLAB?

MATLAB ships with svmtrain; open its help and work through the example there and it will become clear.

I have sent you the cloud-drive link by private message; please check whether it is what you need.

If you are on MATLAB 7.0 or later, you can open the toolbox sources directly:
>> edit svmtrain
>> edit svmclassify
(Note: svmtrain and svmclassify ship with the Bioinformatics Toolbox; svmpredict belongs to the third-party LIBSVM package rather than to MATLAB itself.)
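The question also asks specifically for a fuzzy SVM (FSVM). Below is a minimal sketch of one common formulation, in which every training point i carries a membership value s_i in (0,1] and the dual box constraint becomes 0 <= alpha_i <= s_i*C, so atypical or noisy points are penalised less. It is only an illustration written against quadprog (Optimization Toolbox) with a linear kernel; the function name fsvmtrain and the variables s and C are ours, not from any toolbox, and how the memberships s are assigned (for example by distance to the class centre) is left to the application.

function model = fsvmtrain(X, y, s, C)
%FSVMTRAIN  Minimal fuzzy SVM trainer (dual form solved with QUADPROG).
%   X : n-by-d training data       y : n-by-1 labels in {-1,+1}
%   s : n-by-1 memberships (0,1]   C : scalar penalty parameter
n   = size(X,1);
K   = X*X';                                 % linear kernel; substitute any kernel matrix
H   = (y*y').*K + sqrt(eps)*eye(n);         % small ridge for numerical stability
f   = -ones(n,1);
Aeq = y'; beq = 0;                          % equality constraint: sum(alpha.*y) = 0
lb  = zeros(n,1);
ub  = s(:)*C;                               % fuzzy box constraint: 0 <= alpha_i <= s_i*C
opts  = optimset('LargeScale','off','Display','off');
alpha = quadprog(H,f,[],[],Aeq,beq,lb,ub,[],opts);
svIndex = find(alpha > sqrt(eps));          % support vectors have nonzero alpha
w    = X'*(alpha.*y);                       % primal weight vector (linear kernel only)
free = svIndex(alpha(svIndex) < ub(svIndex) - sqrt(eps));  % unbounded SVs determine the bias
bias = mean(y(free) - X(free,:)*w);
model = struct('w',w,'Bias',bias,'Alpha',alpha,'svIndex',svIndex);

New points are then labelled with sign(Xtest*model.w + model.Bias).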

function [svm_struct, svIndex] = svmtrain(training, groupnames, varargin)
%SVMTRAIN trains a support vector machine classifier
%
% SVMStruct = SVMTRAIN(TRAINING,GROUP) trains a support vector machine
% classifier using data TRAINING taken from two groups given by GROUP.
% SVMStruct contains information about the trained classifier that is
% used by SVMCLASSIFY for classification. GROUP is a column vector of
% values of the same length as TRAINING that defines two groups. Each
% element of GROUP specifies the group the corresponding row of TRAINING
% belongs to. GROUP can be a numeric vector, a string array, or a cell
% array of strings. SVMTRAIN treats NaNs or empty strings in GROUP as
% missing values and ignores the corresponding rows of TRAINING.
%
% SVMTRAIN(...,'KERNEL_FUNCTION',KFUN) allows you to specify the kernel
% function KFUN used to map the training data into kernel space. The
% default kernel function is the dot product. KFUN can be one of the
% following strings or a function handle:
%
%      'linear'      Linear kernel or dot product
%      'quadratic'   Quadratic kernel
%      'polynomial'  Polynomial kernel (default order 3)
%      'rbf'         Gaussian Radial Basis Function kernel
%      'mlp'         Multilayer Perceptron kernel (default scale 1)
%      function      A kernel function specified using @,
%                    for example @KFUN, or an anonymous function
%
% A kernel function must be of the form
%
% function K = KFUN(U, V)
%
% The returned value, K, is a matrix of size M-by-N, where U and V have M
% and N rows respectively. If KFUN is parameterized, you can use
% anonymous functions to capture the problem-dependent parameters. For
% example, suppose that your kernel function is
%
% function k = kfun(u,v,p1,p2)
% k = tanh(p1*(u*v')+p2);
%
% You can set values for p1 and p2 and then use an anonymous function:
% @(u,v) kfun(u,v,p1,p2).
%
% SVMTRAIN(...,'POLYORDER',ORDER) allows you to specify the order of a
% polynomial kernel. The default order is 3.
%
% SVMTRAIN(...,'MLP_PARAMS',[P1 P2]) allows you to specify the
% parameters of the Multilayer Perceptron (mlp) kernel. The mlp kernel
% requires two parameters, P1 and P2, where K = tanh(P1*U*V' + P2) and P1
% > 0 and P2 < 0. Default values are P1 = 1 and P2 = -1.
%
% SVMTRAIN(...,'METHOD',METHOD) allows you to specify the method used
% to find the separating hyperplane. Options are
%
%      'QP'   Use quadratic programming (requires the Optimization Toolbox)
%      'LS'   Use least-squares method
%
% If you have the Optimization Toolbox, then the QP method is the default
% method. If not, the only available method is LS.
%
% SVMTRAIN(...,'QUADPROG_OPTS',OPTIONS) allows you to pass an OPTIONS
% structure created using OPTIMSET to the QUADPROG function when using
% the 'QP' method. See help optimset for more details.
%
% SVMTRAIN(...,'SHOWPLOT',true), when used with two-dimensional data,
% creates a plot of the grouped data and plots the separating line for
% the classifier.
%
% Example:
% % Load the data and select features for classification
% load fisheriris
% data = [meas(:,1), meas(:,2)];
% % Extract the Setosa class
% groups = ismember(species,'setosa');
% % Randomly select training and test sets
% [train, test] = crossvalind('holdOut',groups);
% cp = classperf(groups);
% % Use a linear support vector machine classifier
% svmStruct = svmtrain(data(train,:),groups(train),'showplot',true);
% classes = svmclassify(svmStruct,data(test,:),'showplot',true);
% % See how well the classifier performed
% classperf(cp,classes,test);
% cp.CorrectRate
%
% See also CLASSIFY, KNNCLASSIFY, QUADPROG, SVMCLASSIFY.

% Copyright 2004 The MathWorks, Inc.
% $Revision: 1.1.12.1 $ $Date: 2004/12/24 20:43:35 $

% References:
% [1] Kecman, V, Learning and Soft Computing,
% MIT Press, Cambridge, MA. 2001.
% [2] Suykens, J.A.K., Van Gestel, T., De Brabanter, J., De Moor, B.,
% Vandewalle, J., Least Squares Support Vector Machines,
% World Scientific, Singapore, 2002.
% [3] Scholkopf, B., Smola, A.J., Learning with Kernels,
% MIT Press, Cambridge, MA. 2002.

%
% SVMTRAIN(...,'KFUNARGS',ARGS) allows you to pass additional
% arguments to kernel functions.

% set defaults

plotflag = false;
qp_opts = [];
kfunargs = {};
setPoly = false; usePoly = false;
setMLP = false; useMLP = false;
if ~isempty(which('quadprog'))
    useQuadprog = true;
else
    useQuadprog = false;
end
% set default kernel function
kfun = @linear_kernel;

% check inputs
if nargin < 2
    error(nargchk(2,Inf,nargin))
end

numoptargs = nargin -2;
optargs = varargin;

% grp2idx sorts a numeric grouping var ascending, and a string grouping
% var by order of first occurrence

[g,groupString] = grp2idx(groupnames);

% check group is a vector -- though char input is special...
if ~isvector(groupnames) && ~ischar(groupnames)
    error('Bioinfo:svmtrain:GroupNotVector',...
        'Group must be a vector.');
end

% make sure that the data is correctly oriented.
if size(groupnames,1) == 1
    groupnames = groupnames';
end
% make sure data is the right size
n = length(groupnames);
if size(training,1) ~= n
    if size(training,2) == n
        training = training';
    else
        error('Bioinfo:svmtrain:DataGroupSizeMismatch',...
            'GROUP and TRAINING must have the same number of rows.')
    end
end

% NaNs are treated as unknown classes and are removed from the training
% data
nans = find(isnan(g));
if length(nans) > 0
    training(nans,:) = [];
    g(nans) = [];
end
ngroups = length(groupString);

if ngroups > 2
    error('Bioinfo:svmtrain:TooManyGroups',...
        'SVMTRAIN only supports classification into two groups.\nGROUP contains %d different groups.',ngroups)
end
% convert to 1, -1.
g = 1 - (2* (g-1));

% handle optional arguments

if numoptargs >= 1
    if rem(numoptargs,2)== 1
        error('Bioinfo:svmtrain:IncorrectNumberOfArguments',...
            'Incorrect number of arguments to %s.',mfilename);
    end
    okargs = {'kernel_function','method','showplot','kfunargs','quadprog_opts','polyorder','mlp_params'};
    for j=1:2:numoptargs
        pname = optargs{j};
        pval = optargs{j+1};
        k = strmatch(lower(pname), okargs);%#ok
        if isempty(k)
            error('Bioinfo:svmtrain:UnknownParameterName',...
                'Unknown parameter name: %s.',pname);
        elseif length(k)>1
            error('Bioinfo:svmtrain:AmbiguousParameterName',...
                'Ambiguous parameter name: %s.',pname);
        else
            switch(k)
                case 1 % kernel_function
                    if ischar(pval)
                        okfuns = {'linear','quadratic',...
                            'radial','rbf','polynomial','mlp'};
                        funNum = strmatch(lower(pval), okfuns);%#ok
                        if isempty(funNum)
                            funNum = 0;
                        end
                        switch funNum %maybe make this less strict in the future
                            case 1
                                kfun = @linear_kernel;
                            case 2
                                kfun = @quadratic_kernel;
                            case {3,4}
                                kfun = @rbf_kernel;
                            case 5
                                kfun = @poly_kernel;
                                usePoly = true;
                            case 6
                                kfun = @mlp_kernel;
                                useMLP = true;
                            otherwise
                                error('Bioinfo:svmtrain:UnknownKernelFunction',...
                                    'Unknown Kernel Function %s.',kfun);
                        end
                    elseif isa (pval, 'function_handle')
                        kfun = pval;
                    else
                        error('Bioinfo:svmtrain:BadKernelFunction',...
                            'The kernel function input does not appear to be a function handle\nor valid function name.')
                    end
                case 2 % method
                    if strncmpi(pval,'qp',2)
                        useQuadprog = true;
                        if isempty(which('quadprog'))
                            warning('Bioinfo:svmtrain:NoOptim',...
                                'The Optimization Toolbox is required to use the quadratic programming method.')
                            useQuadprog = false;
                        end
                    elseif strncmpi(pval,'ls',2)
                        useQuadprog = false;
                    else
                        error('Bioinfo:svmtrain:UnknownMethod',...
                            'Unknown method option %s. Valid methods are ''QP'' and ''LS''',pval);
                    end
                case 3 % display
                    if pval ~= 0
                        if size(training,2) == 2
                            plotflag = true;
                        else
                            warning('Bioinfo:svmtrain:OnlyPlot2D',...
                                'The display option can only plot 2D training data.')
                        end
                    end
                case 4 % kfunargs
                    if iscell(pval)
                        kfunargs = pval;
                    else
                        kfunargs = {pval};
                    end
                case 5 % quadprog_opts
                    if isstruct(pval)
                        qp_opts = pval;
                    elseif iscell(pval)
                        qp_opts = optimset(pval{:});
                    else
                        error('Bioinfo:svmtrain:BadQuadprogOpts',...
                            'QUADPROG_OPTS must be an opts structure.');
                    end
                case 6 % polyorder
                    if ~isscalar(pval) || ~isnumeric(pval)
                        error('Bioinfo:svmtrain:BadPolyOrder',...
                            'POLYORDER must be a scalar value.');
                    end
                    if pval ~=floor(pval) || pval < 1
                        error('Bioinfo:svmtrain:PolyOrderNotInt',...
                            'The order of the polynomial kernel must be a positive integer.')
                    end
                    kfunargs = {pval};
                    setPoly = true;
                case 7 % mlpparams
                    if numel(pval)~=2
                        error('Bioinfo:svmtrain:BadMLPParams',...
                            'MLP_PARAMS must be a two element array.');
                    end
                    if ~isscalar(pval(1)) || ~isscalar(pval(2))
                        error('Bioinfo:svmtrain:MLPParamsNotScalar',...
                            'The parameters of the multi-layer perceptron kernel must be scalar.');
                    end
                    kfunargs = {pval(1),pval(2)};
                    setMLP = true;
            end
        end
    end
end
if setPoly && ~usePoly
    warning('Bioinfo:svmtrain:PolyOrderNotPolyKernel',...
        'You specified a polynomial order but not a polynomial kernel');
end
if setMLP && ~useMLP
    warning('Bioinfo:svmtrain:MLPParamNotMLPKernel',...
        'You specified MLP parameters but not an MLP kernel');
end
% plot the data if requested
if plotflag
    [hAxis,hLines] = svmplotdata(training,g);
    legend(hLines,cellstr(groupString));
end

% calculate kernel function
try
    kx = feval(kfun,training,training,kfunargs{:});
    % ensure function is symmetric
    kx = (kx+kx')/2;
catch
    error('Bioinfo:svmtrain:UnknownKernelFunction',...
        'Error calculating the kernel function:\n%s\n', lasterr);
end
% create Hessian
% add small constant eye to force stability
H =((g*g').*kx) + sqrt(eps(class(training)))*eye(n);

if useQuadprog
    % The large scale solver cannot handle this type of problem, so turn it
    % off.
    qp_opts = optimset(qp_opts,'LargeScale','Off');
    % X=QUADPROG(H,f,A,b,Aeq,beq,LB,UB,X0,opts)
    alpha = quadprog(H,-ones(n,1),[],[],...
        g',0,zeros(n,1),inf *ones(n,1),zeros(n,1),qp_opts);

    % The support vectors are the non-zeros of alpha
    svIndex = find(alpha > sqrt(eps));
    sv = training(svIndex,:);

    % calculate the parameters of the separating line from the support
    % vectors.
    alphaHat = g(svIndex).*alpha(svIndex);

    % Calculate the bias by applying the indicator function to the support
    % vector with largest alpha.
    [maxAlpha,maxPos] = max(alpha); %#ok
    bias = g(maxPos) - sum(alphaHat.*kx(svIndex,maxPos));
    % an alternative method is to average the values over all support vectors
    % bias = mean(g(sv)' - sum(alphaHat(:,ones(1,numSVs)).*kx(sv,sv)));

    % An alternative way to calculate support vectors is to look for zeros of
    % the Lagrangians (fifth output from QUADPROG).
    %
    % [alpha,fval,output,exitflag,t] = quadprog(H,-ones(n,1),[],[],...
    %     g',0,zeros(n,1),inf *ones(n,1),zeros(n,1),opts);
    %
    % sv = t.lower < sqrt(eps) & t.upper < sqrt(eps);
else % Least-Squares
    % now build up compound matrix for solver
    A = [0 g';g,H];
    b = [0;ones(size(g))];
    x = A\b;

    % calculate the parameters of the separating line from the support
    % vectors.
    sv = training;
    bias = x(1);
    alphaHat = g.*x(2:end);
end

svm_struct.SupportVectors = sv;
svm_struct.Alpha = alphaHat;
svm_struct.Bias = bias;
svm_struct.KernelFunction = kfun;
svm_struct.KernelFunctionArgs = kfunargs;
svm_struct.GroupNames = groupnames;
svm_struct.FigureHandles = [];
if plotflag
    hSV = svmplotsvs(hAxis,svm_struct);
    svm_struct.FigureHandles = {hAxis,hLines,hSV};
end
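
As a usage illustration of the options the code above parses (a kernel_function passed as an anonymous handle, plus the 'LS' method and 'showplot'), here is a short sketch modelled on the fisheriris example from the help text. It assumes the Bioinformatics Toolbox functions svmtrain, svmclassify, crossvalind and classperf are on the path; the sigmoid parameters p1, p2 and the variable names are illustrative, not prescribed by the toolbox.

% train with a parameterized custom kernel, captured in an anonymous function
load fisheriris
data   = [meas(:,1), meas(:,2)];
groups = ismember(species,'setosa');
[train, test] = crossvalind('holdOut', groups);
cp = classperf(groups);

p1 = 1; p2 = -1;                              % example kernel parameters
sigKernel = @(u,v) tanh(p1*(u*v') + p2);      % K = tanh(p1*U*V' + p2)

svmStruct = svmtrain(data(train,:), groups(train), ...
    'kernel_function', sigKernel, 'method', 'LS', 'showplot', true);
classes = svmclassify(svmStruct, data(test,:), 'showplot', true);
classperf(cp, classes, test);                 % evaluate on the held-out set
cp.CorrectRate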
