% PLATT, J.C. (1998). Fast training of support vector machines using
% sequential minimal optimization. In Schölkopf, B., Burges, C., and
% Smola, A.J., editors, Advances in Kernel Methods: Support Vector
% Learning, chapter 12, pages 185-208. MIT Press, Cambridge, Massachusetts.
%
% History: May 15/2001 - v1.00

if size(Y, 2) ~= 1 | ~isreal(Y)
   error('y must be a real double precision column vector');
end
n = size(Y, 1);
if n ~= size(X, 1)
   error('x and y must have the same number of rows');
end
if (nargin > 7) % check correct number of arguments
   help svc
   return;
end
if nargin >= 4 & isa(C, 'svc')
   net    = C;
   C      = get(net, 'C');
   kernel = get(net, 'kernel');
elseif nargin < 4
   C = Inf;
end
if nargin < 5, kernel = 'linear'; end
NOBIAS = 0;
switch nargin
case 5
   if ischar(kernel) & strcmp(kernel, 'nobias'), NOBIAS = 1; end
case 6
   if ischar(alpha_init) & strcmp(alpha_init, 'nobias'), NOBIAS = 1; end
case 7
   if ischar(bias_init) & strcmp(bias_init, 'nobias'), NOBIAS = 1; end
end
if nargin == 7
   if n ~= size(alpha_init, 1)
      error('alpha must be a real double precision column vector with the same size as y');
   end
   if any(alpha_init < 0)
      error('alpha must be non-negative');
   end
end

% main SMO loop: alternate between passes over all points and passes
% over the non-bound points until no multiplier changes
while (numChanged > 0) | examineAll
   numChanged = 0;
   if examineAll
      % loop over all the points
      for i = 1:ntp
         numChanged = numChanged + examineExample(i);
      end
   else
      % loop over the KKT points:
      % only the points that violate the KKT conditions
      for i = 1:ntp
         if (SMO.alpha(i) > SMO.epsilon) & (SMO.alpha(i) < (SMO.C - SMO.epsilon))
            numChanged = numChanged + examineExample(i);
         end
      end
   end
   if (examineAll == 1)
      examineAll = 0;
   elseif (numChanged == 0)
      examineAll = 1;
   end
   epoch = epoch + 1;
end

function RESULT = nonBoundLagrangeMultipliers;
% number of Lagrange multipliers strictly between the bounds 0 and C
RESULT = sum((SMO.alpha > SMO.epsilon) & (SMO.alpha < (SMO.C - SMO.epsilon)));

function RESULT = examineExample(i2)
% check whether point i2 violates the KKT conditions; if it does,
% choose a second point and try to optimise the pair
y2     = SMO.y(i2);
alpha2 = SMO.alpha(i2);
if (alpha2 > SMO.epsilon) & (alpha2 < (SMO.C - SMO.epsilon))
   e2 = SMO.error(i2);
else
   e2 = fwd(i2) - y2;
end
% r2 < 0 iff x2 is misclassified; r2 = f2*y2 - 1
r2 = e2*y2;
% KKT conditions:
%   r2 > 0  and alpha2 == 0     (well classified)
%   r2 == 0 and 0 < alpha2 < C  (support vectors at margins)
%   r2 < 0  and alpha2 == C     (support vectors between margins)
%
% Test the KKT conditions for the current i2 point. If a point is well
% classified its alpha must be 0; if it is out of its margin its alpha
% must be C; if it is at the margin its alpha must be between 0 and C.
%
% take action only if i2 violates the Karush-Kuhn-Tucker conditions
if ((r2 < -SMO.tolerance) & (alpha2 < (SMO.C - SMO.epsilon))) | ...
   ((r2 > SMO.tolerance) & (alpha2 > SMO.epsilon))
   % If it doesn't violate the KKT conditions then exit, otherwise continue.
   % Try i2 in three ways; if successful, then immediately return 1
   RESULT = 1;
   % First the routine tries to find an i1 Lagrange multiplier that
   % maximizes the measure |E1-E2|: the larger this value is, the bigger
   % the increase in the dual objective function becomes.
   % In this first test, only support vectors will be tested.
   POS = find((SMO.alpha > SMO.epsilon) & (SMO.alpha < (SMO.C - SMO.epsilon)));
   [MAX, i1] = max(abs(e2 - SMO.error(POS)));
   if ~isempty(i1) & takeStep(i1, i2, e2)
      return;
   end
end
% no progress possible
RESULT = 0;

function RESULT = takeStep(i1, i2, e2)
% for a pair of alpha indexes, verify if it is possible to execute
% the optimisation described by Platt
RESULT = 0;
if (i1 == i2), return; end
% compute upper and lower constraints, L and H, on multiplier a2
alpha1 = SMO.alpha(i1);
alpha2 = SMO.alpha(i2);
x1 = SMO.x(i1); x2 = SMO.x(i2);
y1 = SMO.y(i1); y2 = SMO.y(i2);
C = SMO.C; K = SMO.Kcache;
s = y1*y2;
if (y1 ~= y2)
   L = max(0, alpha2-alpha1); H = min(C, alpha2-alpha1+C);
else
   L = max(0, alpha1+alpha2-C); H = min(C, alpha1+alpha2);
end
if (L == H), return; end
if (alpha1 > SMO.epsilon) & (alpha1 < (C-SMO.epsilon))
   e1 = SMO.error(i1);
else
   e1 = fwd(i1) - y1;
end
% if (alpha2 > SMO.epsilon) & (alpha2 < (C-SMO.epsilon))
%    e2 = SMO.error(i2);
% else
%    e2 = fwd(i2) - y2;
% end
% compute eta
k11 = K(i1,i1); k12 = K(i1,i2); k22 = K(i2,i2);
eta = 2.0*k12 - k11 - k22;
% recompute Lagrange multiplier for pattern i2
if (eta < 0.0)
   a2 = alpha2 - y2*(e1 - e2)/eta;
   % constrain a2 to lie between L and H
   if (a2 < L)
      a2 = L;
   elseif (a2 > H)
      a2 = H;
   end
else
   % When eta is not negative, the objective function W should be
   % evaluated at each end of the line segment. Only those terms in the
   % objective function that depend on alpha2 need be evaluated.
   ind = find(SMO.alpha > 0);
   aa2 = L; aa1 = alpha1 + s*(alpha2-aa2);
   Lobj = aa1 + aa2 + sum((-y1*aa1/2).*SMO.y(ind).*K(ind,i1) + (-y2*aa2/2).*SMO.y(ind).*K(ind,i2));
   aa2 = H; aa1 = alpha1 + s*(alpha2-aa2);
   Hobj = aa1 + aa2 + sum((-y1*aa1/2).*SMO.y(ind).*K(ind,i1) + (-y2*aa2/2).*SMO.y(ind).*K(ind,i2));
   if (Lobj > Hobj + SMO.epsilon)
      a2 = L;
   elseif (Lobj < Hobj - SMO.epsilon)
      a2 = H;
   else
      a2 = alpha2;
   end
end
if (abs(a2-alpha2) < SMO.epsilon*(a2+alpha2+SMO.epsilon))
   return;
end
% recompute a1 and the candidate thresholds b1 and b2
a1 = alpha1 + s*(alpha2-a2);
w1 = y1*(a1 - alpha1);
w2 = y2*(a2 - alpha2);
b1 = SMO.bias + e1 + w1*k11 + w2*k12;
b2 = SMO.bias + e2 + w1*k12 + w2*k22;
bold = SMO.bias;
if (a1 > SMO.epsilon) & (a1 < (C-SMO.epsilon))
   SMO.bias = b1;
elseif (a2 > SMO.epsilon) & (a2 < (C-SMO.epsilon))
   SMO.bias = b2;
else
   SMO.bias = (b1 + b2)/2;
end
% update error cache using new Lagrange multipliers
SMO.error = SMO.error + w1*K(:,i1) + w2*K(:,i2) + bold - SMO.bias;
SMO.error(i1) = 0.0; SMO.error(i2) = 0.0;
% update vector of Lagrange multipliers
SMO.alpha(i1) = a1; SMO.alpha(i2) = a2;
RESULT = 1;
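% The KKT selection rule used in examineExample can be stated compactly
% outside MATLAB. The Python sketch below is not part of this toolbox;
% the name violates_kkt and the defaults tol/eps are chosen here for
% illustration. It expresses the same test, where r2 = y2*f(x2) - 1.

```python
def violates_kkt(alpha2, r2, C, tol=1e-3, eps=1e-12):
    """KKT violation test for one point (illustrative sketch).

    r2 = y2*f(x2) - 1. A point needs optimising when it should move
    off its current bound:
      r2 < -tol while alpha2 < C  -> margin violated, alpha2 may grow
      r2 >  tol while alpha2 > 0  -> slack alpha, alpha2 may shrink
    """
    return (r2 < -tol and alpha2 < C - eps) or (r2 > tol and alpha2 > eps)
```

% Note that a well-classified non-support vector (alpha2 = 0, r2 > 0) is
% skipped, while a margin violator with room to grow (alpha2 < C, r2 < -tol)
% or a slack multiplier that should shrink (alpha2 > 0, r2 > tol) is selected.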
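% The analytic pair step that takeStep performs (bounds L and H, the eta
% test, and the clipped a2) can also be sketched as a self-contained
% function. This Python sketch is an illustration, not the toolbox code:
% smo_pair_update is a hypothetical name, errors are e_i = f(x_i) - y_i,
% and the degenerate eta >= 0 endpoint evaluation handled by the MATLAB
% code above is simply skipped here.

```python
def smo_pair_update(alpha1, alpha2, y1, y2, e1, e2, k11, k12, k22, C):
    """One SMO pair step (sketch): new (a1, a2), or None if no progress."""
    s = y1 * y2
    # feasible segment [L, H] for a2 on the constraint line
    if y1 != y2:
        L, H = max(0.0, alpha2 - alpha1), min(C, C + alpha2 - alpha1)
    else:
        L, H = max(0.0, alpha1 + alpha2 - C), min(C, alpha1 + alpha2)
    if L == H:
        return None
    eta = 2.0 * k12 - k11 - k22  # negative for a strictly convex subproblem
    if eta >= 0.0:
        return None              # degenerate case: handled separately above
    a2 = alpha2 - y2 * (e1 - e2) / eta
    a2 = min(max(a2, L), H)      # clip the unconstrained optimum to [L, H]
    a1 = alpha1 + s * (alpha2 - a2)
    return a1, a2
```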
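% The threshold update at the end of takeStep follows Platt's b1/b2 rule:
% b1 is exact when the new a1 is non-bound, b2 when a2 is, and the
% midpoint is taken when both sit at a bound. A hedged Python sketch
% (update_bias is a hypothetical name; sign conventions assume
% e_i = f(x_i) - y_i as above):

```python
def update_bias(b_old, e1, e2, y1, y2, a1_new, a1_old, a2_new, a2_old,
                k11, k12, k22, C, eps=1e-12):
    """Threshold update after a pair step (illustrative sketch)."""
    w1 = y1 * (a1_new - a1_old)
    w2 = y2 * (a2_new - a2_old)
    b1 = b_old + e1 + w1 * k11 + w2 * k12
    b2 = b_old + e2 + w1 * k12 + w2 * k22
    if eps < a1_new < C - eps:
        return b1                # exact: a1 is strictly between the bounds
    if eps < a2_new < C - eps:
        return b2                # exact: a2 is strictly between the bounds
    return (b1 + b2) / 2.0       # both at bounds: any b in [b1, b2] works
```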