
SMO Algorithm Pseudocode (MATLAB)
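The MATLAB listing below follows Platt (1998). As a schematic overview, the outer loop of SMO — alternating one full sweep over all points with sweeps over the non-bound multipliers until nothing changes — can be sketched in Python. This is a minimal sketch, not part of the listing; `examine_example`, `alpha`, `C`, and `eps` are placeholder names for the quantities the listing calls `examineExample`, `SMO.alpha`, `SMO.C`, and `SMO.epsilon`.

```python
def smo_outer_loop(n, examine_example, alpha, C, eps=1e-8):
    """Sketch of SMO's outer loop: alternate a full pass over all n
    points with passes over the non-bound multipliers (0 < alpha < C)
    until a full pass changes no multiplier."""
    examine_all = True
    num_changed = 0
    epochs = 0
    while num_changed > 0 or examine_all:
        num_changed = 0
        if examine_all:
            # loop over all points
            for i in range(n):
                num_changed += examine_example(i)
        else:
            # loop only over points that can violate KKT: 0 < alpha_i < C
            for i in range(n):
                if eps < alpha[i] < C - eps:
                    num_changed += examine_example(i)
        if examine_all:
            examine_all = False
        elif num_changed == 0:
            examine_all = True
        epochs += 1
    return epochs
```

On convergence every multiplier satisfies the KKT conditions to within the tolerance, which is exactly the termination test Platt proves sufficient for optimality of the dual.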

% PLATT, J.C. (1998). Fast training of support vector machines using
% sequential minimal optimization. In Schölkopf, B., Burges, C., and
% Smola, A.J., editors, Advances in Kernel Methods: Support Vector
% Learning, chapter 12, pages 185-208. MIT Press, Cambridge,
% Massachusetts.
% History: May 15/2001 - v1.00

if size(Y, 2) ~= 1 | ~isreal(Y)
    error('y must be a real double precision column vector');
end
n = size(Y, 1);
if n ~= size(X, 1)
    error('x and y must have the same number of rows');
end
if (nargin > 7)                     % check correct number of arguments
    help svc
    return;
end
if nargin >= 4 & isa(C, 'svc')
    net = C;
    C = get(net, 'C');
    kernel = get(net, 'kernel');
else
    if nargin < 4, C = Inf; end
    if nargin < 5, kernel = 'linear'; end
end
NOBIAS = 0;
switch nargin
    case 5
        if ischar(kernel) & strcmp(kernel, 'nobias'), NOBIAS = 1; end
    case 6
        if ischar(alpha_init) & strcmp(alpha_init, 'nobias'), NOBIAS = 1; end
    case 7
        if ischar(bias_init) & strcmp(bias_init, 'nobias'), NOBIAS = 1; end
end
if nargin == 7
    if n ~= size(alpha_init, 1)
        error('alpha must be a real double precision column vector with the same size as y');
    end
    if any(alpha_init < 0)
        error('alpha must be nonnegative');
    end
end

% main loop: alternate one sweep over all points with sweeps over the
% non-bound points until no Lagrange multiplier changes
numChanged = 0;
examineAll = 1;
epoch = 0;
while (numChanged > 0) | examineAll
    numChanged = 0;
    if examineAll
        % loop over all points
        for i = 1:ntp
            numChanged = numChanged + examineExample(i);
        end
    else
        % loop over KKT points: only the points that can violate the
        % KKT conditions, i.e. the non-bound points 0 < alpha(i) < C
        for i = 1:ntp
            if (SMO.alpha(i) > SMO.epsilon) & (SMO.alpha(i) < (SMO.C - SMO.epsilon))
                numChanged = numChanged + examineExample(i);
            end
        end
    end
    if (examineAll == 1)
        examineAll = 0;
    elseif (numChanged == 0)
        examineAll = 1;
    end
    epoch = epoch + 1;
end

function RESULT = nonBoundLagrangeMultipliers;
RESULT = sum((SMO.alpha > SMO.epsilon) & (SMO.alpha < (SMO.C - SMO.epsilon)));

function RESULT = examineExample(i2)
alpha2 = SMO.alpha(i2);
y2 = SMO.y(i2);
if (alpha2 > SMO.epsilon) & (alpha2 < (SMO.C - SMO.epsilon))
    e2 = SMO.error(i2);
else
    e2 = fwd(i2) - y2;
end
% r2 = f2*y2 - 1
r2 = e2*y2;
% KKT conditions:
%   r2 > 0  and alpha2 == 0      (well classified)
%   r2 == 0 and 0 < alpha2 < C   (support vectors at margins)
%   r2 < 0  and alpha2 == C      (support vectors between margins)
% Test the KKT conditions for the current i2 point: if a point is well
% classified its alpha must be 0; if it is out of its margin its alpha
% must be C; if it is at the margin its alpha must be between 0 and C.
% Take action only if i2 violates the Karush-Kuhn-Tucker conditions.
if ((r2 < -SMO.tolerance) & (alpha2 < (SMO.C - SMO.epsilon))) | ...
   ((r2 > SMO.tolerance) & (alpha2 > SMO.epsilon))
    % If it doesn't violate the KKT conditions then exit, otherwise
    % continue. Try i2 by three ways; if successful, immediately return 1.
    RESULT = 1;
    % First the routine tries to find an i1 Lagrange multiplier that
    % maximizes the measure |E1 - E2|: the larger this value is, the
    % bigger the dual objective function becomes. In this first test,
    % only non-bound support vectors are tested.
    POS = find((SMO.alpha > SMO.epsilon) & (SMO.alpha < (SMO.C - SMO.epsilon)));
    [MAX, i1] = max(abs(e2 - SMO.error(POS)));
    if ~isempty(i1)
        if takeStep(POS(i1), i2, e2), return; end
    end
    % Platt's second and third heuristics follow: iterate over the
    % non-bound points, then over all points, each from a random start.
end
% no progress possible
RESULT = 0;

function RESULT = takeStep(i1, i2, e2)
% For a pair of alpha indexes, verify if it is possible to execute
% the optimisation described by Platt.
RESULT = 0;
if (i1 == i2), return; end
% compute upper and lower constraints, L and H, on multiplier a2
alpha1 = SMO.alpha(i1); alpha2 = SMO.alpha(i2);
x1 = SMO.x(i1); x2 = SMO.x(i2);
y1 = SMO.y(i1); y2 = SMO.y(i2);
C = SMO.C; K = SMO.Kcache;
s = y1*y2;
if (y1 ~= y2)
    L = max(0, alpha2 - alpha1);     H = min(C, alpha2 - alpha1 + C);
else
    L = max(0, alpha1 + alpha2 - C); H = min(C, alpha1 + alpha2);
end
if (L == H), return; end
if (alpha1 > SMO.epsilon) & (alpha1 < (SMO.C - SMO.epsilon))
    e1 = SMO.error(i1);
else
    e1 = fwd(i1) - y1;
end
% if (alpha2 > SMO.epsilon) & (alpha2 < (SMO.C - SMO.epsilon))
%     e2 = SMO.error(i2);
% else
%     e2 = fwd(i2) - y2;
% end
% compute eta, the second derivative along the constraint line
k11 = K(i1,i1); k12 = K(i1,i2); k22 = K(i2,i2);
eta = 2.0*k12 - k11 - k22;
% recompute Lagrange multiplier for pattern i2
if (eta < 0.0)
    a2 = alpha2 - y2*(e1 - e2)/eta;
    % constrain a2 to lie between L and H
    if (a2 < L)
        a2 = L;
    elseif (a2 > H)
        a2 = H;
    end
else
    % When eta is not negative, the objective function W should be
    % evaluated at each end of the line segment. Only those terms in
    % the objective function that depend on alpha2 need be evaluated.
    ind = find(SMO.alpha > 0);
    aa2 = L; aa1 = alpha1 + s*(alpha2 - aa2);
    Lobj = aa1 + aa2 + sum((-y1*aa1/2).*SMO.y(ind).*K(ind,i1) + (-y2*aa2/2).*SMO.y(ind).*K(ind,i2));
    aa2 = H; aa1 = alpha1 + s*(alpha2 - aa2);
    Hobj = aa1 + aa2 + sum((-y1*aa1/2).*SMO.y(ind).*K(ind,i1) + (-y2*aa2/2).*SMO.y(ind).*K(ind,i2));
    if (Lobj > Hobj + SMO.epsilon)
        a2 = L;
    elseif (Lobj < Hobj - SMO.epsilon)
        a2 = H;
    else
        a2 = alpha2;
    end
end
if (abs(a2 - alpha2) < SMO.epsilon*(a2 + alpha2 + SMO.epsilon)), return; end
% compute the new alpha1 so that y1*a1 + y2*a2 is preserved
a1 = alpha1 + s*(alpha2 - a2);
w1 = y1*(a1 - alpha1); w2 = y2*(a2 - alpha2);
% update the threshold (bias) to be consistent with the new multipliers
bold = SMO.bias;
b1 = e1 + w1*k11 + w2*k12 + bold;
b2 = e2 + w1*k12 + w2*k22 + bold;
if (a1 > SMO.epsilon) & (a1 < (SMO.C - SMO.epsilon))
    SMO.bias = b1;
elseif (a2 > SMO.epsilon) & (a2 < (SMO.C - SMO.epsilon))
    SMO.bias = b2;
else
    SMO.bias = (b1 + b2)/2;
end
% update error cache using new Lagrange multipliers
SMO.error = SMO.error + w1*K(:,i1) + w2*K(:,i2) + bold - SMO.bias;
SMO.error(i1) = 0.0; SMO.error(i2) = 0.0;
% update vector of Lagrange multipliers
SMO.alpha(i1) = a1; SMO.alpha(i2) = a2;
RESULT = 1;
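The core of takeStep is the box-constrained one-dimensional Newton step on alpha2. As a numeric cross-check, the bounds L and H, the second derivative eta, and the clipped update can be written in Python. This is a simplified sketch under stated assumptions: it omits the error cache, the bias update, and the eta >= 0 fallback, and the function name and signature are illustrative, not from the listing.

```python
def clipped_alpha2_update(alpha1, alpha2, y1, y2, e1, e2,
                          k11, k12, k22, C):
    """Clipped SMO update for the pair (alpha1, alpha2).

    Returns (a1, a2), or None when no progress is possible
    (L == H, or eta >= 0, where the full algorithm would instead
    evaluate the objective at both ends of the segment)."""
    if y1 != y2:
        L, H = max(0.0, alpha2 - alpha1), min(C, alpha2 - alpha1 + C)
    else:
        L, H = max(0.0, alpha1 + alpha2 - C), min(C, alpha1 + alpha2)
    if L == H:
        return None
    # eta = 2*k12 - k11 - k22: second derivative along the constraint line
    eta = 2.0 * k12 - k11 - k22
    if eta >= 0.0:
        return None
    a2 = alpha2 - y2 * (e1 - e2) / eta    # unconstrained minimizer
    a2 = min(max(a2, L), H)               # clip to the box [L, H]
    a1 = alpha1 + y1 * y2 * (alpha2 - a2)  # keep y1*a1 + y2*a2 constant
    return a1, a2
```

Note that the update moves both multipliers along the line y1*a1 + y2*a2 = const, which is what makes a two-point working set the smallest one that can preserve the equality constraint of the dual.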

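The KKT violation test in examineExample can likewise be sketched in Python. The helper below mirrors the r2 = e2*y2 test above; its name and the `tol`/`eps` parameters (playing the roles of SMO.tolerance and SMO.epsilon) are illustrative assumptions.

```python
def kkt_violated(alpha2, r2, C, tol=1e-3, eps=1e-8):
    """True when a point violates the KKT conditions within tolerance:
    r2 < -tol while alpha2 can still grow (alpha2 < C), or
    r2 >  tol while alpha2 can still shrink (alpha2 > 0)."""
    return ((r2 < -tol) and (alpha2 < C - eps)) or \
           ((r2 > tol) and (alpha2 > eps))
```

A well-classified point outside the margin (r2 > 0, alpha2 = 0) and a bound support vector inside the margin (r2 < 0, alpha2 = C) both pass the test, so only genuine violators trigger an optimization step.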