Neural Network Theory and Applications
Homework Assignment 1

oxstar, SJTU
January 19, 2012

Problem one

One variation of the perceptron learning rule is

$$W^{new} = W^{old} + \alpha\, e\, p^T$$

$$b^{new} = b^{old} + \alpha\, e$$

where $\alpha$ is called the learning rate. Prove convergence of this algorithm. Does the proof require a limit on the learning rate? Explain.

Proof. We combine the weight matrix and the bias into a single vector:

$$x = \begin{bmatrix} W \\ b \end{bmatrix}, \qquad z_q = \begin{bmatrix} p_q \\ 1 \end{bmatrix}. \tag{1}$$

So the net input and the perceptron learning rule can be written as

$$W^T p + b = x^T z, \tag{2}$$

$$x^{new} = x^{old} + \alpha\, e\, z. \tag{3}$$

We take into account only those iterations for which the weight vector is changed, so the learning rule becomes (WLOG, assume that $x(0) = 0$)
$$x(k) = x(k-1) + \alpha\, dz(k-1) \tag{4}$$

$$\phantom{x(k)} = \alpha\, dz(0) + \alpha\, dz(1) + \dots + \alpha\, dz(k-1) \tag{5}$$

where each $dz \in \{z_Q, \dots, z_1, -z_1, \dots, -z_Q\}$. Assume that the correct weight vector is $x^*$; we can say that for every $dz$,

$$x^{*T} dz \ge \delta > 0. \tag{6}$$

From Equation 5 and Equation 6 we can show that

$$x^{*T} x(k) = \alpha\, x^{*T} dz(0) + \alpha\, x^{*T} dz(1) + \dots + \alpha\, x^{*T} dz(k-1) \tag{7}$$

$$\ge k \alpha \delta. \tag{8}$$

From the Cauchy–Schwarz inequality we have

$$\|x^*\|^2\, \|x(k)\|^2 \ge \left(x^{*T} x(k)\right)^2 \ge (k \alpha \delta)^2. \tag{9}$$

The weights will be updated only if the previous weights were incorrect, so we have $x^T(k-1)\, dz(k-1) \le 0$ and

$$\|x(k)\|^2 = x^T(k)\, x(k) \tag{10}$$

$$= \left[x(k-1) + \alpha\, dz(k-1)\right]^T \left[x(k-1) + \alpha\, dz(k-1)\right] \tag{11}$$

$$= \|x(k-1)\|^2 + 2\alpha\, x^T(k-1)\, dz(k-1) + \alpha^2 \|dz(k-1)\|^2 \tag{12}$$

$$\le \|x(k-1)\|^2 + \alpha^2 \|dz(k-1)\|^2 \tag{13}$$

$$\le \alpha^2 \left(\|dz(0)\|^2 + \dots + \|dz(k-1)\|^2\right) \tag{14}$$

$$\le \alpha^2 k \max\left(\|dz\|^2\right). \tag{15}$$

From Equation 9 and Equation 15 we have
$$\frac{(k \alpha \delta)^2}{\|x^*\|^2} \le \|x(k)\|^2 \le \alpha^2 k \max\left(\|dz\|^2\right), \tag{16}$$

$$k \le \frac{\|x^*\|^2 \max\left(\|dz\|^2\right)}{\delta^2}. \tag{17}$$

Now we have found that the proof does not require a limit on the learning rate, because the bound on $k$ does not depend on the learning rate. Intuitively, this problem is equivalent to multiplying all inputs by the learning rate ($\alpha e z = e(\alpha z)$); the cost of finding the correct boundary for proportionally scaled data does not change much.

Problem two

We have a classification problem with three classes of input vectors. The three classes are

class 1: $p_1 = \begin{pmatrix}1\\1\end{pmatrix}$, $p_2 = \begin{pmatrix}0\\2\end{pmatrix}$, $p_3 = \begin{pmatrix}3\\1\end{pmatrix}$

class 2: $p_4 = \begin{pmatrix}2\\1\end{pmatrix}$, $p_5 = \begin{pmatrix}2\\0\end{pmatrix}$, $p_6 = \begin{pmatrix}1\\2\end{pmatrix}$

class 3: $p_7 = \begin{pmatrix}1\\2\end{pmatrix}$, $p_8 = \begin{pmatrix}2\\1\end{pmatrix}$, $p_9 = \begin{pmatrix}1\\1\end{pmatrix}$

Implement the perceptron network based on the learning rule of Problem one to solve this problem. Run your program at different learning rates ($\alpha$ = 1, 0.8, 0.5, 0.3, 0.1); compare and discuss the results.

Ans. In my experiment, the learning rates are set from 0.1 to 1 with a step of 0.05. Here I show the number of iterations at different learning rates (Figure 1). Note that this perceptron algorithm can only handle two-class problems, so I use two phases of classification to solve the three-class problem.

I can hardly find any relationship between the learning rate and the number of iterations. Just as was proved above, the cost of finding the correct boundary is not influenced much by the learning rate. I also plotted the results of classification at learning rate $\alpha = 1$ (Figure 2).

[Figure 1: Times of Iteration at Different Learning Rates (x-axis: learning rates; y-axis: times of learning; series: Classification 1, Classification 2)]

[Figure 2: Result for Classification (P2), showing Class 1, Class 2, and Class 3]

Problem three

For the XOR problem as follows,

class 1: $p_1 = \begin{pmatrix}1\\0\end{pmatrix}$, $p_2 = \begin{pmatrix}0\\1\end{pmatrix}$; class 2: $p_3 = \begin{pmatrix}0\\0\end{pmatrix}$, $p_4 = \begin{pmatrix}1\\1\end{pmatrix}$

and the two-spiral problem delivered as the material: could the perceptron algorithm correctly classify these two problems? If not, explain why.

Ans. Just as the results (Figure 3 and Figure 4) show, the perceptron algorithm cannot correctly classify these two problems. As we know, single-layer perceptrons can only classify linearly separable vectors, while the vectors in these two problems are linearly inseparable.

[Figure 3: Result for Classification (XOR Problem), showing Class 1 and Class 2]

[Figure 4: Result for Classification (Two Spiral Problem), showing Class 1 and Class 2]
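As an illustration of the setup in Problem two, the learning rule $x^{new} = x^{old} + \alpha e z$ can be sketched in Python. This is not the original submission's code: `train_perceptron` is an assumed helper name, and the training data below is a hypothetical linearly separable two-class set (the signs of the assignment's vectors did not survive extraction), used only to show the effect of the learning rate.

```python
def train_perceptron(P, T, alpha, max_epochs=1000):
    """Two-class perceptron using the Problem-one rule x <- x + alpha*e*z,
    with augmented input z = [p; 1] and error e = t - hardlim(x^T z).
    Returns (x, update_count), or (x, None) if no error-free epoch occurs."""
    x = [0.0] * (len(P[0]) + 1)   # x = [W; b], initialised to zero
    updates = 0
    for _ in range(max_epochs):
        changed = False
        for p, t in zip(P, T):
            z = list(p) + [1.0]   # augmented input vector
            a = 1 if sum(xi * zi for xi, zi in zip(x, z)) >= 0 else 0
            e = t - a
            if e != 0:
                x = [xi + alpha * e * zi for xi, zi in zip(x, z)]
                updates += 1
                changed = True
        if not changed:           # an error-free epoch means convergence
            return x, updates
    return x, None

# Hypothetical linearly separable stand-in data (NOT the assignment's vectors).
P = [(1, 1), (0, 2), (3, 1), (-2, -1), (-2, 0), (-1, -2)]
T = [1, 1, 1, 0, 0, 0]
for alpha in (1, 0.8, 0.5, 0.3, 0.1):
    _, n = train_perceptron(P, T, alpha)
    print(f"alpha = {alpha}: {n} weight updates")
```

Because $x(0) = 0$, changing $\alpha$ only rescales the whole weight trajectory and never flips a classification decision, so every learning rate produces an identical sequence of updates; this matches the conclusion that the bound in Equation 17 is independent of $\alpha$.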
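The inseparability claim in Problem three can also be checked numerically. The sketch below (again not from the original homework) applies the same learning rule with $\alpha = 1$ to the XOR set and observes that no error-free pass over the data is ever reached within a generous epoch cap:

```python
# XOR: class 1 = {(1,0), (0,1)}, class 2 = {(0,0), (1,1)}
P = [(1, 0), (0, 1), (0, 0), (1, 1)]
T = [1, 1, 0, 0]

x = [0.0, 0.0, 0.0]               # x = [W; b]
converged = False
for _ in range(500):              # generous cap on epochs
    changed = False
    for p, t in zip(P, T):
        z = list(p) + [1.0]       # augmented input
        a = 1 if sum(xi * zi for xi, zi in zip(x, z)) >= 0 else 0
        e = t - a
        if e != 0:
            x = [xi + e * zi for xi, zi in zip(x, z)]  # alpha = 1
            changed = True
    if not changed:               # would require an error-free epoch
        converged = True
        break

print("converged:", converged)    # prints: converged: False
```

Since no hyperplane $x^T z = 0$ separates the two classes, at least one pattern is misclassified on every pass, so the loop can never exit early, in line with the answer above.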
