System Identification with BP Neural Network

Abstract:

This paper first introduces the MLP (multilayer perceptron) and the back-propagation algorithm, and presents a method of using a BP (back-propagation) neural network to realize system identification. The paper then shows how the method was applied to three systems with different system functions, and analyzes the effects of different parameters of the BP neural network.

Keywords:

MLP, Neurons, Hidden Layer, BP Neural Network.

1 Introduction to the MLP

Artificial Neural Networks (ANNs), or simply neural networks, are a branch of artificial intelligence. ANNs are a programming paradigm that seeks to emulate the microstructure of the brain, and they are used extensively in artificial intelligence problems, from simple pattern-recognition tasks to advanced symbolic manipulation.

The Multilayer Perceptron (MLP) is an example of an artificial neural network; it is used extensively for the solution of a number of different problems, including pattern recognition and interpolation. It is a development of the Perceptron neural network model, which was originally developed in the early 1960s but was found to have serious limitations.

Artificial Neural Networks attempt to model the functioning of the human brain. The human brain, for example, consists of billions of individual cells called neurons. It is believed that all knowledge and experience is encoded by the connections that exist between neurons. Given that the human brain consists of such a large number of neurons (too many to count with any certainty), the quantity and nature of the connections between neurons is, at present levels of understanding, almost impossible to assess.

Multilayer perceptrons form one type of neural network, as illustrated in the taxonomy in Fig. 0.1.

Fig. 0.1 A taxonomy of neural network architectures

The multilayer perceptron consists of a system of simple interconnected neurons, or nodes, as illustrated in Fig. 0.2; it is a model representing a nonlinear mapping between an input vector and an output vector.

Fig. 0.2 A multilayer perceptron with two hidden layers

In Fig. 0.2, y = [y_1, y_2, …, y_m]^T is the output vector. The nodes are connected by weights and output signals, which are a function of the sum of the inputs to the node modified by a simple nonlinear transfer, or activation, function. It is the superposition of many simple nonlinear transfer functions that enables the multilayer perceptron to approximate extremely nonlinear functions. If the transfer function were linear, the multilayer perceptron would only be able to model linear functions. Due to its easily computed derivative, a commonly used transfer function is the logistic function

f(s) = 1 / (1 + e^(−s))

as shown in Fig. 0.3. The output of a node is scaled by the connecting weight and fed forward to be an input to the nodes in the next layer of the network. This implies a direction of information processing; hence the multilayer perceptron is known as a feed-forward neural network. The architecture of a multilayer perceptron is variable, but in general it will consist of several layers of neurons. The input layer plays no computational role but merely serves to pass the input vector to the network. The terms input and output vectors refer to the inputs and outputs of the multilayer perceptron and can be represented as single vectors, as shown in Fig. 0.2. A multilayer perceptron may have one or more hidden layers and finally an output layer. Multilayer perceptrons are described as being fully connected, with each node connected to every node in the next and previous layers.

Fig. 0.3 The logistic function
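Because the derivative of the logistic function can be written in terms of the function itself, f′(s) = f(s)(1 − f(s)), it is cheap to evaluate during training. A minimal MATLAB sketch (the names f and fprime are illustrative, not from the paper):

f      = @(s) 1 ./ (1 + exp(-s));    % the logistic function defined above
fprime = @(s) f(s) .* (1 - f(s));    % derivative via f'(s) = f(s)(1 - f(s))

s = linspace(-5, 5, 201);            % sample the input range
plot(s, f(s));                       % reproduces the S-shaped curve of Fig. 0.3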

By selecting a suitable set of connecting weights and transfer functions, it has been shown that a multilayer perceptron can approximate any smooth, measurable function between the input and output vectors. Multilayer perceptrons have the ability to learn through training. Training requires a set of training data, which consists of a series of input vectors and associated output vectors. During training, the multilayer perceptron is repeatedly presented with the training data, and the weights in the network are adjusted until the desired input-output mapping occurs.

Multilayer perceptrons learn in a supervised manner. During training, the output from the multilayer perceptron for a given input vector may not equal the desired output. An error signal is defined as the difference between the desired and actual outputs. Training uses the magnitude of this error signal to determine to what degree the weights in the network should be adjusted so that the overall error of the multilayer perceptron is reduced. There are many algorithms that can be used to train a multilayer perceptron. Once trained with suitably representative training data, the multilayer perceptron can generalize to new, unseen input data.

The multilayer perceptron has been applied to a wide variety of tasks, all of which can be categorized as prediction, function approximation, or pattern classification. Prediction involves the forecasting of future trends in a time series of data, given current and previous conditions. Function approximation is concerned with modeling the relationship between variables. Pattern classification involves classifying data into discrete classes.

2 Back-Propagation Algorithm

Training a multilayer perceptron is the procedure by which the values of the individual weights are determined such that the relationship the network is modeling is accurately resolved. At this point we will consider a simple multilayer perceptron that contains only two weights. For any combination of weights, the network error for a given pattern can be defined. By varying the weights through all possible values and plotting the errors in three-dimensional space, we end up with a plot like the one shown in Fig. 1.1. Such a surface is known as an error surface. The objective of training is to find the combination of weights that results in the smallest error. In practice, it is not possible to plot such a surface, due to the multitude of weights. What is required is a method to find the minimum point of the error surface.

Fig. 1.1 An error surface for a simple multilayer perceptron containing only two weights.
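For illustration, an error surface of this kind can be generated with a short MATLAB sketch; the single-neuron network, the training pattern, and the weight range below are assumptions chosen for the plot, not taken from the paper.

f = @(s) 1 ./ (1 + exp(-s));         % logistic activation
x1 = 0.5; x2 = -0.3; d = 0.8;        % one input pattern and its desired output
[w1, w2] = meshgrid(-10:0.25:10);    % sweep both weights over a grid
E = 0.5 * (d - f(w1*x1 + w2*x2)).^2; % squared error at every weight pair
surf(w1, w2, E);                     % the three-dimensional error surface
xlabel('w_1'); ylabel('w_2'); zlabel('E');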

One possible technique is to use a procedure known as gradient descent. The back-propagation training algorithm uses this procedure to attempt to locate the absolute (or global) minimum of the error surface. The back-propagation algorithm is the most computationally straightforward algorithm for training the multilayer perceptron. Back-propagation has been shown to perform adequately in many applications; the majority of the applications discussed in this paper used back-propagation to train the multilayer perceptrons. Back-propagation refers only to the training algorithm; it is not another term for the multilayer perceptron or for feed-forward neural networks, as is commonly reported.

The weights in the network are initially set to small random values. This is synonymous with selecting a random point on the error surface. The back-propagation algorithm then calculates the local gradient of the error surface and changes the weights in the direction of the steepest local gradient. Given a reasonably smooth error surface, it is hoped that the weights will converge to the global minimum of the error surface.

The back-propagation algorithm is summarized below:

1. Initialize the weights and thresholds to small random values.
2. Present a training sample to the network and propagate the input forward through the layers to compute the output.
3. Compute the error signal, i.e. the difference between the desired and actual outputs.
4. Propagate the error backward through the network and adjust the weights in the direction of the steepest local gradient.
5. Repeat steps 2-4 for all samples; once every sample has been presented, one epoch is complete. Compute the performance index and stop if it meets the accuracy requirement; otherwise begin another epoch.

The error surface in Fig. 1.1 contains more than one minimum. It is desirable that the training algorithm does not become trapped in a local minimum. The back-propagation algorithm contains two adjustable parameters, a learning rate and a momentum term, which can assist the training process in avoiding this. The learning rate determines the step size taken during the iterative gradient-descent learning process. If it is too large, the network error will change erratically due to large weight changes, with the possibility of jumping over the global minimum. Conversely, if the learning rate is too small, training will take a long time. The momentum term is used to assist the gradient-descent process if it becomes stuck in a local minimum. By adding a proportion of the previous weight change to the current weight change (which will be very small in a local minimum), it is possible for the weights to escape the local minimum.
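As a minimal sketch, one such weight update can be written in MATLAB as follows; the values of the learning rate eta and the momentum term alpha, and the example sizes, are illustrative assumptions.

eta   = 0.1;                  % learning rate: step size along the gradient
alpha = 0.9;                  % momentum: proportion of the previous change
W       = 0.1 * randn(3, 2);  % small random initial weights (3 nodes, 2 inputs)
dW_prev = zeros(size(W));     % previous weight change, zero before training

x     = [0.5; -0.3];          % one input vector
delta = [0.2; -0.1; 0.05];    % error term of each node (assumed given here)

dW = eta * (delta * x') + alpha * dW_prev;  % gradient step plus momentum
W  = W + dW;                  % apply the update
dW_prev = dW;                 % remember it for the next iteration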

The BP algorithm is deduced from the steepest gradient-descent method. For the q-th sample, we define the error function as

E_q = (1/2) Σ_j (d_j − y_j)²

where d_j is the desired output of the q-th sample and y_j is the real output of the network.

According to the steepest gradient-descent method, we can get the adjustment of the weight of each connection as follows:

w_ij[k+1] = w_ij[k] + η δ_j x_i

For the output layer,

δ_j = (d_j − y_j) f′(s_j)

For the hidden and input layers,

δ_j = f′(s_j) Σ_k δ_k w_jk

In the formulas above, f(s) is the activation function, f′(s) is the derivative of f(s), and s is equal to the difference between the weighted sum of the inputs and the threshold of each neuron, i.e. s_j = Σ_i w_ij x_i − θ_j. η is the learning rate.

Turning to the threshold of each neuron, we can conclude a similar formula:

θ_j[k+1] = θ_j[k] − η δ_j

When the network has been trained with all the samples once, the algorithm finishes one epoch. Then the performance index E = Σ_q E_q is calculated. If the index meets the accuracy requirement, training ends; otherwise another training epoch begins.
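Putting these formulas together, the following MATLAB sketch performs one training epoch for a network with a single logistic hidden layer and a linear output node; the sizes, the sample data, and the learning rate are illustrative assumptions rather than the paper's settings.

f      = @(s) 1 ./ (1 + exp(-s));       % logistic activation
fprime = @(s) f(s) .* (1 - f(s));       % its derivative

X = linspace(-1, 1, 9);                 % 9 training inputs
D = X.^2;                               % desired outputs (an example target)
nh  = 5;                                % number of hidden neurons
eta = 0.5;                              % learning rate

W1 = 0.1*randn(nh, 1);  th1 = 0.1*randn(nh, 1);  % hidden weights, thresholds
W2 = 0.1*randn(1, nh);  th2 = 0.1*randn;         % output weights, threshold

E = 0;                                  % performance index for this epoch
for q = 1:numel(X)
    x = X(q);  d = D(q);
    s1 = W1*x - th1;   z = f(s1);       % forward pass: hidden layer
    y  = W2*z - th2;                    % forward pass: linear output node
    e  = d - y;        E = E + 0.5*e^2; % accumulate E_q

    delta2 = e;                         % output delta (f' = 1 for a linear node)
    delta1 = fprime(s1) .* (W2' * delta2);  % hidden deltas, propagated back

    W2  = W2  + eta * delta2 * z';      % w_ij[k+1] = w_ij[k] + eta*delta_j*x_i
    th2 = th2 - eta * delta2;           % threshold updates
    W1  = W1  + eta * delta1 * x;
    th1 = th1 - eta * delta1;
end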

3 Exercise

(1) 9 training samples uniformly distributed in the region of …, and 361 test samples.

I chose an MLP model with one hidden layer, and applied different numbers of neurons in the hidden layer to study the effect of the number of neurons.

I chose 9 sets of uniform data to train the network, and then tested the network with 361 sets of uniform data. Matlab was chosen as the simulation tool. The performance index was set as E < 0.01.
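A setup along these lines can be expressed with the MATLAB Neural Network Toolbox, as sketched below; the placeholder system function g, the hidden-layer size, and the one-dimensional sampling are assumptions, since the actual system functions are not reproduced here.

xTrain = linspace(-1, 1, 9);      % 9 uniformly spaced training samples
xTest  = linspace(-1, 1, 361);    % 361 uniformly spaced test samples
g = @(x) sin(pi*x);               % placeholder for the system function
dTrain = g(xTrain);               % desired outputs of the training set

net = feedforwardnet(5);          % MLP with one hidden layer of 5 neurons
net.trainParam.goal = 0.01;       % stop when the performance index E < 0.01
net = train(net, xTrain, dTrain); % back-propagation-based training

yTest = net(xTest);               % network output on the test samples
testError = yTest - g(xTest);     % compare with the desired output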

Due to the existence of zeros in the desired output, the relative error will be huge in the area near the zeros, and that will make the relative error u…
