
ARTIFICIAL NEURAL NETWORK FOR LOAD FORECASTING IN SMART GRID

HAO-TIAN ZHANG, FANG-YUAN XU, LONG ZHOU

Energy System Group, City University London, Northampton Square, London, UK

E-MAIL:

,long.zhou.

Abstract:

It is an irresistible trend of electric power development to build the smart grid, which applies a large number of new technologies in power generation, transmission, distribution and utilization to achieve optimization of the power configuration and energy saving. As one of the key links in making a grid smarter, load forecasting plays a significant role in the planning and operation of a power system. Many approaches, such as Expert Systems, Grey System Theory and Artificial Neural Networks (ANN), are employed in load forecasting to perform the simulation. This paper intends to illustrate the performance of an ANN applied to load forecasting, based on the practical situation of Ontario Province, Canada.

Keywords:

Load forecast; Artificial Neural Network; back propagation training; Matlab

1. Introduction

Load forecasting is vitally beneficial to the power system industry in many aspects. As an essential part of the smart grid, high accuracy of the load forecast is required to give exact information about power purchasing and generation in the electricity market, to prevent energy from being wasted or abused, and to keep the electricity price within a reasonable range. Factors such as seasonal differences, climate changes, weekends and holidays, disasters and political events, operation scenarios of the power plants and faults occurring on the network all lead to changes in the load demand and generation.

Since 1990, the artificial neural network (ANN) has been researched for application to load forecasting. "ANNs are massively parallel networks of simple processing elements designed to emulate the functions and structure of the brain to solve very complex problems". Owing to these transcendent characteristics, the ANN is one of the most competent methods for practical work such as load forecasting. This paper concerns the behaviour of the artificial neural network in load forecasting. An analysis of the factors affecting the load demand in Ontario, Canada is made in order to give an effective way of forecasting the load in Ontario.

2. Back Propagation Network

2.1. Background

Because of its outstanding statistical and modelling capabilities, an ANN can deal with non-linear and complex problems in terms of classification or forecasting. As the problem is defined here, the relationship between the input and the target is non-linear and very complicated, so an ANN is an appropriate method to apply to forecasting the load situation. To apply an ANN to load forecasting, a network type needs to be selected, such as Feed-forward Back Propagation, Layer Recurrent, Feed-forward time-delay and so on. To date, back propagation is widely used in neural networks; it is a feed-forward network with continuously valued functions and supervised learning. It can match the input data and the corresponding output in an appropriate way so as to approximate a certain function, which is then used for achieving an expected goal given new data of the same form as the input.

2.2. Architecture of the back propagation algorithm

Figure 1 shows a single neuron model of the back propagation algorithm. Generally, the output is a function of the sum of the bias and the weights multiplied by the inputs. The activation function can be any kind of function; however, the generated output differs depending on which is chosen.
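As an illustration of this neuron model, the sketch below computes the output of a single neuron in Matlab. The vectors x and w, the bias b and all numerical values are purely illustrative and are not taken from the paper; tansig is assumed to be available from the Neural Network Toolbox.

x = [0.2; -0.5; 0.8];   % example input vector (illustrative values)
w = [0.4; 0.1; -0.3];   % example weight vector (illustrative values)
b = 0.05;               % bias
n = w' * x + b;         % weighted sum of the inputs plus the bias
a = tansig(n);          % neuron output after the activation function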

Because it is a feed-forward network, in general at least one hidden layer before the output layer is needed. A three-layer network is selected as the architecture, because this kind of architecture can approximate any function with a few discontinuities. The architecture with three layers is shown in Figure 2 below:

Figure 1. Neuron model of the back propagation algorithm

Figure 2. Architecture of the three-layer feed-forward network

Basically, there are three activation functions applied in the back propagation algorithm, namely the Log-Sigmoid, the Tan-Sigmoid and the Linear Transfer Function. The output range of each function is illustrated in Figure 3 below.

Figure 3. Activation functions applied in back propagation: (a) Log-sigmoid, (b) Tan-sigmoid, (c) Linear function
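For reference, a minimal Matlab sketch of the three transfer functions is given below, assuming the standard Neural Network Toolbox names logsig, tansig and purelin; the sample net inputs n are illustrative.

n = -5:0.1:5;         % example net inputs
a_log = logsig(n);    % Log-Sigmoid: 1./(1+exp(-n)), output range (0, 1)
a_tan = tansig(n);    % Tan-Sigmoid: 2./(1+exp(-2*n))-1, output range (-1, +1)
a_lin = purelin(n);   % Linear transfer function: a = n, unbounded output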

2.3. Training function selection

The training algorithms employed are based on the back propagation approach, and the corresponding functions are integrated in the Matlab Neural Network toolbox.

TABLE I. TRAINING FUNCTIONS IN MATLAB'S NN TOOLBOX
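The original table is not reproduced here. For reference only (this listing is an assumption based on the standard Matlab Neural Network Toolbox, not a copy of the original table), commonly available back propagation training functions include:

trainlm  - Levenberg-Marquardt
trainbr  - Bayesian regularization
trainscg - Scaled conjugate gradient
trainrp  - Resilient back propagation (Rprop)
traingd  - Gradient descent
traingdm - Gradient descent with momentum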

3. Training Procedures

3.1. Background analysis

The neural network training is based on the load demand and weather conditions in Ontario Province, which is located in the south of Canada. According to the weather conditions, the region of Ontario can be divided into three parts: southwest, central and east, and north. The population is gathered around the southeastern part of the province, which includes two of the largest cities of Canada, Toronto and Ottawa.

3.2. Data Acquisition

The required training data can be divided into two parts: input vectors and output targets. For load forecasting, the input vectors for training include all the information about the factors affecting the load demand, such as weather information, holidays or working days, faults occurring in the network and so on. The output targets are the real-time load scenarios, that is, the demand observed at the same time as the input vectors change.

Owing to practical restrictions, this study only considers the weather information and a logical adjustment for weekdays and weekends as the factors affecting the load status. In this paper, the factors affecting the load change are listed below (a sketch of assembling one hourly input vector from these factors follows the list):

(1) Temperature (℃)

(2) Dew Point Temperature (℃)

(3) Relative Humidity (%)

(4) Wind Speed (km/h)

(5) Wind Direction (tens of degrees)

(6) Visibility (km)

(7) Atmospheric Pressure (kPa)

(8) Logical adjustment for weekday or weekend
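The sketch below shows how one hourly training sample could be assembled from these eight factors in Matlab; every variable name and every numerical value is a placeholder chosen for illustration, not measured data from the study.

temperature = 21.0;   % (1) Temperature, degrees C
dew_point   = 14.0;   % (2) Dew point temperature, degrees C
humidity    = 65;     % (3) Relative humidity, %
wind_speed  = 12;     % (4) Wind speed, km/h
wind_dir    = 27;     % (5) Wind direction, tens of degrees
visibility  = 24.1;   % (6) Visibility, km
pressure    = 101.3;  % (7) Atmospheric pressure, kPa
is_weekday  = 1;      % (8) Logical adjustment: 1 = weekday, 0 = weekend
input_vector = [temperature; dew_point; humidity; wind_speed; ...
                wind_dir; visibility; pressure; is_weekday];
target = 16500;       % corresponding hourly load demand in MW (placeholder)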

According to the information gathered above, the weather information of Toronto, taken as representative of the whole of Ontario Province, is chosen for data acquisition. The data was gathered hourly from the historical weather records kept by the weather stations. The load demand data also needs to be gathered hourly and correspondingly. In this paper, two years of weather data and load data are collected to train and test the created network.

3.3. Data Normalization

To prevent the simulated neurons from being driven too far into saturation, all of the gathered data needs to be normalized after acquisition. As in a per-unit system, each input and target value is divided by the maximum absolute value of the corresponding factor. Each value of the normalized data lies within the range between -1 and +1, so that the ANN can recognize the data easily. In addition, weekdays are represented as 1 and weekends are represented as 0.
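A minimal Matlab sketch of this per-unit style normalization is shown below, assuming the training inputs are stored column-wise in an 8-by-N matrix P and the targets in a 1-by-N vector T (these names are chosen here for illustration):

% Divide every row by its own maximum absolute value so that all
% normalized values fall within the range [-1, +1].
P_norm = P ./ repmat(max(abs(P), [], 2), 1, size(P, 2));
T_norm = T ./ max(abs(T));
% The weekday/weekend flag is already 1 (weekday) or 0 (weekend).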

3.4. Neural network creation

The toolbox in Matlab is used for training and simulating the neural network. The layout of the neural network consists of the number of neurons and layers, the connectivity of the layers, the activation functions, the error goal and so on. Setting the framework and parameters of the network depends on the practical situation, and the architecture of the ANN can be selected to achieve an optimized result. Matlab is one of the best simulation tools for providing visible windows. The three-layer architecture shown in Figure 2 above has been chosen for the simulation; it is adequate for approximating an arbitrary function if there are sufficient nodes in the hidden layer.

Because the practical input values range from -1 to +1, the transfer function of the first layer is set to tansig, which is a hyperbolic tangent sigmoid transfer function. The transfer function of the output layer is set to a linear function, which calculates the layer's output from its net input. There is one advantage of the linear output transfer function: because linear output neurons allow the output to take on any value, there is no difficulty in finding the differences between the output and the target.
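A sketch of creating such a three-layer network is given below, assuming the classic newff syntax of the Matlab Neural Network Toolbox (newer releases use feedforwardnet instead); the hidden layer size H is an illustrative choice, not the value used in the paper.

H = 20;                                                 % hidden layer size (illustrative)
net = newff(P_norm, T_norm, H, {'tansig', 'purelin'});  % tansig hidden layer, linear output layer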

The next step is the selection of the number of neurons and the training function. Generally, trainbr and trainlm are the best choices among all of the training functions in the Matlab toolbox.

Trainlm (Levenberg-Marquardt algorithm) is the fastest training algorithm for networks of moderate size. However, a significant problem is that it requires the storage of some matrices which can be large for certain problems. When the training set is large, the trainlm algorithm reduces the memory usage and computes an approximate Hessian matrix of n×n dimensions. Another drawback of trainlm is that over-fitting occurs when the number of neurons is too large; basically, the number of neurons should not be too large when the trainlm algorithm is employed in the network. Trainbr (Bayesian regularization) is a modification of the Levenberg-Marquardt training method that creates networks which generalize well, so that the optimal network architecture can be determined more easily. The impact of the effectively used weights and biases of the network can be seen clearly with this algorithm, and the number of effective weights and biases does not change much as the dimension of the network grows. The trainbr algorithm performs best after the network inputs and outputs have been normalized into the range from -1 to +1. An important point when using trainbr is that the algorithm should not be stopped until the effective number of parameters has converged. More details are available in the Matlab neural network toolbox documentation.
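A hedged sketch of selecting one of these training functions and training the network is shown below; the epoch limit and error goal are illustrative values, not those of the original study.

net.trainFcn = 'trainbr';           % or 'trainlm' (Levenberg-Marquardt)
net.trainParam.epochs = 1000;       % maximum number of training epochs (illustrative)
net.trainParam.goal = 1e-3;         % error goal (illustrative)
net = train(net, P_norm, T_norm);   % supervised training on the normalized data
forecast = sim(net, P_norm);        % simulate the trained network on the inputs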

The number of neurons in the first layer can also be selected to optimize the network so that an expected result can be achieved. Generally speaking, the more complicated the architecture of the network is, the more accurate the output result will be; however, the higher the chance that the algor
