ARTIFICIAL NEURAL NETWORK FOR LOAD FORECASTING IN SMART GRID
HAO-TIAN ZHANG, FANG-YUAN XU, LONG ZHOU
Energy System Group, City University London, Northampton Square, London, UK
E-MAIL: , long.zhou.
Abstract:
Developing the smart grid is an irresistible trend in the improvement of electric power systems: it applies a large number of new technologies in power generation, transmission, distribution and utilization to optimize the power configuration and save energy. As one of the key links in making a grid smarter, load forecasting plays a significant role in power system planning and operation. Many methods, such as Expert Systems, Grey System Theory and Artificial Neural Networks (ANN), are employed in load forecasting simulation. This paper illustrates the application of an ANN to load forecasting based on the practical situation in Ontario Province, Canada.
Keywords:
Load forecast; Artificial Neural Network; back propagation training; Matlab
1. Introduction
Load forecasting is vitally beneficial to the power system industries in many aspects. As an essential part of the smart grid, high accuracy in load forecasting is required to give exact information about power purchasing and generation in the electricity market, to prevent energy from being wasted or abused, and to keep the electricity price in a reasonable range. Factors such as seasonal differences, climate changes, weekends and holidays, disasters and political events, operation scenarios of the power plants and faults occurring on the network all lead to changes in load demand and generation.
Since 1990, the artificial neural network (ANN) has been researched for application to load forecasting. "ANNs are massively parallel networks of simple processing elements designed to emulate the functions and structure of the brain to solve very complex problems". Owing to these characteristics, ANNs are one of the most competent methods for practical tasks such as load forecasting. This paper concerns the behaviour of artificial neural networks in load forecasting. An analysis of the factors affecting the load demand in Ontario, Canada is made to give an effective way of forecasting the load in Ontario.
2. Back Propagation Network
2.1. Background
Because of their outstanding statistical and modelling capabilities, ANNs can deal with non-linear and complex problems in terms of classification or forecasting. As the problem is defined here, the relationship between the input and the target is non-linear and very complicated, so an ANN is an appropriate method to apply to forecasting the load. To apply an ANN to load forecasting, a network type must be selected, such as feed-forward back propagation, layer recurrent or feed-forward time-delay. To date, back propagation is the most widely used approach in neural networks; it is a feed-forward network with continuously valued functions and supervised learning. It matches the input data to the corresponding output so as to approximate a function, which can then produce the expected output for new data of the same form as the input.
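The supervised-learning idea described above can be sketched with a deliberately minimal example: a single linear neuron is shown input/target pairs, and the error between its output and the target is propagated back into the weights. All values here are illustrative, not from the paper.

```python
import numpy as np

# Minimal supervised-learning sketch: one linear neuron trained by
# gradient descent (the delta rule) to approximate the target t = 2*x.
w, b, lr = 0.0, 0.0, 0.1
xs = np.array([0.0, 1.0, 2.0, 3.0])
ts = 2.0 * xs                      # target outputs
for _ in range(500):
    for x, t in zip(xs, ts):
        y = w * x + b              # forward pass
        e = t - y                  # error between target and output
        w += lr * e * x            # propagate the error back into the weight
        b += lr * e                # ... and the bias
```

After training, the weight converges close to 2 and the bias close to 0, i.e. the neuron has approximated the underlying function from the input/target pairs alone.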
2.2. Architecture of the back propagation algorithm
Figure 1 shows a single neuron model of the back propagation algorithm.
Generally, the output is a function of the sum of the bias and the weights multiplied by the inputs. The activation function can be any kind of function; each choice, however, generates a different output.
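The relationship just described (Figure 1) can be sketched as follows; the function name and sample values are illustrative:

```python
import numpy as np

def neuron_output(x, w, b, activation=np.tanh):
    """Single neuron: activation applied to the weighted input sum plus bias."""
    return activation(np.dot(w, x) + b)

# With a linear activation the neuron reduces to w.x + b:
# 0.5*1.0 + (-0.25)*2.0 + 0.1 = 0.1
y = neuron_output(np.array([1.0, 2.0]), np.array([0.5, -0.25]), 0.1,
                  activation=lambda s: s)
```

Swapping the `activation` argument (e.g. `np.tanh` instead of the identity) changes the generated output for the same weights, bias and input, which is the point made above.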
Owing to the feed-forward structure, in general at least one hidden layer is needed before the output layer. A three-layer network is selected as the architecture, because this kind of architecture can approximate any function with a finite number of discontinuities. The three-layer architecture is shown in Figure 2 below:
Figure 1. Neuron model of the back propagation algorithm
Figure 2. Architecture of the three-layer feed-forward network
Basically, three activation functions are applied in the back propagation algorithm, namely the log-sigmoid, tan-sigmoid and linear transfer functions. The output range of each function is illustrated in Figure 3 below.
Figure 3. Activation functions applied in back propagation: (a) log-sigmoid (b) tan-sigmoid (c) linear function
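The three activations of Figure 3 can be written out directly (the names follow Matlab's logsig/tansig/purelin convention; the implementations below are standard definitions, not toolbox code):

```python
import numpy as np

def logsig(n):
    """Log-sigmoid: output range (0, 1)."""
    return 1.0 / (1.0 + np.exp(-n))

def tansig(n):
    """Tan-sigmoid (hyperbolic tangent): output range (-1, 1)."""
    return np.tanh(n)

def purelin(n):
    """Linear transfer function: output unbounded, equal to the net input."""
    return n
```

The differing output ranges are what Figure 3 contrasts: logsig squashes into (0, 1), tansig into (-1, 1), and purelin passes the net input through unchanged.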
2.3. Training function selection
Training algorithms based on the back propagation approach are employed; the corresponding functions are integrated in the Matlab neural network toolbox and listed in Table 1.
Function name    Algorithm
trainb           Batch training with weight & bias learning rules
trainbfg         BFGS quasi-Newton backpropagation
trainbr          Bayesian regularization
trainc           Cyclical order incremental training w/ learning functions
traincgb         Powell-Beale conjugate gradient backpropagation
traincgf         Fletcher-Powell conjugate gradient backpropagation
traincgp         Polak-Ribiere conjugate gradient backpropagation
traingd          Gradient descent backpropagation
traingdm         Gradient descent with momentum backpropagation
traingda         Gradient descent with adaptive lr backpropagation
traingdx         Gradient descent w/ momentum & adaptive lr backpropagation
trainlm          Levenberg-Marquardt backpropagation
trainoss         One step secant backpropagation
trainr           Random order incremental training w/ learning functions
trainrp          Resilient backpropagation (Rprop)
trains           Sequential order incremental training w/ learning functions
trainscg         Scaled conjugate gradient backpropagation

TABLE 1. TRAINING FUNCTIONS IN MATLAB'S NN TOOLBOX
3. Training Procedures
3.1. Background analysis
The neural network training is based on the load demand and weather conditions in Ontario Province, which is located in the south of Canada. Ontario can be divided into three regions according to weather conditions: southwest, central and east, and north. The population is gathered in the southeastern part of the province, which includes two of the largest cities in Canada, Toronto and Ottawa.
3.2. Data Acquisition
The required training data can be divided into two parts:
input vectors and output targets. For load forecasting, the input vectors for training include all the information on factors affecting changes in load demand, such as weather information, holidays or working days, faults occurring in the network and so on. The output targets are the real-time load scenarios, i.e. the demand present at the same time as the input vectors change.
Owing to practical restrictions, this study only considers the weather information and a logical flag for weekdays versus weekends as the factors affecting the load status. The factors affecting the load change considered in this paper are listed below:
(1). Temperature (°C)
(2). Dew Point Temperature (°C)
(3). Relative Humidity (%)
(4). Wind Speed (km/h)
(5). Wind Direction (10s deg)
(6). Visibility (km)
(7). Atmospheric Pressure (kPa)
(8). Logical flag for weekday or weekend
According to the information listed above, the weather information for Toronto, taken as representative of the whole Ontario province, is chosen for data acquisition. The data was gathered hourly from the historical weather records kept by the weather stations. Load demand data also needs to be gathered hourly, at the corresponding times. In this paper, two years of weather data and load data are collected to train and test the created network.
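One hourly training pair built from the eight factors above might look as follows. This is a hypothetical sketch: the function name, argument names and sample readings are invented for illustration, not taken from the paper's data set.

```python
import numpy as np

def make_sample(temp, dew_point, rel_humidity, wind_speed,
                wind_dir, visibility, pressure, is_weekday, load_mw):
    """Pack the eight hourly factors into an input vector x and the
    simultaneously recorded load demand into the target t."""
    x = np.array([temp, dew_point, rel_humidity, wind_speed,
                  wind_dir, visibility, pressure, float(is_weekday)])
    t = np.array([load_mw])
    return x, t

# One illustrative winter hour in Toronto (values invented).
x, t = make_sample(-5.0, -9.0, 71.0, 22.0, 27.0, 16.0, 100.9,
                   is_weekday=True, load_mw=17500.0)
```

Two years of hourly records would then yield roughly 17,500 such (x, t) pairs for training and testing.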
3.3. Data Normalization
To prevent the simulated neurons from being driven too far into saturation, all of the gathered data needs to be normalized after acquisition. As in a per-unit system, each input and target value is divided by the maximum absolute value of the corresponding factor. Each normalized value then lies within the range -1 to +1, so that the ANN can recognize the data easily. In addition, weekdays are represented as 1 and weekends as 0.
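The per-unit style scaling described above can be sketched as a small helper (the example values are illustrative):

```python
import numpy as np

def normalize(data):
    """Divide each column (factor) by its maximum absolute value,
    mapping every entry into [-1, +1]; returns the scaled data and
    the per-factor scales needed to de-normalize predictions later."""
    scales = np.max(np.abs(data), axis=0)
    return data / scales, scales

# Two illustrative factors: temperature (°C) and pressure-like values.
raw = np.array([[-20.0,  990.0],
                [ 10.0, 1010.0],
                [ 25.0, 1020.0]])
scaled, scales = normalize(raw)
```

Keeping the returned `scales` matters in practice: the network's normalized output must be multiplied back by the load's scale to recover a forecast in MW.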
3.4. Neural network creation
The toolbox in Matlab is used for training and simulating the neural network. The layout of the neural network comprises the number of neurons and layers, the connectivity of the layers, the activation functions, the error goal and so on. The framework and parameters of the network are set according to the practical situation, and the architecture of the ANN can be selected to achieve an optimized result. Matlab is one of the best simulation tools and provides visual windows. A three-layer architecture, as shown in Figure 2 above, has been chosen for the simulation; it is adequate to approximate an arbitrary function if the hidden layer has sufficient nodes.
Because the practical input values range from -1 to +1, the transfer function of the first layer is set to tansig, the hyperbolic tangent sigmoid transfer function. The transfer function of the output layer is set to a linear function, which calculates the layer's output directly from its net input. The linear output transfer function has one advantage:
because linear output neurons allow the output to take on any value, there is no difficulty in finding the differences between output and target.
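The tansig-hidden/linear-output arrangement just described amounts to the following forward pass. The hidden size and random initial weights are illustrative assumptions, not the paper's trained values.

```python
import numpy as np

# Three-layer forward pass: tansig hidden layer, linear output layer.
# Sizes are illustrative: 8 inputs (the factors of Section 3.2),
# 20 hidden neurons, 1 output (the load).
rng = np.random.default_rng(1)
n_in, n_hidden, n_out = 8, 20, 1
W1 = rng.standard_normal((n_hidden, n_in)) * 0.1
b1 = np.zeros(n_hidden)
W2 = rng.standard_normal((n_out, n_hidden)) * 0.1
b2 = np.zeros(n_out)

def forward(x):
    h = np.tanh(W1 @ x + b1)   # tansig hidden layer, outputs in (-1, 1)
    return W2 @ h + b2         # linear output layer, unbounded

y = forward(np.zeros(n_in))
```

The unbounded linear output is what lets the network express any normalized load value and makes the output-target difference directly comparable, as noted above.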
The next step is the selection of the number of neurons and the training function.
Generally, trainbr and trainlm are the best choices among all of the training functions in the Matlab toolbox.
Trainlm (the Levenberg-Marquardt algorithm) is the fastest training algorithm for networks of moderate size. However, it has the significant drawback of requiring the storage of some matrices that can be quite large for certain problems. When the training set is large, the trainlm algorithm can reduce the memory used, but it always computes the approximate Hessian matrix, which has n×n dimensions. Another drawback of trainlm is that over-fitting will occur when the number of neurons is too large; basically, the number of neurons should not be too large when the trainlm algorithm is employed in the network. Trainbr (Ba
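The weight update underlying Levenberg-Marquardt training can be sketched generically as follows; this is a textbook illustration of the step, not the toolbox's internal code, and the Jacobian values are invented.

```python
import numpy as np

def lm_step(J, e, mu):
    """One Levenberg-Marquardt update: dw = (J^T J + mu*I)^-1 J^T e,
    where J is the Jacobian of the error vector e w.r.t. the n weights.
    J^T J is the n-by-n approximate Hessian whose storage the text
    warns about; mu blends between gradient descent (large mu) and
    Gauss-Newton (mu -> 0)."""
    n = J.shape[1]
    H = J.T @ J                              # approximate Hessian, n x n
    return np.linalg.solve(H + mu * np.eye(n), J.T @ e)

J = np.array([[1.0, 0.0],                    # toy Jacobian: 2 errors,
              [0.0, 2.0]])                   # 2 weights
e = np.array([1.0, 1.0])                     # current error vector
dw = lm_step(J, e, mu=0.0)                   # pure Gauss-Newton step
```

The n×n Hessian approximation in `H` is exactly why memory grows quickly with the number of weights, which is the storage drawback noted above.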