ARTIFICIAL NEURAL NETWORK FOR LOAD FORECASTING

IN SMART GRID

HAO-TIAN ZHANG, FANG-YUAN XU, LONG ZHOU

Energy System Group, City University London, Northampton Square, London, UK

E-MAIL: , long.zhou.

Abstract:

Developing the smart grid is an irresistible trend in the improvement of electric power systems: it applies a large number of new technologies in power generation, transmission, distribution and utilization to optimize the power configuration and save energy. As one of the key links in making a grid smarter, load forecasting plays a significant role in power system planning and operation. Many approaches, such as Expert Systems, Grey System Theory and the Artificial Neural Network (ANN), are employed in load forecasting to do the simulation. This paper intends to illustrate the application of the ANN to load forecasting based on the practical situation in Ontario Province, Canada.

Keywords:

Load forecast; Artificial Neural Network; back propagation training; Matlab

1. Introduction

Load forecasting is vitally beneficial to the power system industries in many aspects. As an essential part of the smart grid, high accuracy of load forecasting is required to give exact information about power purchasing and generation in the electricity market, to prevent energy from being wasted and abused, and to keep the electricity price within a reasonable range. Factors such as seasonal differences, climate changes, weekends and holidays, disasters and political reasons, operation scenarios of the power plants and faults occurring on the network lead to changes of the load demand and generation.

Since 1990, the artificial neural network (ANN) has been researched for application to load forecasting. "ANNs are massively parallel networks of simple processing elements designed to emulate the functions and structure of the brain to solve very complex problems". Owing to these transcendent characteristics, the ANN is one of the most competent methods for practical work such as load forecasting. This paper concerns the behaviour of the artificial neural network in load forecasting. An analysis of the factors affecting the load demand in Ontario, Canada is made to give an effective way of forecasting the load in Ontario.

2. Back Propagation Network

2.1. Background

Because of its outstanding statistical and modeling capabilities, an ANN can deal with non-linear and complex problems in terms of classification or forecasting. As the problem is defined here, the relationship between the input and the target is non-linear and very complicated, so an ANN is an appropriate method to apply to forecasting the load situation. To apply it to load forecasting, an ANN needs a network type to be selected, such as Feed-forward Back Propagation, Layer Recurrent or Feed-forward time-delay. To date, back propagation is widely used in neural networks; it is a feed-forward network with continuously valued functions and supervised learning. It can match the input data and the corresponding output in an appropriate way to approach a certain function, which is used for achieving an expected goal with some previous data in the same manner as the input.

2.2. Architecture of the back propagation algorithm

Figure 1 shows a single neuron model of the back propagation algorithm.

Generally, the output is a function of the sum of the bias and the weights multiplied by the inputs. The activation function could be any kind of function; however, the generated output differs accordingly.
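As a minimal illustration of this neuron model (the numbers below are invented for the example, not taken from the paper), the output a is the activation function applied to the weighted sum of the inputs plus the bias:

p = [0.3; -0.8; 0.5];          % example input vector
w = [0.2, -0.5, 0.9];          % example weight row vector
b = 0.1;                       % example bias
n = w*p + b;                   % net input: bias plus weighted inputs
a = 2/(1 + exp(-2*n)) - 1;     % tan-sigmoid activation (the toolbox tansig)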

In a feed-forward network, in general, at least one hidden layer before the output layer is needed. A three-layer network is selected as the architecture, because this kind of architecture can approximate any function with a few discontinuities. The architecture with three layers is shown in Figure 2 below:

Figure 1. Neuron model of the back propagation algorithm

Figure 2. Architecture of the three-layer feed-forward network

Basically, there are three activation functions applied in the back propagation algorithm, namely Log-Sigmoid, Tan-Sigmoid and the Linear Transfer Function. The output range of each function is illustrated in Figure 3 below.

Figure 3. Activation functions applied in back propagation: (a) Log-sigmoid (b) Tan-sigmoid (c) linear function
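Written out explicitly, the three activation functions of Figure 3 can be sketched as anonymous functions (in the toolbox they correspond to logsig, tansig and purelin):

f_logsig  = @(n) 1 ./ (1 + exp(-n));         % Log-Sigmoid, output range (0, 1)
f_tansig  = @(n) 2 ./ (1 + exp(-2*n)) - 1;   % Tan-Sigmoid, output range (-1, +1)
f_purelin = @(n) n;                          % Linear transfer, unbounded output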

2.3. Training function selection

The training function algorithms based on the back propagation approach are listed in Table 1 below; these functions are integrated in the Matlab neural network toolbox.

Function name   Algorithm
trainb          Batch training with weight and bias learning rules
trainbfg        BFGS quasi-Newton backpropagation
trainbr         Bayesian regularization
trainc          Cyclical order incremental training with learning functions
traincgb        Powell-Beale conjugate gradient backpropagation
traincgf        Fletcher-Powell conjugate gradient backpropagation
traincgp        Polak-Ribiere conjugate gradient backpropagation
traingd         Gradient descent backpropagation
traingdm        Gradient descent with momentum backpropagation
traingda        Gradient descent with adaptive lr backpropagation
traingdx        Gradient descent with momentum and adaptive lr backpropagation
trainlm         Levenberg-Marquardt backpropagation
trainoss        One step secant backpropagation
trainr          Random order incremental training with learning functions
trainrp         Resilient backpropagation (Rprop)
trains          Sequential order incremental training with learning functions
trainscg        Scaled conjugate gradient backpropagation

TABLE 1. TRAINING FUNCTIONS IN MATLAB'S NN TOOLBOX
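As a hedged sketch of how one of the functions in Table 1 is selected (the hidden-layer size of 20 here is only illustrative, not a value from the paper), any name from the table can be assigned as the training function of a toolbox network object:

net = feedforwardnet(20);       % feed-forward back propagation network, 20 hidden neurons
net.trainFcn = 'trainlm';       % Levenberg-Marquardt backpropagation
% net.trainFcn = 'trainbr';     % or Bayesian regularization, etc.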

3. Training Procedures

3.1. Background analysis

The neural network training is based on the load demand and weather conditions in Ontario Province, Canada, which is located in the south of Canada. The region of Ontario can be divided into three parts, namely southwest, central and east, and north, according to the weather conditions. The population is gathered around the southeastern part of the province, which includes two of the largest cities of Canada, Toronto and Ottawa.

3.2. Data Acquisition

The required training data can be divided into two parts: input vectors and output targets. For load forecasting, the input vectors for training include all the information on factors affecting the load demand change, such as weather information, holidays or working days, faults occurring in the network and so on. The output targets are the real-time load scenarios, i.e. the demand presented at the same time as the input vectors change.

Owing to conditional restrictions, this study only considers the weather information and a logical adjustment for weekdays and weekends as the factors affecting the load status. In this paper, the factors affecting the load change are listed below:

(1). Temperature (°C)

(2). Dew Point Temperature (°C)

(3). Relative Humidity (%)

(4). Wind Speed (km/h)

(5). Wind Direction (10s deg)

(6). Visibility (km)

(7). Atmospheric Pressure (kPa)

(8). Logical adjustment of weekday or weekend

According to the information gathered above, the weather information in Toronto, taken in place of the whole Ontario province, is chosen for data acquisition. The data was gathered hourly from the historical weather conditions kept by the weather stations. Load demand data also needs to be gathered hourly and correspondingly. In this paper, two years of weather data and load data are collected to train and test the created network.
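A minimal sketch of how such an hourly training set could be arranged for the toolbox is shown below; the variable names are illustrative assumptions, with each column of P holding the eight factors for one hour and the matching column of T holding the observed Ontario demand for that hour:

% temperature, dewpoint, ... are assumed to be 1-by-N vectors of hourly readings,
% isweekday a 1-by-N vector of 1/0 flags, and demand a 1-by-N vector of hourly load.
P = [temperature; dewpoint; humidity; windspeed; winddir; visibility; pressure; isweekday];
T = demand;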

3.3. Data Normalization

To prevent the simulated neurons from being driven too far into saturation, all of the gathered data needs to be normalized after acquisition. Like a per-unit system, each input and target value is divided by the maximum absolute value of the corresponding factor. Each value of the normalized data therefore lies within the range between -1 and +1, so that the ANN can handle the data easily. Besides, weekdays are represented as 1 and weekends are represented as 0.
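A minimal sketch of this per-unit style normalization, assuming the P and T matrices from the previous sketch (and Matlab R2016b or later for implicit expansion):

P_norm = P ./ max(abs(P), [], 2);   % divide each factor (row) by its maximum absolute value
T_norm = T ./ max(abs(T));          % scale the load targets into [-1, +1] as well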

3.4. Neural network creation

The toolbox in Matlab is used for training and simulating the neural network. The layout of the neural network consists of the number of neurons and layers, the connectivity of the layers, the activation functions, the error goal and so on. Setting the framework and parameters of the network depends on the practical situation, and the architecture of the ANN can be selected to achieve an optimized result. Matlab is one of the best simulation tools and provides visible windows. The three-layer architecture shown in Figure 2 above has been chosen for the simulation; it is adequate to approximate an arbitrary function if the nodes of the hidden layer are sufficient.

Since the practical input values are from -1 to +1, the transfer function of the first layer is set to tansig, a hyperbolic tangent sigmoid transfer function. The transfer function of the output layer is set to a linear function, which calculates the layer's output directly from its net input. There is one advantage of the linear output transfer function: because linear output neurons let the output take on any value, there is no difficulty in finding the differences between output and target.
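Putting the pieces together, a hedged sketch of creating and training the three-layer network with the toolbox could look as follows (the hidden-layer size and error goal are illustrative assumptions, not values reported in the paper):

net = feedforwardnet(20, 'trainlm');      % one hidden layer, Levenberg-Marquardt training
net.layers{1}.transferFcn = 'tansig';     % hidden layer: hyperbolic tangent sigmoid
net.layers{2}.transferFcn = 'purelin';    % output layer: linear transfer function
net.trainParam.goal = 1e-3;               % illustrative error goal
net = train(net, P_norm, T_norm);         % train on the normalized data
Y = sim(net, P_norm);                     % simulate the trained network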

The next step is the selection of the neurons and the training function.

Generally, trainbr and trainlm are the best choices among all of the training functions in the Matlab toolbox.

Trainlm (the Levenberg-Marquardt algorithm) is the fastest training algorithm for networks of moderate size. However, a big problem is that it needs to store some matrices which can be large for certain problems. When the training set is large, the trainlm algorithm will reduce the memory used and always computes the approximate Hessian matrix with n x n dimensions. Another drawback of trainlm is that over-fitting will occur when the number of neurons is too large; basically, the number of neurons should not be too large when the trainlm algorithm is employed in the network. Trainbr (Ba
