Foreign Literature Translation: Indoor Mobile Robot Navigation by Detecting Fluorescent Tubes
Graduation Project Foreign Literature Translation

Title: Indoor Mobile Robot Navigation by Detecting Fluorescent Tubes

Major: Mechanical Design, Manufacturing and Automation

Class:

Student:

Student ID:

Supervisor:

April 8, 2012
Autonomous Indoor Mobile Robot Navigation by Detecting Fluorescent Tubes

Fabien LAUNAY, Akihisa OHYA, Shin'ichi YUTA

Intelligent Robot Laboratory, University of Tsukuba, 1-1-1 Tennoudai, Tsukuba, Ibaraki 305-8573, JAPAN. {launay,ohya,yuta}@roboken.esys.tsukuba.ac.jp
Abstract
This paper proposes an indoor navigation system for an autonomous mobile robot, including the teaching of its environment. The self-localization of the vehicle is done by detecting the position and orientation of fluorescent tubes located above its desired path, thanks to a camera pointing to the ceiling.

A map of the lights based on odometry data is built in advance by the robot, guided by an operator. Then a graphical user interface is used to define the trajectory the robot must follow with respect to the lights. While the robot is moving, the position and orientation of the lights it detects are compared to the map values, which enables the vehicle to cancel odometry errors.
1 Introduction
When a wheel-type mobile robot navigates on a two-dimensional plane, it can use sensors to estimate its relative localization by summing the elementary displacements provided by incremental encoders mounted on its wheels. The main drawback of this method, known as odometry, is that its estimation error tends to grow unboundedly [1]. For long-distance navigation, odometry and other dead-reckoning solutions may be supported by an absolute localization technique providing position information at a low frequency.
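The dead-reckoning update described above can be sketched as follows for a differential-drive robot. This is a minimal illustration, not the authors' implementation; the function name and parameters are ours:

```python
import math

def update_pose(x, y, theta, d_left, d_right, wheel_base):
    """Odometry update from the incremental displacements of the two wheels.

    d_left, d_right: elementary displacements measured by the wheel encoders.
    wheel_base: distance between the two wheels.
    Returns the new pose (x, y, theta).
    """
    d_center = (d_left + d_right) / 2.0        # displacement of the robot center
    d_theta = (d_right - d_left) / wheel_base  # elementary rotation
    # Integrate along the mean heading of the elementary motion.
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    theta += d_theta
    return x, y, theta
```

Because each call adds a small measurement error to the pose, the estimation error of the summed result is exactly what grows without bound over long distances.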
Absolute localization in indoor navigation using landmarks located on the ground or on the walls is sometimes difficult to implement, since different objects can obstruct them. Therefore, a navigation system based on ceiling landmark recognition can be thought of as an alternative to this issue.
The navigation system we developed consists of two steps. In the first step, the vehicle is provided with a map of the ceiling lights. Building such a map by hand quickly becomes a heavy task as its size grows. Instead, the robot is guided manually under each light and builds the map automatically. The second step consists in defining a navigation path for the vehicle and enabling its position and orientation correction whenever it detects a light recorded previously in the map.

Since the map built by the robot is based on odometry, whose estimation error grows unboundedly, the position and orientation of the lights in the map do not correspond to reality. However, if the trajectory to be followed by the vehicle during the navigation process is defined appropriately above this distorted map, it will be possible for the robot to move along any desired trajectory in the real world. A GUI has been developed in order to facilitate this map-based path definition process.
We equipped a mobile robot with a camera pointing to the ceiling. During the navigation process, when a light is detected, the robot calculates the position and orientation of this landmark in its own reference frame and, thanks to a map of the lights built in advance, it can estimate its absolute position and orientation with respect to its map.

We define the pose of an object as its position and orientation with respect to a given referential.
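The pose estimation step can be illustrated with a small 2D frame computation. Given the pose of a light in the map and the pose of the same light measured in the robot frame, the robot's absolute pose follows by composing the two transforms. This is a sketch under our own naming, not the paper's code:

```python
import math

def robot_pose_from_light(light_map, light_robot):
    """Estimate the robot pose in the map frame from a detected light.

    light_map:   (x, y, phi) pose of the light recorded in the map.
    light_robot: (x, y, phi) pose of the same light measured in the robot frame.
    Returns (x, y, theta), the absolute robot pose in the map frame.
    """
    xm, ym, pm = light_map
    xr, yr, pr = light_robot
    theta = pm - pr                        # robot heading in the map frame
    c, s = math.cos(theta), math.sin(theta)
    # Subtract the robot-to-light offset, rotated into the map frame.
    x = xm - (c * xr - s * yr)
    y = ym - (s * xr + c * yr)
    return x, y, theta
```

In other words, the map gives the light's absolute pose, the camera gives the light's pose relative to the robot, and inverting the relative transform yields the absolute robot pose used to cancel the accumulated odometry error.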
2 Related work
The idea of using lights as landmarks for indoor navigation is not new. Hashino [2] developed a fluorescent light sensor in order to detect the inclination angle between an unmanned vehicle and a fluorescent lamp attached to the ceiling. The objective was to carry out the main part of the process by a hardware logic circuit.

Instead of lights, openings in the ceiling for aeration have also been used as landmarks to track. Oota et al. [3] based this tracking on edge detection, whereas Fukuda [4] developed a more complex system using fuzzy template matching. Hashiba et al. [5] used the development images of the ceiling to propose a motion planning method. More recently, Amat et al. [6] presented a vision-based navigation system using several fluorescent light tubes located in captured images, whose absolute pose estimation accuracy is better than a GPS system.
One advantage of the system proposed here is its low memory and processing-speed requirements, which make its implementation possible on a robot with limited image-processing hardware. Moreover, our navigation system includes a landmarks map construction process entirely based on the robot's odometry data. The development of a GUI enables the connection between the lights map produced during the teaching process and the autonomous robot navigation, which results in a complete navigation system. This is the main difference with the previous works, which either assume knowledge of the ceiling landmarks' exact pose thanks to CAD data of building maps, or require the absolute vehicle pose to be entered manually and periodically during the landmarks map construction so as to cancel odometry errors.
Figure 1: Target environment consisting of lights of different shapes in corridors exposed to luminosity variations due to sunlight.
3 Lights' map building
In order to cancel odometry errors whenever a light is detected, the robot needs to know in advance the pose, in a given referential, of the lights under which it is supposed to navigate.
Since we are aiming at long-distance autonomous indoor navigation, the size of the landmarks map is unbounded. Building such a map manually becomes a heavy task for the operator, and we believe that an autonomous mobile robot can cope with this issue.
During the learning process, the vehicle, equipped with a camera pointing to the ceiling, is guided manually under each light and adds landmark information to the map whenever a new light appears above its path. This human-assisted map building is the first step of our research concerning landmarks map building; we want to change it to a fully autonomous map building system. As the image processing involved during the learning process is identical to the one used during the navigation, we will present the feature extraction method in sections 5 and 6.

Once the teaching phase is completed, the robot holds a map of the lights that can be used later for the autonomous navigation process.
4 Dealing with a robot-made map

4.1 Influence of odometry errors on the map
Asking the robot to build a map implies dealing with odometry errors that will occur during the learning process itself. As the robot is guided under new lights, because of the accumulation of odometry errors, the pose of the landmarks recorded in the map will become more and more different from the values corresponding to the real world.
Several maps of the environment represented in Fig. 1 are given in Fig. 2. The odometry data recorded by the robot during the learning process has also been represented for one of the maps.
4.2 Usage of the map
Only one map is needed by the robot to correct its pose during the navigation process. Whenever the robot detects a light learnt previously, it corrects its absolute pose by using the landmark's information recorded in the map. Since the map contents don't correspond to the values of the real world, the trajectory of the robot has to be specified according to the pose of the lights in the map, and not according to the trajectory we want the robot to follow in its real environment.
For example, if the mobile robot's task is to navigate right below a straight corridor's lights, the robot won't be requested to follow a straight line along the middle of the corridor. Instead of this simple motion command, the robot will have to trace every segment which connects the projections on the ground of the centers of two successive lights. This is illustrated in Fig. 3, where a zoom of the trajectory specified to the robot appears as a dotted line.
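The segment-chaining idea above can be sketched in a few lines: given the ground projections of the light centers taken from the (distorted) map, the path is simply the list of segments joining successive centers. The function name and data layout are illustrative assumptions, not the paper's implementation:

```python
def path_from_lights(light_centers):
    """Build the navigation path as the chain of segments joining the
    ground projections of successive light centers read from the map.

    light_centers: ordered list of (x, y) map coordinates.
    Returns a list of ((x1, y1), (x2, y2)) segments.
    """
    return [(light_centers[i], light_centers[i + 1])
            for i in range(len(light_centers) - 1)]
```

Defining the path on the map's own coordinates, rather than on the real corridor geometry, is what makes the distorted map usable: the robot corrects its pose against the same (distorted) values it navigates by.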
A GUI has been developed in Tcl/Tk in order to easily specify different types of trajectories with respect to the map learnt by the robot. This GUI can also be used on-line in order to follow the evolution of the robot in real time on the landmarks map during the learning and navigation processes.
Figure 2: Several maps of the environment represented in Fig. 1, built by the same robot. Rectangles and circles represent lights of different shapes.
5 Fluorescent tube detection

5.1 Fluorescent tube model
It is natural to think of a fluorescent tube as a natural landmark for a vision-based process aimed at improving the localization of a mobile robot in an indoor environment. Indeed, problems such as dirt, shadows, light reflection on the ground, or obstruction of the landmarks usually do not appear in this case.
One advantage of fluorescent tubes compared to other possible landmarks located on the ceiling is that, once they are switched on, their recognition in an image can be performed with a very simple image-processing algorithm, since they are the only bright elements that are permanently found in such a place.
If a 256-grey-level image containing a fluorescent tube is binarized with an appropriate threshold 0 ≤ T ≤ 255, the only element that remains after this operation is a rectangular shape. Fig. 4(a) shows a typical camera image of the ceiling of a corridor containing a fluorescent light; the axis of the camera is perpendicular to the ceiling. Shown in (b) is the binarized image of (a). If we suppose that the distance between the camera and the ceiling remains constant, and that no more than one light at a time can be seen by the camera located on top of the robot, a fluorescent tube can be modeled by a given area S0 in a thresholded image of the ceiling.
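The threshold-and-area model can be sketched as follows. This is a minimal illustration of the idea (the tolerance parameter and function name are our assumptions, not values from the paper):

```python
def detect_tube(image, threshold, s0, tol=0.2):
    """Binarize a grey-level image and test whether the bright area
    matches the expected tube area S0 within a relative tolerance.

    image: 2D list of grey values in [0, 255].
    Returns (is_tube, bright_pixels).
    """
    # Keep only the pixels brighter than the threshold T.
    bright = [(r, c)
              for r, row in enumerate(image)
              for c, v in enumerate(row)
              if v > threshold]
    area = len(bright)
    # A tube is detected when the bright area is close to S0.
    return abs(area - s0) <= tol * s0, bright
```

Because the camera-to-ceiling distance is assumed constant, a single reference area S0 per tube shape is enough; the pixel coordinates can then feed the position and orientation computation of section 5.2.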
Figure 4: (a) Sample image of a fluorescent light; (b) binarized image.
5.2 Fluorescent light detection process
Using odometry, the robot is able to know when it gets close to a light recorded in its map by comparing in a closed loop its actual estimated position to