Engineering 5 (2024) 164–172
Robotics—Article
Enhanced Autonomous Exploration and Mapping of an Unknown Environment with the Fusion of Dual RGB-D Sensors
Ningbo Yu a,b,c,*, Shirong Wang a,b

a Institute of Robotics and Automatic Information Systems, Nankai University, Tianjin 300353, China
b Tianjin Key Laboratory of Intelligent Robotics, Nankai University, Tianjin 300353, China
c State Key Laboratory of Robotics, Shenyang Institute of Automation, Chinese Academy of Sciences, Shenyang 110016, China

Abstract
The autonomous exploration and mapping of an unknown environment is useful in a wide range of applications and thus holds great significance. Existing methods mostly use range sensors to generate two-dimensional (2D) grid maps. Red/green/blue-depth (RGB-D) sensors provide both color and depth information on the environment, thereby enabling the generation of a three-dimensional (3D) point cloud map that is intuitive for human perception. In this paper, we present a systematic approach with dual RGB-D sensors to achieve the autonomous exploration and mapping of an unknown indoor environment. With the synchronized and processed RGB-D data, location points were generated and a 3D point cloud map and 2D grid map were incrementally built. Next, the exploration was modeled as a partially observable Markov decision process. Partial map simulation and global frontier search methods were combined for autonomous exploration, and dynamic action constraints were utilized in motion control. In this way, the local optimum can be avoided and the exploration efficacy can be ensured. Experiments with single connected and multi-branched regions demonstrated the high robustness, efficiency, and superiority of the developed system and methods.
© 2024 THE AUTHORS. Published by Elsevier LTD on behalf of Chinese Academy of Engineering and Higher Education Press Limited Company. This is an open access article under the CC BY-NC-ND license.
Article history:
Received 4 February 2024
Revised 25 June 2024
Accepted 8 November 2024
Available online 27 December 2024

Keywords:
Autonomous exploration
Red/green/blue-depth
Sensor fusion
Point cloud
Partial map simulation
Global frontier search

* Corresponding author. E-mail address: nyu@nankai.edu.cn (N. Yu).

1. Introduction

Autonomous robots can acquire information on the environment and assist humans in various circumstances and applications, such as rescuing humans from danger, providing service within urban or home environments, and guiding or otherwise aiding people in need [1–6]. Despite the dramatic development of related technologies and algorithms, major challenges remain. The complexity and variability of the unknown environment make it difficult for human operators to provide prior information to a robot [7]. Thus, it is of great significance to equip a robot with the capabilities of autonomous exploration and mapping, including incremental map construction, localization, path planning, motion control, navigation, and so on, without direct human intervention.

Heuristic and reactive exploration approaches have been studied in the literature. In Ref. [8], Yamauchi proposes a frontier-based heuristic exploration algorithm. In Ref. [9], based on the angular uncertainty of the sonar sensor, a heuristic exploration strategy is proposed that uses a sonar sensor array for perception and mapping. In Ref. [10], a local map-based frontier graph search exploration method is developed, and the environment is represented by a tree structure, making it efficient to determine the next target position even within a large environment. In Ref. [11], Keidar and Kaminka describe a morphological frontier detection method using light detection and ranging (LIDAR) data to accelerate the map frontier point detection process. In Ref. [12], a reactive exploration strategy is proposed that uses the current velocity and bearing for rapid frontier selection.

The simulation-based autonomous exploration strategy has been attracting increasing research effort, since it can assist in measuring information and generating the goal pose based on the robot's current status during exploration. In Ref. [13], Carrillo et al. propose a utility function with Shannon and Rényi entropy in order to measure the robot's actions for exploration. In Ref. [14], Lauri and Ritala present a forward simulation algorithm, and formulate the exploration as a partially observable Markov decision process (POMDP). In Ref. [1], with partially known environment information, a red/green/blue-depth (RGB-D) sensor is used for loop closure detection in an autonomous mapping and exploration process. In Ref. [15], Bai et al. propose a Gaussian process regression-based exploration strategy, and predict mutual information with robot motion samples. In Ref. [16], the autonomous exploration problem is formulated as a partial differential equation; the information is described as a scalar field from the prior known area to the unknown area.

In existing exploration methods, range sensors are mostly used and two-dimensional (2D) grid maps are generated. However, a 2D grid map only contains planar geometric information, and is often insufficient for human perception [17–19]. RGB-D sensors directly provide both color and depth information, and can help to generate a three-dimensional (3D) point cloud map. RGB-D sensor-based simultaneous localization and mapping (SLAM) systems were first proposed independently by Henry et al. [20] and Engelhard et al. [21]. Image features were used to find matches between frames, and an iterative closest point (ICP) algorithm was applied to estimate the point cloud transformations in these RGB-D SLAM systems. In Ref. [22], Klein and Murray present the parallel tracking and mapping (PTAM) framework and implement tracking and mapping in dual parallel threads. Labbé and Michaud [23] propose the real-time appearance-based mapping (RTABMAP) algorithm, in which a loop closure detection thread and a memory management method are added. Online incremental mapping and loop closure detection can be attained, and the mapping efficiency and accuracy can remain consistent over time.

The field of view (FoV) of existing RGB-D sensors is limited [24]. To obtain a greater scope of environment information, multiple RGB-D sensors can be used together. Dual RGB-D sensors have been utilized in visual SLAM for robust pose tracking and mapping [25], and an RGB-D sensor has been fused with an inertial measurement unit (IMU) for indoor wide-range environment mapping [26]. In Ref. [27], Munaro et al. present an RGB-D sensor network, in which multiple RGB-D sensors are deployed in an indoor environment for human tracking. In Ref. [28], multiple RGB-D sensors with non-overlapping FoVs are employed for line detection and tracking, thus providing a 3D surrounding view and morphological information on the environment.

In our previous work, we developed RGB-D-based localization and motion planning and control methods [24], and realized RGB-D-based exploration [17]. Nevertheless, the robustness and efficiency of the exploration process were limited due to the small FoV of the RGB-D sensor. In this work, we used dual RGB-D sensors for the mobile robot. The deployment of dual RGB-D sensors provided a larger FoV, but also introduced technical challenges in interference, data synchronization, and processing. With the synchronized data from the dual RGB-D sensors, location points were generated and a 3D point cloud map and 2D grid map were incrementally constructed. An autonomous exploration strategy was established by combining partial map simulation with global frontier search methods. In this way, the local optimum can be avoided and the exploration efficacy can be ensured. The experimental results demonstrated the high robustness and efficiency of the developed system and methods.

This paper is organized as follows: Section 2 presents the 3D and 2D mapping method with dual RGB-D sensors, and Section 3 describes the autonomous exploration algorithm in detail. Section 4 describes the mobile robot system that validates the proposed mapping and exploration method. Section 5 presents the experiments and results. Finally, Section 6 concludes the paper.

2. Mapping with dual RGB-D sensors

With the information provided by dual RGB-D sensors, a 3D point cloud map can be constructed in real time. This process includes location point creation, mapping, and loop closure detection.

2.1. Location point generation

Fig. 1 presents the process of creating location points. In order to construct a 3D point cloud map, the sensor data must be processed; this includes the RGB images and depth images from the dual RGB-D sensors, and the wheel odometry data from the mobile chassis. The data should then be synchronized. The pose between the dual RGB-D sensors is known, as it can be determined by calibration after system assembly. The current frame of the information queue, which includes the RGB images and depth images from the dual RGB-D sensors and the wheel odometry information, does not include past information. Fig. 2 depicts the synchronization implemented with the robot operating system (ROS) message filter method.

Fig. 1. The process of location point creation.
Fig. 2. Sensor data synchronization.
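The paper describes this synchronization only at the conceptual level. Below is a minimal sketch of how the dual RGB-D streams and the wheel odometry could be grouped with the ROS message_filters approximate-time policy; the topic names, queue size, and slop value are illustrative assumptions rather than the authors' actual configuration.

```python
# Minimal sketch of dual RGB-D + odometry synchronization with ROS message filters.
# Topic names and tuning parameters are illustrative assumptions.
import rospy
import message_filters
from sensor_msgs.msg import Image
from nav_msgs.msg import Odometry

def synced_callback(rgb1, depth1, rgb2, depth2, odom):
    # All five messages arrive with approximately equal timestamps and can be
    # bundled into one "current frame" structure for location point creation.
    rospy.loginfo("Synced frame at t = %.3f", rgb1.header.stamp.to_sec())

rospy.init_node("dual_rgbd_sync")

subs = [
    message_filters.Subscriber("/camera1/rgb/image_raw", Image),
    message_filters.Subscriber("/camera1/depth/image_raw", Image),
    message_filters.Subscriber("/camera2/rgb/image_raw", Image),
    message_filters.Subscriber("/camera2/depth/image_raw", Image),
    message_filters.Subscriber("/odom", Odometry),
]

# Approximate-time policy: messages are grouped if their stamps differ by less
# than the slop (in seconds).
sync = message_filters.ApproximateTimeSynchronizer(subs, queue_size=10, slop=0.05)
sync.registerCallback(synced_callback)

rospy.spin()
```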
The synchronized sensor data are processed to create location points. The dual RGB-D color and depth data are combined into an integrated data structure that comprises the current sensor data and can be used for the location point creation process. The oriented FAST and rotated BRIEF (ORB) feature points and the corresponding descriptors are obtained from the RGB images. The 3D feature points in the robot frame are obtained by integrating the 2D feature points with the corresponding depth data from the depth images, as the pose between the dual RGB-D cameras is known. The fundamental matrix based on the epipolar constraint is calculated in order to determine the inliers, using matched 3D feature points from the last frame and the current frame. With enough inliers, data from the wheel odometry are extracted to provide an initial guess of the transformation between the last frame and the current frame.

The transformation between the frames is estimated by the random sample consensus (RANSAC) algorithm based on the perspective-n-point (PnP) model, using the matched 3D feature points. The bag of words (BoW) of a frame is created from the corresponding ORB descriptors, and the environment vocabulary is incrementally constructed with the BoW online, rather than pre-trained in the target environment, as doing so is more suitable for autonomous exploration tasks. The location point, Lt, is created with the above information and the current timestamp, t, and the edge between the last and current location points is initialized, with its weight set to zero.
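The paper does not give the implementation of this front end; the following is a minimal OpenCV-based sketch of the ORB extraction, epipolar inlier filtering, and RANSAC-PnP steps for a single camera, assuming pinhole intrinsics K, a depth image in meters, and illustrative function names. The fusion of the second camera's data and the wheel-odometry initial guess are omitted for brevity.

```python
# Sketch of ORB extraction, epipolar inlier filtering, and RANSAC-PnP pose
# estimation between two frames. Intrinsics and inputs are assumed/illustrative.
import cv2
import numpy as np

def extract_features(gray):
    orb = cv2.ORB_create(nfeatures=1000)
    return orb.detectAndCompute(gray, None)  # keypoints, descriptors

def backproject(kp, depth, fx, fy, cx, cy):
    # Lift a 2D keypoint to a 3D point in the camera frame using its depth value.
    u, v = int(round(kp.pt[0])), int(round(kp.pt[1]))
    z = float(depth[v, u])
    if z <= 0.0:
        return None
    return np.array([(u - cx) * z / fx, (v - cy) * z / fy, z])

def estimate_transform(gray_prev, gray_cur, depth_prev, K):
    fx, fy, cx, cy = K[0, 0], K[1, 1], K[0, 2], K[1, 2]
    kp1, des1 = extract_features(gray_prev)
    kp2, des2 = extract_features(gray_cur)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des1, des2)
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

    # Epipolar constraint: keep only matches consistent with a fundamental matrix.
    F, mask = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC, 1.0, 0.99)
    inliers = [m for m, ok in zip(matches, mask.ravel()) if ok]

    # Build 3D-2D correspondences: 3D points from the previous frame's depth,
    # 2D observations in the current frame.
    obj, img = [], []
    for m in inliers:
        p3d = backproject(kp1[m.queryIdx], depth_prev, fx, fy, cx, cy)
        if p3d is not None:
            obj.append(p3d)
            img.append(kp2[m.trainIdx].pt)

    # In the paper, the wheel odometry provides an initial guess of this
    # transformation; solvePnPRansac can accept one via rvec/tvec together
    # with useExtrinsicGuess=True.
    ok, rvec, tvec, _ = cv2.solvePnPRansac(
        np.array(obj, dtype=np.float64), np.array(img, dtype=np.float64), K, None)
    R, _ = cv2.Rodrigues(rvec)
    return R, tvec  # maps previous-frame 3D points into the current camera frame
```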
Depth errors and feature mismatch errors may exist in an unknown environment, especially in a place with few features and varying illumination. A sudden change in the RGB-D localization may occur, which can detrimentally impact the real-time motion control of the mobile robot. Furthermore, although the wheel odometry changes smoothly, it contains accumulated error. However, the RGB-D localization and the wheel odometry can be fused by means of an extended Kalman filter in order to achieve robust localization.

2.2. Point cloud mapping and loop closure detection

The mapping and loop closure detection processes are tightly coupled, using graph optimization and memory management methods to ensure stability and real-time performance. The similarity and pose of the location points are optimized in the tree-based network optimizer (TORO) framework [29] to ensure global consistency. The location points are stored in the working memory (WM) or long-term memory (LTM), as determined by the similarity, as shown in Fig. 3. The location points in the WM are used for real-time loop closure detection, while the other location points are stored in the LTM as candidates. The candidates will either be retrieved to the WM or deleted, according to their similarity and storage time in the LTM.

First, Lt is set as a new vertex of the graph, and the similarity between Lt and the nearest N location points is calculated from their BoWs, as shown in Eq. (1):

$$
k=\begin{cases}\dfrac{K_{\mathrm{matched}}}{K_{t}}, & K_{t}\ge K_{c}\\[6pt]\dfrac{K_{\mathrm{matched}}}{K_{c}}, & K_{t}<K_{c}\end{cases}\qquad(1)
$$

where K_matched represents the matched BoW number between the BoW of Lt and the BoW of one of the nearest N location points, Lc; Kt and Kc are the BoW numbers of Lt and Lc, respectively. The similarity of Lt and Lc is denoted by k. When k is greater than the threshold, ε, Lc is merged into Lt and Lt is loaded into the WM.
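As a concrete reading of Eq. (1) and the WM loading rule, the short sketch below computes the similarity of two location points from their bags of words and applies the threshold test. The set-based BoW representation, the threshold value, and the merge method are illustrative assumptions, not the authors' implementation.

```python
# Sketch of the BoW-based similarity test of Eq. (1). The threshold value and
# data structures are illustrative; the paper stores location points in the
# working memory (WM) or long-term memory (LTM) based on this test.
def similarity(bow_t, bow_c):
    """bow_t, bow_c: sets of visual-word IDs of location points Lt and Lc."""
    k_matched = len(bow_t & bow_c)       # matched BoW number
    k_t, k_c = len(bow_t), len(bow_c)    # BoW numbers of Lt and Lc
    if k_t == 0 or k_c == 0:
        return 0.0
    # Equivalent to the piecewise form of Eq. (1): divide by the larger BoW number.
    return k_matched / max(k_t, k_c)

def load_into_wm(lt, neighbors, working_memory, epsilon=0.2):
    """Merge Lt with a sufficiently similar neighbor Lc and load Lt into the WM."""
    for lc in neighbors:                 # the nearest N location points
        if similarity(lt.bow, lc.bow) > epsilon:
            lt.merge(lc)                 # hypothetical merge of Lc into Lt
            working_memory.append(lt)    # Lt is loaded into the WM
            return True
    return False                         # below-threshold handling is not sketched here
```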