Abstract
Background: Loop closure detection is a crucial component of robot navigation and simultaneous localization and mapping (SLAM). Appearance-based loop closure detection still faces many challenges, such as illumination changes, perceptual aliasing, and growing computational complexity.
Methods: In this paper, we propose a visual loop closure detection algorithm that combines the illumination-robust descriptor DIRD with odometry information. In this algorithm, a new distance function is built by fusing the Euclidean distance function and the Mahalanobis distance function; it incorporates the pose uncertainty of the body and dynamically adjusts the threshold for potential loop closure locations. Potential locations are then verified by computing the similarity of their DIRD descriptors.
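The abstract does not give the exact fusion rule, so the following Python sketch shows only one plausible reading: the candidate test fuses a Euclidean and a Mahalanobis distance over odometry poses, and the acceptance threshold grows with the pose covariance. The function names, the mixing weight alpha, and the threshold-inflation rule are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def fused_distance(p_i, p_j, cov_j, alpha=0.5):
    """Hypothetical fused distance between two odometry poses.

    p_i, p_j : 3-vectors of body positions from odometry.
    cov_j    : 3x3 pose covariance of the candidate location
               (models accumulated odometry uncertainty).
    alpha    : assumed mixing weight between the two metrics.
    """
    diff = p_i - p_j
    d_euc = np.linalg.norm(diff)                          # Euclidean distance
    d_mah = np.sqrt(diff @ np.linalg.inv(cov_j) @ diff)   # Mahalanobis distance
    return alpha * d_euc + (1.0 - alpha) * d_mah

def is_loop_candidate(p_i, p_j, cov_j, base_threshold=2.0):
    """Candidate test with a covariance-dependent threshold.

    The threshold is inflated with the largest uncertainty direction,
    so that places revisited after long drift are not missed (a sketch
    of the 'dynamic threshold' idea; the paper's rule may differ).
    """
    scale = 1.0 + np.sqrt(np.max(np.linalg.eigvalsh(cov_j)))
    return fused_distance(p_i, p_j, cov_j) < base_threshold * scale

if __name__ == "__main__":
    p_now = np.array([10.0, 2.0, 0.0])
    p_old = np.array([11.5, 2.5, 0.0])
    cov = np.diag([0.8, 0.8, 0.1])   # grown odometry covariance
    print(is_loop_candidate(p_now, p_old, cov))
```

The intuition behind widening the threshold is that as odometry drift (and hence covariance) grows, a genuinely revisited place may appear metrically far from its earlier pose, so the acceptance region must expand accordingly before DIRD similarity is checked.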
Results: The proposed algorithm is evaluated on the KITTI and EuRoC datasets and compared with SeqSLAM, one of the state-of-the-art loop closure detection algorithms. The results show that the proposed algorithm effectively reduces computing time and achieves better precision-recall (P-R) performance.
Conclusion: The new loop closure detection method makes full use of both odometry information and image appearance information. Applying the new distance function effectively reduces the missed detections caused by accumulated odometry error. The algorithm requires neither image feature extraction nor a learning stage, achieves real-time detection, and can run on platforms with limited computational power.
Keywords: Simultaneous localization and mapping (SLAM), illumination robust, visual inertial odometry (VIO), loop closure candidate area, pose constraint.