Summary: | Global localization is a fundamental capability for mobile robots. Given the limitations of any single sensor type, fusing measurements from multiple sensors with complementary properties is a valuable research problem. In this paper, we propose a decoupled optimization-based framework for global–local sensor fusion, which fuses intermittent 3D global positions with high-frequency 6D odometry poses to infer 6D global localization results in real time. The fusion problem is formulated as estimating the relative transformation between the global and local reference frames, the translational extrinsic calibration, and the scale of the local pose estimator. We prove the full observability of the system under general motion, and further analyze the degenerate motion patterns under which certain system states become unobservable. We design a degeneration-aware sensor fusion method that detects the degenerate directions before optimization and adds constraints specifically along those directions to mitigate the effect of noise. The proposed degeneration-aware global–local sensor fusion method is validated on both simulated and real-world datasets with different sensor configurations, and shows improved accuracy and robustness compared with other decoupled sensor fusion methods for global localization.
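The idea of detecting degenerate directions before optimization and constraining the update along them can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a Gauss-Newton formulation and uses an eigen-decomposition of the approximate Hessian (J^T J) to find weakly observed directions, then adds a soft prior penalty only in that subspace. The function names, the eigenvalue threshold, and the constraint weight are all illustrative choices.

```python
import numpy as np

def detect_degenerate_directions(J, threshold=1e-3):
    """Return an orthonormal basis of the weakly observed subspace.

    J is the stacked residual Jacobian; eigenvalues of H = J^T J below
    `threshold` indicate directions the measurements barely constrain.
    """
    H = J.T @ J
    eigvals, eigvecs = np.linalg.eigh(H)          # ascending eigenvalues
    return eigvecs[:, eigvals < threshold]         # columns span the weak subspace

def constrained_update(J, r, x_prior, x, weight=1e2, threshold=1e-3):
    """One Gauss-Newton step with soft constraints along degenerate directions.

    In the well-observed subspace this is a plain Gauss-Newton update; in the
    degenerate subspace the added penalty pulls the state toward the prior
    instead of letting noise drive it.
    """
    V = detect_degenerate_directions(J, threshold)
    P = V @ V.T                                    # projector onto weak subspace
    H = J.T @ J + weight * P
    b = -J.T @ r + weight * P @ (x_prior - x)
    return np.linalg.solve(H, b)
```

For example, if one state dimension is unobservable (a zero column in J), the solved step moves the observable component toward the measurement while leaving the degenerate component pinned at the prior, rather than producing an ill-conditioned or noise-driven update.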