Integrated Autonomous Relative Navigation Method Based on Vision and IMU Data Fusion

This paper proposes an integrated autonomous relative navigation method based on vision and IMU data fusion, which effectively improves position accuracy and adapts well to environmental changes. First, an IMU pre-integration formula based on the Runge-Kutta method is derived, which improves pre-integration position accuracy and effectively reduces accumulated error. Second, an inverse depth estimation method based on a mixed probability model is proposed for the system initialization process, which improves the accuracy of camera depth estimation and provides better initial conditions for back-end optimization. Third, a sliding window filtering method based on the probability graph is proposed, which avoids repeated calculations and improves the efficiency of sliding window filtering. Fourth, combining the advantages of the direct method and the feature-point method, a mixed re-projection optimization method is proposed, which broadens the applicability of the approach and effectively improves optimization accuracy. Finally, a closed-loop optimization method based on similarity transformation is proposed to eliminate accumulated error. To verify the environmental adaptability of the method and the impact of closed-loop detection on the relative navigation system, indoor and outdoor experiments were carried out with a hand-held camera and an IMU. The EuRoC dataset was used in the experiments, and the proposed method was compared with several classical methods. The experimental results show that the method achieves high accuracy and robustness.
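
For readers unfamiliar with the first contribution, the sketch below illustrates what Runge-Kutta propagation of IMU kinematics looks like in practice: fourth-order integration of position, velocity, and attitude from raw accelerometer and gyroscope readings between two camera frames. It is a minimal illustrative sketch, not the paper's derivation; the Hamilton quaternion convention, the gravity vector, and the function names (quat_mul, rk4_step, etc.) are assumptions, and bias terms, noise covariance propagation, and the relative pre-integration parameterization derived in the paper are omitted.

```python
# Minimal sketch (assumptions, not the paper's formulation) of fourth-order
# Runge-Kutta propagation of IMU kinematics between two camera frames:
#   p_dot = v,   v_dot = R(q) * a_m + g,   q_dot = 0.5 * q (x) [0, w_m]
import numpy as np

GRAVITY = np.array([0.0, 0.0, -9.81])  # assumed world-frame gravity along -z

def quat_mul(q1, q2):
    """Hamilton product of two quaternions given as [w, x, y, z]."""
    w1, x1, y1, z1 = q1
    w2, x2, y2, z2 = q2
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def quat_to_rot(q):
    """Rotation matrix from a unit quaternion [w, x, y, z]."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

def deriv(state, acc_m, gyr_m):
    """Time derivative of the state (p, v, q) under measured acceleration/angular rate."""
    p, v, q = state
    dp = v
    dv = quat_to_rot(q) @ acc_m + GRAVITY
    dq = 0.5 * quat_mul(q, np.concatenate(([0.0], gyr_m)))
    return dp, dv, dq

def rk4_step(state, acc_m, gyr_m, dt):
    """One RK4 step; the IMU measurements are held constant over the step."""
    def add(s, k, h):
        return (s[0] + h*k[0], s[1] + h*k[1], s[2] + h*k[2])
    k1 = deriv(state, acc_m, gyr_m)
    k2 = deriv(add(state, k1, dt / 2), acc_m, gyr_m)
    k3 = deriv(add(state, k2, dt / 2), acc_m, gyr_m)
    k4 = deriv(add(state, k3, dt), acc_m, gyr_m)
    p, v, q = state
    p = p + dt / 6 * (k1[0] + 2*k2[0] + 2*k3[0] + k4[0])
    v = v + dt / 6 * (k1[1] + 2*k2[1] + 2*k3[1] + k4[1])
    q = q + dt / 6 * (k1[2] + 2*k2[2] + 2*k3[2] + k4[2])
    return p, v, q / np.linalg.norm(q)  # renormalize the quaternion
```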

Bibliographic Details
Main Authors: Wenlei Liu, Sentang Wu, Yongming Wen, Xiaolong Wu
Format: Article
Language: English
Published: IEEE, 2020-01-01
Series: IEEE Access
Subjects: Data fusion; relative navigation; pre-integration; probability graph; sliding window filtering; mixed re-projection
Online Access: https://ieeexplore.ieee.org/document/9022861/
DOI: 10.1109/ACCESS.2020.2978154
ISSN: 2169-3536
Published in: IEEE Access, vol. 8, pp. 51114-51128, 2020
Author Affiliations:
Wenlei Liu (ORCID: 0000-0001-6425-9466), School of Automation Science and Electrical Engineering, Beihang University, Beijing, China
Sentang Wu, School of Automation Science and Electrical Engineering, Beihang University, Beijing, China
Yongming Wen (ORCID: 0000-0003-2595-0734), Science and Technology on Information Systems Engineering Laboratory, Beijing Institute of Control and Electronics Technology, Beijing, China
Xiaolong Wu, Navigation and Control Technology Institute, NORINCO Group, Beijing, China