Integrated Autonomous Relative Navigation Method Based on Vision and IMU Data Fusion
This paper proposes an integrated autonomous relative navigation method based on vision and IMU data fusion, which effectively improves positioning accuracy and adapts well to environmental changes. First, an IMU pre-integration formula based on the Runge-Kutta method is derived,...
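The abstract mentions IMU pre-integration derived with the Runge-Kutta method. As a rough illustration of the kind of integration involved (not the paper's actual pre-integration formulation), the sketch below propagates an IMU state (position, velocity, orientation quaternion) with a standard fourth-order Runge-Kutta step; all names such as `imu_rk4_step` and the gravity constant are assumptions for this example.

```python
# Minimal sketch: RK4 propagation of an IMU state from one gyro/accel sample.
# This is illustrative only and does not reproduce the paper's pre-integration
# derivation; function names and conventions here are hypothetical.
import numpy as np

GRAVITY = np.array([0.0, 0.0, -9.81])  # assumed world-frame gravity vector

def quat_mul(a, b):
    """Hamilton product of quaternions given as [w, x, y, z]."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def quat_to_rot(q):
    """Rotation matrix (body -> world) from a unit quaternion [w, x, y, z]."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

def state_dot(p, v, q, accel_body, gyro_body):
    """Continuous-time kinematics: dp = v, dv = R(q)a + g, dq = 0.5 q * [0, w]."""
    dp = v
    dv = quat_to_rot(q) @ accel_body + GRAVITY
    dq = 0.5 * quat_mul(q, np.concatenate(([0.0], gyro_body)))
    return dp, dv, dq

def imu_rk4_step(p, v, q, accel_body, gyro_body, dt):
    """One RK4 step, holding the IMU measurement constant over the interval dt."""
    k1 = state_dot(p, v, q, accel_body, gyro_body)
    k2 = state_dot(p + 0.5*dt*k1[0], v + 0.5*dt*k1[1], q + 0.5*dt*k1[2], accel_body, gyro_body)
    k3 = state_dot(p + 0.5*dt*k2[0], v + 0.5*dt*k2[1], q + 0.5*dt*k2[2], accel_body, gyro_body)
    k4 = state_dot(p + dt*k3[0], v + dt*k3[1], q + dt*k3[2], accel_body, gyro_body)
    p_new = p + dt/6.0 * (k1[0] + 2*k2[0] + 2*k3[0] + k4[0])
    v_new = v + dt/6.0 * (k1[1] + 2*k2[1] + 2*k3[1] + k4[1])
    q_new = q + dt/6.0 * (k1[2] + 2*k2[2] + 2*k3[2] + k4[2])
    return p_new, v_new, q_new / np.linalg.norm(q_new)  # re-normalize quaternion
```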
| Main Authors: | Wenlei Liu, Sentang Wu, Yongming Wen, Xiaolong Wu |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2020-01-01 |
| Series: | IEEE Access |
| Subjects: | |
| Online Access: | https://ieeexplore.ieee.org/document/9022861/ |
Similar Items
- Multi-rate Sensor Fusion for GPS Navigation Using Kalman Filtering
  by: Mayhew, David McNeil
  Published: (2014)
- A Pedestrian Detection Algorithm Based on Score Fusion for Multi-LiDAR Systems
  by: Tao Wu, et al.
  Published: (2021-02-01)
- Mixed Probability Inverse Depth Estimation Based on Probabilistic Graph Model
  by: Wenlei Liu, et al.
  Published: (2019-01-01)
- An Enhanced Spatial and Temporal Data Fusion Model for Fusing Landsat and MODIS Surface Reflectance to Generate High Temporal Landsat-Like Data
  by: Chengquan Huang, et al.
  Published: (2013-10-01)
- Navigation of Mobile Robots in Natural Environments: Using Sensor Fusion in Forestry
  by: Jürgen Rossmann, et al.
  Published: (2010-06-01)