Accuracy Evaluation of Stereo Vision Aided Inertial Navigation for Indoor Environments
Main Authors:
Format: Article
Language: English
Published: Copernicus Publications, 2013-11-01
Series: The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Online Access: http://www.int-arch-photogramm-remote-sens-spatial-inf-sci.net/XL-4-W4/13/2013/isprsarchives-XL-4-W4-13-2013.pdf
Summary: Accurate knowledge of position and orientation is a prerequisite for many applications in unmanned navigation, mapping, or environmental modelling. GPS-aided inertial navigation is the preferred solution for outdoor applications. Nevertheless, a comparable solution is needed for navigation tasks in difficult environments with erroneous or no GPS data. Therefore, a stereo vision aided inertial navigation system is presented which is capable of providing real-time local navigation for indoor applications.

A method is described to reconstruct the ego motion of a stereo camera system aided by inertial data. This, in turn, is used to constrain the inertial sensor drift. The optical information is derived from natural landmarks that are extracted and tracked over consecutive stereo image pairs. Using inertial data for feature tracking effectively reduces computational costs and at the same time increases reliability, because the search areas are constrained. Mismatched features, e.g. at repetitive structures typical for indoor environments, are avoided.

An Integrated Positioning System (IPS) was deployed and tested on an indoor navigation task. IPS was evaluated for accuracy, robustness, and repeatability in a common office environment. In combination with a dense disparity map derived from the navigation cameras, a high-density point cloud is generated to demonstrate the capability of the navigation algorithm.
ISSN: 1682-1750, 2194-9034
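
As an illustration of the inertial-aided feature tracking described in the summary, the sketch below predicts a landmark's image position from an IMU-propagated pose and restricts template matching to a small window around that prediction. It is not the authors' implementation: the function names (`project`, `match_in_window`), the camera intrinsics, and the window radius are illustrative assumptions.

```python
import numpy as np

def project(K, R, t, X):
    """Project 3D points X (N, 3) into the image with intrinsics K and pose (R, t)."""
    Xc = (R @ X.T + t.reshape(3, 1)).T            # world frame -> camera frame
    uv = (K @ Xc.T).T
    return uv[:, :2] / uv[:, 2:3]                 # perspective division

def match_in_window(img, template, center, radius):
    """SSD template matching restricted to a window around the predicted position."""
    th, tw = template.shape
    u0, v0 = int(round(center[0])), int(round(center[1]))
    best, best_uv = np.inf, (u0, v0)
    for v in range(v0 - radius, v0 + radius + 1):
        for u in range(u0 - radius, u0 + radius + 1):
            if v - th // 2 < 0 or u - tw // 2 < 0:
                continue                          # skip windows leaving the image
            patch = img[v - th // 2 : v + th // 2 + 1, u - tw // 2 : u + tw // 2 + 1]
            if patch.shape != template.shape:
                continue
            ssd = np.sum((patch.astype(float) - template.astype(float)) ** 2)
            if ssd < best:
                best, best_uv = ssd, (u, v)
    return best_uv

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = rng.integers(0, 255, size=(120, 160), dtype=np.uint8)

    K = np.array([[400.0, 0.0, 80.0],
                  [0.0, 400.0, 60.0],
                  [0.0, 0.0, 1.0]])               # assumed intrinsics
    landmark = np.array([[0.05, 0.0, 4.0]])       # previously triangulated 3D point

    # True feature location in the new frame and its 7x7 template.
    true_u, true_v = 85, 60
    template = img[true_v - 3 : true_v + 4, true_u - 3 : true_u + 4]

    # Pose predicted by integrating the IMU over one frame interval (slightly off).
    R_pred, t_pred = np.eye(3), np.array([0.03, -0.02, 0.0])
    predicted = project(K, R_pred, t_pred, landmark)[0]

    # Search only a 21x21 pixel window instead of the whole image.
    print(match_in_window(img, template, predicted, radius=10))  # -> (85, 60)
```

Constraining the search to a window around the predicted location is what makes the matching cheap and keeps repetitive structures elsewhere in the image from producing false matches, as the summary notes.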
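The high-density point cloud mentioned in the summary can be obtained from a rectified disparity map with the standard stereo relation Z = f·B/d. The following is a minimal sketch of that reprojection; the focal length, baseline, and principal point used here are assumed values, not the paper's calibration.

```python
import numpy as np

def disparity_to_points(disparity, f, baseline, cx, cy):
    """Reproject a rectified disparity map (in pixels) into camera-frame 3D points.

    Z = f * B / d,  X = (u - cx) * Z / f,  Y = (v - cy) * Z / f.
    Pixels with non-positive disparity are discarded.
    """
    v, u = np.indices(disparity.shape)            # pixel row/column coordinates
    valid = disparity > 0
    z = f * baseline / disparity[valid]
    x = (u[valid] - cx) * z / f
    y = (v[valid] - cy) * z / f
    return np.column_stack((x, y, z))             # (N, 3) point cloud

if __name__ == "__main__":
    # Synthetic 4x4 disparity map; assumed f = 400 px and baseline = 0.1 m.
    disp = np.full((4, 4), 8.0)
    pts = disparity_to_points(disp, f=400.0, baseline=0.1, cx=2.0, cy=2.0)
    print(pts.shape, pts[0])                      # 16 points, each 5 m deep
```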