IMAGE-BASED ORIENTATION DETERMINATION OF MOBILE SENSOR PLATFORMS

Estimating the pose of a mobile robotic platform is a challenging task, especially when the pose needs to be estimated in a global or local reference frame and when the estimation has to be performed while the platform is moving. While the position of a platform can be measured directly via modern tachymetry or with a global navigation satellite system (GNSS), the absolute platform orientation is harder to derive. Most often, only the relative orientation is estimated with sensors mounted on the robotic platform, such as an IMU, one or more cameras, a laser scanner, or a combination thereof; a sensor fusion of the relative orientation and the absolute position is then performed. In this work, an additional approach is presented: first, an image-based relative pose estimation is performed on frames from a panoramic camera using a state-of-the-art visual odometry implementation. Second, the position of the platform in a reference system is estimated using motorized tachymetry. Finally, the absolute orientation is calculated using a visual marker placed in the space in which the robotic platform moves. The marker can be detected in the camera frame, and since the position of this marker in the reference system is known, the absolute pose can be estimated. To improve the absolute pose estimation, a sensor fusion is conducted. Results with a Lego model train as the mobile platform show that the trajectories of the absolute pose, calculated independently with four different markers, deviate by less than 0.66 degrees 50% of the time and that the average difference is less than 1.17 degrees. The implementation is based on the popular Robot Operating System (ROS).
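The abstract does not name the marker system or the detection library used for the marker-based absolute orientation step. As a rough illustrative sketch only, the following Python example assumes a square fiducial marker (e.g. ArUco, via OpenCV's legacy contrib aruco module, pre-4.7), a calibrated camera, and a previously surveyed marker pose T_ref_marker in the reference frame; all names and values are hypothetical placeholders, not the authors' implementation.

```python
import cv2
import numpy as np

# Assumed inputs (placeholders): calibrated camera intrinsics and a marker whose
# pose in the reference frame (4x4 homogeneous matrix) was surveyed beforehand.
camera_matrix = np.array([[800.0,   0.0, 640.0],
                          [  0.0, 800.0, 360.0],
                          [  0.0,   0.0,   1.0]])
dist_coeffs = np.zeros(5)      # assume negligible lens distortion for this sketch
marker_length = 0.10           # marker side length in metres (assumed)
T_ref_marker = np.eye(4)       # surveyed marker pose in the reference frame (placeholder)

def absolute_camera_pose(frame):
    """Estimate the camera pose in the reference frame from one detected marker."""
    aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    corners, ids, _ = cv2.aruco.detectMarkers(frame, aruco_dict)
    if ids is None:
        return None  # no marker visible in this frame
    # Marker pose in the camera frame (PnP on the marker corners).
    rvecs, tvecs, _ = cv2.aruco.estimatePoseSingleMarkers(
        corners, marker_length, camera_matrix, dist_coeffs)
    R_cam_marker, _ = cv2.Rodrigues(rvecs[0])
    T_cam_marker = np.eye(4)
    T_cam_marker[:3, :3] = R_cam_marker
    T_cam_marker[:3, 3] = tvecs[0].ravel()
    # Camera pose in the reference frame: T_ref_cam = T_ref_marker * inv(T_cam_marker)
    return T_ref_marker @ np.linalg.inv(T_cam_marker)
```

In a ROS-based setup such as the one described, a pose obtained this way could then be published and fused with the tachymeter position and the visual odometry, for example in a filter-based estimator; the abstract only states that a sensor fusion is conducted, without specifying the method.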

Bibliographic Details
Main Authors: O. Hasler, S. Nebiker
Author Affiliations: Institute of Geomatics Engineering, FHNW University of Applied Sciences and Arts Northwestern Switzerland, Muttenz, Switzerland
Format: Article
Language: English
Published: Copernicus Publications, 2021-06-01
Series: The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Vol. XLIII-B1-2021, pp. 215-220
ISSN: 1682-1750, 2194-9034
DOI: 10.5194/isprs-archives-XLIII-B1-2021-215-2021
Online Access: https://www.int-arch-photogramm-remote-sens-spatial-inf-sci.net/XLIII-B1-2021/215/2021/isprs-archives-XLIII-B1-2021-215-2021.pdf