Combining MEMS-based IMU data and vision-based trajectory estimation

This paper presents an efficient location tracking algorithm that integrates vision-based motion estimation with IMU data. Orientation and translation parameters of a mobile device are estimated from video frames or highly overlapping image sequences acquired with the device's built-in camera, while IMU data are used to maintain continuity of the orientation estimate between samples of the image homography calculation. The algorithm consists of six primary steps: (1) pre-processing; (2) feature point detection and matching; (3) homography calculation; (4) control point detection and registration; (5) motion estimation and filtering; and (6) IMU data integration. Pre-processing controls the sampling rate and resolution of the input video frames or images to increase computational efficiency, and the overlap between selected frames is kept above 60 % for matching. After pre-processing, feature points are extracted and matched between adjacent frames as conjugate points. A perspective homography that maps one image onto another is constructed once the co-planar feature points between subsequent images are matched, and the homography matrix yields the camera orientation and translation parameters from the conjugate pairs. An area-based image-matching method is employed to recognize landmarks as reference nodes (RNs). In addition, a filtering mechanism is proposed to ensure that rotation angles are recorded correctly and to increase tracking accuracy. Trajectory results for different combinations of vision-based motion estimation, the filtering mechanism and IMU data integration are evaluated thoroughly, and their accuracy is validated against on-site measurement data. Experimental results indicate that the developed algorithm can effectively estimate the trajectory of a moving mobile device and can serve as a cost-effective alternative for location-based service (LBS) applications in both outdoor and indoor environments.
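
The abstract describes the vision pipeline only at a high level. As an illustration, the sketch below is not the authors' implementation; it shows how the feature-matching, homography and motion-recovery steps could look in Python with OpenCV, and how gyroscope data could bridge orientation between homography samples. The camera matrix K, the choice of ORB features, the RANSAC threshold and the helper names are assumptions made for the example.

    import cv2
    import numpy as np

    # Assumed intrinsic matrix of the mobile device's built-in camera
    # (focal length and principal point are illustrative values).
    K = np.array([[1200.0,    0.0, 960.0],
                  [   0.0, 1200.0, 540.0],
                  [   0.0,    0.0,   1.0]])

    def estimate_motion(img_a, img_b):
        """Estimate relative camera motion between two overlapping grayscale frames."""
        # Step 2: feature point detection and matching (conjugate points).
        orb = cv2.ORB_create(nfeatures=2000)
        kp_a, des_a = orb.detectAndCompute(img_a, None)
        kp_b, des_b = orb.detectAndCompute(img_b, None)
        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        matches = sorted(matcher.match(des_a, des_b), key=lambda m: m.distance)
        pts_a = np.float32([kp_a[m.queryIdx].pt for m in matches])
        pts_b = np.float32([kp_b[m.trainIdx].pt for m in matches])

        # Step 3: perspective homography between the (assumed co-planar)
        # conjugate points, with RANSAC rejecting mismatched pairs.
        H, inliers = cv2.findHomography(pts_a, pts_b, cv2.RANSAC, 3.0)

        # Step 5 (in part): recover rotation/translation candidates from H.
        # decomposeHomographyMat returns up to four solutions; a full pipeline
        # must select one, e.g. using reference-node or visibility constraints.
        _, rotations, translations, normals = cv2.decomposeHomographyMat(H, K)
        return H, rotations, translations, int(inliers.sum())

    def propagate_heading(heading_deg, gyro_z_dps, dt_s):
        """Step 6 (simplified): integrate the z-axis gyroscope rate between
        homography samples to keep the orientation estimate continuous."""
        return heading_deg + gyro_z_dps * dt_s

The paper's reference-node registration and filtering mechanism are not reproduced here; a complete implementation would also reconcile the integrated gyroscope angles with the vision-based estimates.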

Bibliographic Details
Main Authors: F. Tsai, H. Chang, A. Y. S. Su
Format: Article
Language: English
Published: Copernicus Publications, 2014-04-01
Series: The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Online Access: http://www.int-arch-photogramm-remote-sens-spatial-inf-sci.net/XL-4/267/2014/isprsarchives-XL-4-267-2014.pdf
id doaj-8fc1d4a7af3f4929958c1f908ce43c20
record_format Article
doi 10.5194/isprsarchives-XL-4-267-2014
volume XL-4
pages 267-271
affiliation F. Tsai: Center for Space and Remote Sensing Research, National Central University, Taiwan
affiliation H. Chang: Department of Civil Engineering, National Central University, Taiwan
affiliation A. Y. S. Su: Research Center for Advanced Science and Technology, National Central University, Taiwan
collection DOAJ
language English
format Article
sources DOAJ
author F. Tsai
H. Chang
A. Y. S. Su
title Combining MEMS-based IMU data and vision-based trajectory estimation
publisher Copernicus Publications
series The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences
issn 1682-1750, 2194-9034
publishDate 2014-04-01
url http://www.int-arch-photogramm-remote-sens-spatial-inf-sci.net/XL-4/267/2014/isprsarchives-XL-4-267-2014.pdf