REAL-TIME LARGE SCALE 3D RECONSTRUCTION BY FUSING KINECT AND IMU DATA

Kinect-style RGB-D cameras have been used to build large-scale dense 3D maps of indoor environments. These maps can serve many purposes, such as robot navigation and augmented reality. However, generating dense 3D maps of large-scale environments remains challenging. In this paper, we present a mapping system for 3D reconstruction that fuses measurements from a Kinect and an inertial measurement unit (IMU) to estimate motion. Our major achievements include: (i) large-scale consistent 3D reconstruction is realized by volume shifting and loop closure; (ii) the coarse-to-fine iterative closest point (ICP) algorithm, SIFT odometry, and IMU odometry are combined to robustly and precisely estimate pose. In particular, ICP runs routinely to track the Kinect motion. If ICP fails in planar areas, SIFT odometry provides an incremental motion estimate. If both ICP and SIFT odometry fail, e.g., upon abrupt motion or inadequate features, the incremental motion is estimated by the IMU. Additionally, the IMU also observes the roll and pitch angles, which reduces long-term drift of the sensor assembly. In experiments on a consumer laptop, our system estimates motion at 8 Hz on average while integrating color images into the local map and concurrently saving volumes of meshes. Moreover, it is immune to tracking failures and has smaller drift than state-of-the-art systems in large-scale reconstruction.
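The pose-tracking fallback cascade described in the abstract (ICP routinely, SIFT odometry when ICP degenerates, IMU prediction as a last resort) can be sketched as follows. This is an illustrative sketch, not the authors' implementation; the function names, the toy 2D pose type, and the lambda stand-ins for the three estimators are assumptions for illustration.

```python
from typing import Callable, Optional, Tuple

# Toy pose increment (x, y, yaw) for illustration; the paper estimates full 6-DoF poses.
Pose = Tuple[float, float, float]

def estimate_increment(
    icp: Callable[[], Optional[Pose]],
    sift_odometry: Callable[[], Optional[Pose]],
    imu_prediction: Callable[[], Pose],
) -> Tuple[str, Pose]:
    """Return (source, pose increment), falling back ICP -> SIFT -> IMU."""
    delta = icp()
    if delta is not None:                # ICP converged: use it.
        return "icp", delta
    delta = sift_odometry()              # ICP degenerate (e.g., planar scene).
    if delta is not None:
        return "sift", delta
    return "imu", imu_prediction()       # Abrupt motion / few features: dead-reckon.

# Example: ICP fails, SIFT odometry succeeds.
source, delta = estimate_increment(
    icp=lambda: None,
    sift_odometry=lambda: (0.01, 0.0, 0.002),
    imu_prediction=lambda: (0.012, 0.001, 0.0),
)
```

The cascade's design choice is that each stage only runs when the cheaper, more precise stage before it has failed, so the IMU's drift-prone dead reckoning is used only briefly between successful visual estimates.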


Bibliographic Details
Main Authors: J. Huai, Y. Zhang, A. Yilmaz
Format: Article
Language: English
Published: Copernicus Publications 2015-08-01
Series: ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Online Access: http://www.isprs-ann-photogramm-remote-sens-spatial-inf-sci.net/II-3-W5/491/2015/isprsannals-II-3-W5-491-2015.pdf
doi 10.5194/isprsannals-II-3-W5-491-2015
affiliation Dept. of Civil, Environmental and Geodetic Engineering, Ohio State University, 2036 Neil Avenue, Columbus, OH 43210, USA (J. Huai, Y. Zhang, A. Yilmaz)
collection DOAJ
language English
format Article
sources DOAJ
author J. Huai
Y. Zhang
A. Yilmaz
title REAL-TIME LARGE SCALE 3D RECONSTRUCTION BY FUSING KINECT AND IMU DATA
publisher Copernicus Publications
series ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences
issn 2194-9042
2194-9050
publishDate 2015-08-01
description Kinect-style RGB-D cameras have been used to build large-scale dense 3D maps of indoor environments. These maps can serve many purposes, such as robot navigation and augmented reality. However, generating dense 3D maps of large-scale environments remains challenging. In this paper, we present a mapping system for 3D reconstruction that fuses measurements from a Kinect and an inertial measurement unit (IMU) to estimate motion. Our major achievements include: (i) large-scale consistent 3D reconstruction is realized by volume shifting and loop closure; (ii) the coarse-to-fine iterative closest point (ICP) algorithm, SIFT odometry, and IMU odometry are combined to robustly and precisely estimate pose. In particular, ICP runs routinely to track the Kinect motion. If ICP fails in planar areas, SIFT odometry provides an incremental motion estimate. If both ICP and SIFT odometry fail, e.g., upon abrupt motion or inadequate features, the incremental motion is estimated by the IMU. Additionally, the IMU also observes the roll and pitch angles, which reduces long-term drift of the sensor assembly. In experiments on a consumer laptop, our system estimates motion at 8 Hz on average while integrating color images into the local map and concurrently saving volumes of meshes. Moreover, it is immune to tracking failures and has smaller drift than state-of-the-art systems in large-scale reconstruction.
url http://www.isprs-ann-photogramm-remote-sens-spatial-inf-sci.net/II-3-W5/491/2015/isprsannals-II-3-W5-491-2015.pdf