Skyline matching based camera orientation from images and mobile mapping point clouds
| Main Authors: | |
| --- | --- |
| Format: | Article |
| Language: | English |
| Published: | Copernicus Publications, 2014-05-01 |
| Series: | ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences |
| Online Access: | http://www.isprs-ann-photogramm-remote-sens-spatial-inf-sci.net/II-5/181/2014/isprsannals-II-5-181-2014.pdf |
| Summary: | Mobile Mapping is widely used for collecting large amounts of geo-referenced data. Sensor fusion plays an important role, as it allows multiple sensors such as laser scanners and cameras to be evaluated jointly. This requires determining the relative orientation between the sensors. Based on data from a *RIEGL* VMX-250 mobile mapping system equipped with two laser scanners, four optional cameras, and a highly precise GNSS/IMU system, we propose an approach to improve camera orientations. A manually determined orientation is used as an initial approximation for matching a large number of points in optical images and the corresponding projected scan images. The search space of the point correspondences is reduced to skylines found in both the optical and the scan image. The skyline determination is based on alpha shapes; the actual matching is done via an adapted ICP algorithm. The approximate values of the relative orientation are used as starting values for an iterative resection process. Outliers are removed at several stages of the process. Our approach is fully automatic and improves the camera orientation significantly. |
| ISSN: | 2194-9042, 2194-9050 |
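The summary describes matching skylines extracted from the optical and projected scan images with an adapted ICP algorithm. As a rough illustration of that matching step only, here is a minimal sketch of a 2D point-to-point ICP with simple percentile-based outlier rejection. The function name `icp_2d`, the synthetic skyline data, and the rejection rule are illustrative assumptions, not the authors' implementation; the alpha-shape skyline extraction and the iterative resection are not shown.

```python
# Minimal sketch: 2D point-to-point ICP between two skyline point sets,
# standing in for the "adapted ICP" matching step described in the summary.
# Names and synthetic data are illustrative assumptions.
import numpy as np
from scipy.spatial import cKDTree

def icp_2d(source, target, iterations=50, tol=1e-6):
    """Rigidly align 2D source points (e.g. scan-image skyline)
    to 2D target points (e.g. optical-image skyline)."""
    src = source.copy()
    tree = cKDTree(target)
    R_total, t_total = np.eye(2), np.zeros(2)
    prev_err = np.inf
    for _ in range(iterations):
        # Nearest-neighbour correspondences against the fixed target skyline
        dist, idx = tree.query(src)
        # Simple outlier rejection: drop the worst 10 % of matches
        keep = dist <= np.percentile(dist, 90)
        p, q = src[keep], target[idx[keep]]
        # Best-fit rigid transform (Kabsch on centred point sets)
        p_c, q_c = p - p.mean(0), q - q.mean(0)
        U, _, Vt = np.linalg.svd(p_c.T @ q_c)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:          # guard against reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = q.mean(0) - R @ p.mean(0)
        # Apply the increment and accumulate the total transform
        src = src @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
        err = dist[keep].mean()
        if abs(prev_err - err) < tol:
            break
        prev_err = err
    return R_total, t_total

if __name__ == "__main__":
    # Synthetic skyline: a jagged horizon, rotated and shifted
    x = np.linspace(0, 100, 200)
    optical = np.column_stack([x, 20 + 5 * np.sin(0.3 * x)])
    angle = np.deg2rad(2.0)
    R_true = np.array([[np.cos(angle), -np.sin(angle)],
                       [np.sin(angle),  np.cos(angle)]])
    scan = optical @ R_true.T + np.array([3.0, -1.5])
    # Estimated transform maps scan -> optical, i.e. roughly the
    # inverse of the 2 degree rotation applied above
    R_est, t_est = icp_2d(scan, optical)
    print("estimated rotation (deg):",
          np.rad2deg(np.arctan2(R_est[1, 0], R_est[0, 0])))
    print("estimated translation:", t_est)
```

In the paper's pipeline such a 2D alignment would only supply approximate correspondences; the final camera orientation is then refined by the iterative resection with repeated outlier removal mentioned in the summary.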