On-the-Fly Camera and Lidar Calibration
Sensor fusion is one of the main challenges in self-driving and robotics applications. In this paper we propose an automatic, online and target-less camera-Lidar extrinsic calibration approach. We adopt a structure from motion (SfM) method to generate 3D point clouds from the camera data which can be matched to the Lidar point clouds; thus, we address the extrinsic calibration problem as a registration task in the 3D domain. The core step of the approach is a two-stage transformation estimation: First, we introduce an object-level coarse alignment algorithm operating in the Hough space to transform the SfM-based and the Lidar point clouds into a common coordinate system. Thereafter, we apply a control-point-based nonrigid transformation refinement step to register the point clouds more precisely. Finally, we calculate the correspondences between the 3D Lidar points and the pixels in the 2D camera domain. We evaluated the method in various real-life traffic scenarios in Budapest, Hungary. The results show that our proposed extrinsic calibration approach is able to provide accurate and robust parameter settings on-the-fly.
Main Authors: | Balázs Nagy, Csaba Benedek |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2020-04-01 |
Series: | Remote Sensing |
ISSN: | 2072-4292 |
Subjects: | lidar; camera; extrinsic calibration; registration |
Online Access: | https://www.mdpi.com/2072-4292/12/7/1137 |
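The abstract describes, as its final step, computing the correspondences between the 3D Lidar points and the pixels in the 2D camera domain once the extrinsic parameters have been estimated. The sketch below is a minimal, generic illustration of that projection step using a standard pinhole camera model; it is not the authors' implementation, and the intrinsic matrix `K`, rotation `R`, and translation `t` in the usage example are hypothetical placeholder values.

```python
import numpy as np


def project_lidar_to_image(points_lidar, R, t, K):
    """Project 3D Lidar points into the 2D camera image.

    points_lidar: (N, 3) array of points in the Lidar frame.
    R, t:         extrinsic rotation (3x3) and translation (3,) mapping
                  Lidar coordinates into the camera frame, e.g. the output
                  of an extrinsic calibration procedure.
    K:            3x3 camera intrinsic matrix (assumed known).
    Returns (N, 2) pixel coordinates and a mask of points with positive depth.
    """
    # Transform the points into the camera coordinate frame.
    points_cam = points_lidar @ R.T + t

    # Keep only points in front of the image plane (positive depth).
    in_front = points_cam[:, 2] > 0

    # Perspective projection with the pinhole model: [u, v, w]^T = K * X_cam.
    proj = points_cam @ K.T
    pixels = proj[:, :2] / proj[:, 2:3]
    return pixels, in_front


if __name__ == "__main__":
    # Hypothetical intrinsic and extrinsic parameters for illustration only.
    K = np.array([[1000.0, 0.0, 960.0],
                  [0.0, 1000.0, 540.0],
                  [0.0, 0.0, 1.0]])
    R = np.eye(3)
    t = np.array([0.1, -0.05, 0.2])
    lidar_points = np.random.rand(100, 3) * 10.0
    px, valid = project_lidar_to_image(lidar_points, R, t, K)
    print(px[valid][:5])
```

In practice, `K` would come from a prior intrinsic calibration and `(R, t)` from an extrinsic calibration such as the on-the-fly approach described above.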