A multi-sensor traffic scene dataset with omnidirectional video
The development of vehicles that perceive their environment, in particular those using computer vision, requires large databases of sensor recordings obtained from real cars driven in realistic traffic situations. These datasets should be time-stamped to enable synchronization of sensor data from different sources. Furthermore, full surround environment perception requires high frame rates of synchronized omnidirectional video data to prevent information loss at any speed. This paper describes an experimental setup and software environment for recording such synchronized multi-sensor data streams and storing them in a new open-source format. The dataset consists of sequences recorded in various environments from a car equipped with an omnidirectional multi-camera, height sensors, an IMU, a velocity sensor, and a GPS. The software environment for reading these datasets will be made publicly available, together with a collection of long multi-sensor and multi-camera data streams stored in the developed format.
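The abstract centers on timestamp-based synchronization of heterogeneous sensor streams (omnidirectional video, IMU, GPS, velocity and height sensors). The record does not describe the paper's actual container format or reader API, so the snippet below is only a minimal illustrative sketch: the helper names (`nearest_sample`, `align_to_frames`) and the in-memory representation of each stream as sorted timestamps plus readings are assumptions, not the authors' software. It simply aligns each video frame to the temporally closest sample from every other stream.

```python
# Illustrative sketch only: the dataset's real container format and reader
# software are not specified in this record. This shows the general idea of
# aligning asynchronously sampled sensor streams (e.g. IMU, GPS, wheel speed)
# to timestamped omnidirectional video frames by nearest-timestamp lookup.
from bisect import bisect_left


def nearest_sample(timestamps, values, t):
    """Return the sensor value whose timestamp is closest to t.

    `timestamps` must be sorted ascending; `values` holds the readings.
    """
    i = bisect_left(timestamps, t)
    if i == 0:
        return values[0]
    if i == len(timestamps):
        return values[-1]
    before, after = timestamps[i - 1], timestamps[i]
    return values[i] if (after - t) < (t - before) else values[i - 1]


def align_to_frames(frame_times, sensor_streams):
    """Build one record per video frame with the temporally closest reading
    from each sensor stream (hypothetical structure, not the paper's format)."""
    aligned = []
    for t in frame_times:
        record = {"frame_time": t}
        for name, (ts, vals) in sensor_streams.items():
            record[name] = nearest_sample(ts, vals, t)
        aligned.append(record)
    return aligned


if __name__ == "__main__":
    # Small synthetic example: camera at ~25 Hz, IMU at 100 Hz, GPS at 1 Hz.
    frame_times = [0.00, 0.04, 0.08, 0.12]
    streams = {
        "imu_yaw_rate": ([i * 0.01 for i in range(20)], [0.1 * i for i in range(20)]),
        "gps_speed": ([0.0, 1.0], [13.9, 14.1]),
    }
    for rec in align_to_frames(frame_times, streams):
        print(rec)
```

In practice one would likely interpolate between neighboring samples rather than pick the nearest one, but nearest-timestamp lookup keeps the sketch short.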
Main Authors: | Koschorrek, Philipp; Piccini, Tommaso; Öberg, Per; Felsberg, Michael; Nielsen, Lars; Mester, Rudolf |
---|---|
Format: | Conference Paper |
Language: | English |
Published: | Linköpings universitet, Datorseende, 2013 |
Online Access: | http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-93277 |