Robust Lane-Mark Extraction for Autonomous Driving Under Complex Real Conditions

Bibliographic Details
Main Authors: Hanyu Xuan, Hongzhe Liu, Jiazheng Yuan, Qing Li
Format: Article
Language: English
Published: IEEE 2018-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/8000310/
Description
Summary: Lane marks are among the most important items of road-scene information in autonomous driving, and lane-mark extraction based on visual cognitive computing is a key component of advanced driving assistance systems in intelligent transportation systems. Onboard cameras mounted on the front of autonomous vehicles capture road-scene images from which lane marks are extracted. This paper proposes a new lane-mark extraction algorithm with four major parts. First, the road images captured by the onboard cameras are preprocessed by grayscale conversion and fast median filtering. Second, the characteristics of lane marks in road images are exploited as constraints to build a multi-constraint lane-feature filter that extracts lane-mark features. Third, a clustering algorithm based on double-point removal with a p-least-squares method is proposed to cluster the features, and a recursive dichotomy algorithm is used to fit the candidate lane marks. Finally, the candidate lane marks are verified and optimized to obtain more accurate and stable extraction results. In the experiments, common complex road scenes are divided into four categories, and the results show that the proposed method robustly extracts lane marks under various complex real conditions. The paper also proposes a method for evaluating lane-mark extraction results, and partial test results are evaluated with it.
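
To illustrate the preprocessing stage of the pipeline (grayscale conversion followed by median filtering), the following is a minimal sketch using OpenCV. The kernel size and the image path are illustrative assumptions, not values taken from the paper, and this is not the authors' implementation.

```python
# Sketch of the preprocessing stage: grayscale conversion followed by
# median filtering to suppress salt-and-pepper noise before lane-feature
# extraction. The 5x5 kernel and the file names below are illustrative
# assumptions, not values from the paper.
import cv2

def preprocess(image_path: str, kernel_size: int = 5):
    """Load a road-scene image, convert it to grayscale, and smooth it
    with a median filter."""
    image = cv2.imread(image_path)
    if image is None:
        raise FileNotFoundError(f"Could not read image: {image_path}")
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    # medianBlur requires an odd kernel size greater than 1.
    filtered = cv2.medianBlur(gray, kernel_size)
    return filtered

if __name__ == "__main__":
    # "road_scene.jpg" is a placeholder path for demonstration.
    result = preprocess("road_scene.jpg")
    cv2.imwrite("preprocessed.png", result)
```

OpenCV's medianBlur is a standard fast median filter; any equivalent implementation would serve, since this stage only conditions the image for the multi-constraint lane-feature filter that follows.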
ISSN:2169-3536