Pedestrian Detection Using Quaternion Gradient Based Weber Local Descriptor

Bibliographic Details
Main Author: Guoyun Lian
Format: Article
Language: English
Published: IEEE 2021-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/9366761/
Description
Summary: Over the past decades, pedestrian detection has attracted increasing attention in many practical applications. This paper proposes a novel pedestrian detection method using a Quaternion Gradient based Weber Local Descriptor (QGWLD). Unlike many local pedestrian detectors that extract features only from the gray-scale image and thus discard color information, the proposed detector integrates the complementary strengths of color and texture information, yielding a powerful and robust representation for pedestrian detection. A quaternion can fully characterize a color pixel, and the WLD feature, consisting of gradient orientation and differential excitation, has achieved good performance for pedestrian detection; the QGWLD detector therefore combines the WLD feature with the quaternion representation. First, the quaternion gradient representation is computed over the color image within each sliding window; the Weber Local Descriptor (WLD) is then extracted over the resulting quaternion gradient feature map; finally, a QGWLD histogram is constructed to characterize the sliding window. Experimental results on the INRIA and PennFudanPed pedestrian databases validate the effectiveness of the proposed QGWLD detector, which outperforms comparable pedestrian detectors.
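
The summary above sketches a three-step pipeline: a quaternion gradient map computed from the color window, WLD extraction (differential excitation plus gradient orientation) over that map, and a joint histogram as the final descriptor. A minimal Python sketch of that pipeline follows; the paper's exact quaternion gradient operator, excitation scaling, and bin counts are not given in this record, so the Sobel per-channel gradients, 8-neighbour excitation, and 6x8 histogram below are illustrative assumptions, not the authors' implementation.

    # A minimal, illustrative sketch of the QGWLD pipeline described in the
    # summary. Operator and parameter choices here are assumptions.
    import numpy as np
    from scipy import ndimage

    def quaternion_gradient_map(rgb):
        """Treat each pixel as a pure quaternion (R*i + G*j + B*k) and take
        the magnitude of the per-channel Sobel gradients as a scalar map."""
        rgb = rgb.astype(np.float64)
        mag = np.zeros(rgb.shape[:2])
        for c in range(3):  # i, j, k components carry the R, G, B channels
            gx = ndimage.sobel(rgb[..., c], axis=1)
            gy = ndimage.sobel(rgb[..., c], axis=0)
            mag += gx ** 2 + gy ** 2
        return np.sqrt(mag)

    def wld_histogram(fmap, n_exc=6, n_ori=8, eps=1e-6):
        """WLD over a feature map: differential excitation
        xi = arctan(sum_i (x_i - x_c) / x_c) over the 8-neighbourhood, and
        a gradient orientation theta, pooled into a joint 2-D histogram."""
        kernel = np.array([[1, 1, 1],
                           [1, -8, 1],
                           [1, 1, 1]], dtype=np.float64)
        xi = np.arctan(ndimage.convolve(fmap, kernel) / (fmap + eps))
        theta = np.arctan2(ndimage.sobel(fmap, axis=0),
                           ndimage.sobel(fmap, axis=1))
        hist, _, _ = np.histogram2d(
            xi.ravel(), theta.ravel(),
            bins=[n_exc, n_ori],
            range=[[-np.pi / 2, np.pi / 2], [-np.pi, np.pi]])
        return (hist / (hist.sum() + eps)).ravel()  # normalised descriptor

    def qgwld_descriptor(window_rgb):
        """Descriptor for one sliding window: quaternion gradient, then WLD."""
        return wld_histogram(quaternion_gradient_map(window_rgb))

In a full detector, qgwld_descriptor would be evaluated at each sliding-window position and the resulting histograms fed to a classifier; the abstract does not name the classifier, so that stage is left out of the sketch.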
ISSN: 2169-3536