Compact Spatial Pyramid Pooling Deep Convolutional Neural Network Based Hand Gestures Decoder

Bibliographic Details
Main Authors: Akm Ashiquzzaman, Hyunmin Lee, Kwangki Kim, Hye-Young Kim, Jaehyung Park, Jinsul Kim
Format: Article
Language: English
Published: MDPI AG, 2020-11-01
Series: Applied Sciences
Online Access: https://www.mdpi.com/2076-3417/10/21/7898
Description
Summary: Current deep convolutional neural network (DCNN)-based hand gesture detectors with acute precision demand incredibly high-performance computing power. Although DCNN-based detectors are capable of accurate classification, the sheer computing power needed makes them very difficult to run on lower-powered hardware in remote environments. Moreover, classical DCNN architectures accept only a fixed input dimension, which forces preprocessing and makes them impractical for real-world applications. In this research, a practical DCNN with an optimized architecture is proposed: filters/nodes are pruned, and spatial pyramid pooling (SPP) is introduced to make the model input-dimension-invariant. This compact SPP-DCNN module uses 65% fewer parameters than traditional classifiers and operates almost 3× faster than classical models. Moreover, the improved algorithm, which decodes gestures or sign-language finger-spelling from videos, achieved the highest benchmark accuracy with the fastest processing speed. The proposed method paves the way for various practical hand-gesture-based human-computer interaction (HCI) applications.
ISSN:2076-3417
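
The input-dimension invariance described in the abstract comes from spatial pyramid pooling: regardless of the spatial size of the final convolutional feature map, SPP pools it into a fixed-length vector before the classifier layers. The following is a minimal NumPy sketch of that idea; the pyramid levels `(1, 2, 4)` and the function name are illustrative assumptions, not the paper's exact configuration.

```python
# Sketch of spatial pyramid pooling (SPP) over a (C, H, W) feature map.
# Assumption: pyramid levels (1, 2, 4) are for illustration only.
import numpy as np

def spp(feature_map, levels=(1, 2, 4)):
    """Max-pool a (C, H, W) feature map into a fixed-length vector.

    For each pyramid level n, the spatial plane is split into an n x n
    grid of bins and each bin is max-pooled. The concatenated output has
    length C * sum(n * n for n in levels), independent of H and W.
    """
    c, h, w = feature_map.shape
    pooled = []
    for n in levels:
        # Integer bin edges cover the whole plane even when H or W
        # is not divisible by n.
        h_edges = np.linspace(0, h, n + 1).astype(int)
        w_edges = np.linspace(0, w, n + 1).astype(int)
        for i in range(n):
            for j in range(n):
                bin_region = feature_map[:,
                                         h_edges[i]:h_edges[i + 1],
                                         w_edges[j]:w_edges[j + 1]]
                pooled.append(bin_region.max(axis=(1, 2)))
    return np.concatenate(pooled)

# Two inputs of different spatial size yield identically sized vectors,
# so the downstream classifier never needs resized/preprocessed input:
v1 = spp(np.random.rand(8, 13, 17))
v2 = spp(np.random.rand(8, 32, 32))
assert v1.shape == v2.shape == (8 * (1 + 4 + 16),)  # (168,)
```

This is the property that lets the proposed SPP-DCNN skip the fixed-size preprocessing step that classical architectures require.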