Head Pose Detection for a Wearable Parrot-Inspired Robot Based on Deep Learning

Bibliographic Details
Main Authors: Jaishankar Bharatharaj, Loulin Huang, Rajesh Elara Mohan, Thejus Pathmakumar, Chris Krägeloh, Ahmed Al-Jumaily
Format: Article
Language: English
Published: MDPI AG 2018-07-01
Series: Applied Sciences
Online Access: http://www.mdpi.com/2076-3417/8/7/1081
Description
Summary: Extensive research has been conducted on human head pose detection systems, and several applications for deploying such systems have been identified. Deep learning-based head pose detection is one such method; it has been studied for several decades and reports high success rates in implementation. Although many pet robots have been designed and developed for various needs, wearable pet robots, and head pose detection models for such robots, are entirely absent. Designing a wearable pet robot capable of head pose detection can open further opportunities for research and development of such systems. In this paper, we present a novel head pose detection system for a wearable parrot-inspired pet robot that uses images taken from the wearer’s shoulder. This is the first time head pose detection has been studied in a wearable robot and using images captured from a side angle. In this study, we trained the AlexNet convolutional neural network architecture on the images from the database for the head pose detection system. The system was tested with 250 images and achieved an accuracy of 94.4% across five head poses, namely left, left intermediate, straight, right, and right intermediate.
ISSN: 2076-3417
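
The summary above describes fine-tuning AlexNet to classify five head poses from shoulder-mounted camera images. The sketch below illustrates that setup under stated assumptions: it uses PyTorch/torchvision (the abstract does not name a framework), an ImageNet-pretrained AlexNet with its final layer replaced for five classes, and a hypothetical classify_pose helper. It is a minimal illustration, not the authors' implementation.

import torch
import torch.nn as nn
from torchvision import models, transforms

# Five head-pose classes listed in the abstract.
POSES = ["left", "left intermediate", "straight", "right", "right intermediate"]

# Load an ImageNet-pretrained AlexNet and replace its final fully connected
# layer (4096 -> 1000 classes) with a 4096 -> 5 head-pose classifier.
model = models.alexnet(weights=models.AlexNet_Weights.IMAGENET1K_V1)
model.classifier[6] = nn.Linear(4096, len(POSES))

# Standard AlexNet preprocessing: resize and normalize with ImageNet statistics.
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def classify_pose(frame):
    # Hypothetical helper (not from the paper): return the predicted head-pose
    # label for a PIL image captured from the wearer's shoulder.
    model.eval()
    with torch.no_grad():
        logits = model(preprocess(frame).unsqueeze(0))
    return POSES[logits.argmax(dim=1).item()]

In practice, the replaced classifier layer would be trained (or the whole network fine-tuned) on the labeled shoulder-view head pose images before inference.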