Fusing Hand Postures and Speech Recognition for Tasks Performed by an Integrated Leg–Arm Hexapod Robot

Hand postures and speech are convenient means of communication for humans and can be used in human–robot interaction. Based on the structural and functional characteristics of our integrated leg–arm hexapod robot, and to perform reconnaissance and rescue tasks in public security applications, a method linking the movement and manipulation of the robot is proposed based on the visual and auditory channels, and a system based on hand posture and speech recognition is described. The developed system contains a speech module, a hand posture module, a fusion module, a mechanical structure module, a control module, a path planning module, and a 3D SLAM (Simultaneous Localization and Mapping) module. Three modes, i.e., the hand posture mode, the speech mode, and a combination of the two, are used in different situations. The hand posture mode is used for reconnaissance tasks, and the speech mode is used to query the path and to control the movement and manipulation of the robot. The combination of the two modes can be used to avoid ambiguity during interaction. A task-slot structure based on semantic understanding is developed using the visual and auditory channels. In addition, a method of task planning based on answer-set programming is developed, and a network-based data interaction system is designed to remotely control the movements of the robot over a wide area network using Chinese instructions. Experiments were carried out to verify the performance of the proposed system.
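The abstract describes fusing a recognized hand posture and a speech command into a semantic task-slot structure before planning; the record contains no code, so the following is only a minimal Python sketch of that idea under stated assumptions. All names here (TaskSlot, fuse_inputs, the posture and speech label sets) are hypothetical illustrations, not the authors' implementation.

```python
# Minimal sketch: merge a recognized hand posture and a parsed speech command
# into one task-slot structure, so either channel can resolve what the other
# left ambiguous. All labels and field names are hypothetical.

from dataclasses import dataclass
from typing import Optional

@dataclass
class TaskSlot:
    """One filled command for the robot: what to do, toward what, which way."""
    action: Optional[str] = None     # e.g. "move", "grasp", "reconnoiter"
    target: Optional[str] = None     # e.g. "door", "valve"
    direction: Optional[str] = None  # e.g. "forward", "left"

    def is_complete(self) -> bool:
        # Executable once an action and at least one argument are known.
        return self.action is not None and (
            self.target is not None or self.direction is not None
        )

# Hypothetical mapping from posture labels to actions.
POSTURE_TO_ACTION = {"fist": "stop", "point": "move", "open_palm": "reconnoiter"}

def fuse_inputs(posture: Optional[str], speech: dict) -> TaskSlot:
    """Fill the slot from the speech parse first, then let the posture
    supply any field that speech left empty."""
    slot = TaskSlot(
        action=speech.get("action"),
        target=speech.get("target"),
        direction=speech.get("direction"),
    )
    if slot.action is None and posture in POSTURE_TO_ACTION:
        slot.action = POSTURE_TO_ACTION[posture]
    return slot

if __name__ == "__main__":
    # Speech alone is ambiguous; the pointing posture supplies the action.
    slot = fuse_inputs(posture="point", speech={"direction": "forward"})
    print(slot, "complete:", slot.is_complete())
```

The filled slot would then be handed to the paper's downstream task planner (answer-set programming in the authors' system); that step is not sketched here.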

Bibliographic Details
Main Authors: Jing Qi, Xilun Ding, Weiwei Li, Zhonghua Han, Kun Xu
Format: Article
Language: English
Published: MDPI AG, 2020-10-01
Series: Applied Sciences, Vol. 10, Iss. 19, Article 6995
ISSN: 2076-3417
DOI: 10.3390/app10196995
Author Affiliations: Robotics Institute, School of Mechanical Engineering and Automation, Beihang University, Beijing 100191, China (Jing Qi, Xilun Ding, Weiwei Li, Kun Xu); First Research Institute of the Ministry of Public Security of the People’s Republic of China, Beijing 100191, China (Zhonghua Han)
Subjects: hand posture recognition; speech recognition; human–robot interaction (HRI); hexapod robots; manipulation
Online Access: https://www.mdpi.com/2076-3417/10/19/6995