Emotion Recognition From Body Movement

Automatic emotion recognition from the analysis of body movement has tremendous potential to revolutionize the virtual reality, robotics, behavior modeling, and biometric identity recognition domains. A computer system capable of recognizing human emotion from the body can also significantly change the way we interact with computers. One of the significant challenges is identifying emotion-specific features from a vast number of descriptors of human body movement. In this paper, we introduce a novel two-layer feature selection framework for emotion classification from a comprehensive list of body movement features. We used the feature selection framework to accurately recognize five basic emotions: happiness, sadness, fear, anger, and neutral. In the first layer, a unique combination of Analysis of Variance (ANOVA) and Multivariate Analysis of Variance (MANOVA) was utilized to eliminate irrelevant features. In the second layer, a binary chromosome-based genetic algorithm was proposed to select, from the remaining relevant features, the subset that maximizes the emotion recognition rate. Score- and rank-level fusion were applied to further improve the accuracy of the system. The proposed system was validated on proprietary and public datasets containing 30 subjects. Different action scenarios, such as walking and sitting, as well as an action-independent case, were considered. Based on the experimental results, the proposed system achieved a very high emotion recognition rate, outperforming state-of-the-art methods: 90.0% accuracy during walking, 96.0% during sitting, and 86.66% in the action-independent scenario, demonstrating the accuracy and robustness of the developed method.
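The abstract above outlines the two-layer feature selection pipeline in enough detail to sketch its shape. The following Python fragment is a minimal, hypothetical illustration of that idea using scikit-learn: layer 1 is a univariate ANOVA F-test filter, layer 2 is a small binary-chromosome genetic algorithm that maximizes cross-validated accuracy. The synthetic data, the SVM classifier, the GA parameters, and the omission of MANOVA and of the score- and rank-level fusion stage are all assumptions made for illustration, not details taken from the paper.

# Minimal sketch (assumptions only, not the authors' implementation) of a
# two-layer feature selection pipeline: ANOVA F-test filter, then a
# binary-chromosome genetic algorithm maximizing cross-validated accuracy.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Stand-in data: 150 movement samples, 60 candidate descriptors, 5 emotion classes.
X, y = make_classification(n_samples=150, n_features=60, n_informative=12,
                           n_classes=5, n_clusters_per_class=1, random_state=0)

# Layer 1: keep the 30 features whose class means differ most (univariate ANOVA).
X_rel = SelectKBest(f_classif, k=30).fit_transform(X, y)

def fitness(mask):
    # Cross-validated accuracy of an SVM trained on the selected feature subset.
    if mask.sum() == 0:
        return 0.0
    return cross_val_score(SVC(kernel="rbf"), X_rel[:, mask.astype(bool)], y, cv=3).mean()

# Layer 2: tiny binary-chromosome GA (tournament selection, uniform crossover,
# bit-flip mutation, elitism) searching for the best-performing feature subset.
pop = rng.integers(0, 2, size=(20, X_rel.shape[1]))
for generation in range(15):
    scores = np.array([fitness(ind) for ind in pop])
    elite = pop[np.argmax(scores)].copy()
    # Tournament selection: of two random individuals, the fitter becomes a parent.
    pairs = rng.integers(0, len(pop), size=(len(pop), 2))
    winners = np.where(scores[pairs[:, 0]] >= scores[pairs[:, 1]], pairs[:, 0], pairs[:, 1])
    parents = pop[winners]
    # Uniform crossover between each parent and its neighbour in the mating pool.
    mix = rng.integers(0, 2, size=parents.shape).astype(bool)
    children = np.where(mix, parents, np.roll(parents, 1, axis=0))
    # Bit-flip mutation with a small per-gene probability, then elitism.
    flip = rng.random(children.shape) < 0.02
    pop = np.where(flip, 1 - children, children)
    pop[0] = elite

scores = np.array([fitness(ind) for ind in pop])
best = pop[np.argmax(scores)]
print(f"selected {int(best.sum())} of {X_rel.shape[1]} features, CV accuracy {scores.max():.3f}")

Elitism carries the best chromosome forward unchanged, so the accuracy of the returned subset never decreases across generations; in the paper's setting, the fitness function would instead score the actual emotion classifier on the body-movement descriptors that survive the first layer.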

Bibliographic Details
Main Authors: Ferdous Ahmed, A. S. M. Hossain Bari, Marina L. Gavrilova
Format: Article
Language: English
Published: IEEE, 2020-01-01
Series: IEEE Access
Subjects: Emotion recognition; feature selection; gait analysis; genetic algorithm; information fusion; human motion
Online Access: https://ieeexplore.ieee.org/document/8945309/
DOAJ Record ID: doaj-5fa6baf1eec34a91876cdda453dc8195
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2019.2963113
Citation: IEEE Access, vol. 8, pp. 11761-11781, 2020 (IEEE Xplore document 8945309)
Author ORCIDs: Ferdous Ahmed (0000-0002-3822-2296); A. S. M. Hossain Bari (0000-0003-1850-4816); Marina L. Gavrilova (0000-0002-5338-1834)
Affiliation: Department of Computer Science, University of Calgary, Calgary, Canada (all authors)