Adolescents Environmental Emotion Perception by Integrating EEG and Eye Movements
Giving a robot the ability to perceive emotion in its environment can improve human-robot interaction (HRI), thereby facilitating more human-like communication. To achieve emotion recognition in different built environments for adolescents, we propose a multi-modal emotion intensity perception method using an integration of electroencephalography (EEG) and eye movement information. Specifically, we first develop a new stimulus video selection method based on computation of normalized arousal and valence scores according to subjective feedback from participants. Then, we establish a valence perception sub-model and an arousal sub-model by collecting and analyzing emotional EEG and eye movement signals, respectively. We employ this dual recognition method to perceive emotional intensities synchronously in two dimensions. In the laboratory environment, the best recognition accuracies of the modality fusion for the arousal and valence dimensions are 72.8 and 69.3%. The experimental results validate the feasibility of the proposed multi-modal emotion recognition method for environment emotion intensity perception. This promising tool not only achieves more accurate emotion perception for HRI systems but also provides an alternative approach to quantitatively assess environmental psychology.
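The stimulus-selection step described in the abstract (computing normalized arousal and valence scores from participants' subjective feedback) could be sketched as below. This is a minimal illustration under stated assumptions, not the authors' implementation: the function names, the per-participant min-max normalization, and the distance-from-neutral selection rule are all hypothetical choices.

```python
import numpy as np

def normalized_scores(ratings):
    """Min-max normalize subjective ratings to [0, 1], per participant.

    `ratings` is an (n_participants, n_videos) array of arousal or
    valence self-report scores (e.g. 1-9 ratings). The paper's exact
    normalization is not specified here; this is one plausible scheme.
    """
    ratings = np.asarray(ratings, dtype=float)
    lo = ratings.min(axis=1, keepdims=True)
    hi = ratings.max(axis=1, keepdims=True)
    norm = (ratings - lo) / (hi - lo)   # each participant mapped to [0, 1]
    return norm.mean(axis=0)            # mean normalized score per video

def select_stimuli(arousal, valence, k=3):
    """Pick the k candidate videos with the most extreme combined intensity,
    measured as distance from the neutral point (0.5, 0.5) in the
    valence-arousal plane (a hypothetical selection criterion)."""
    intensity = np.hypot(np.asarray(arousal) - 0.5, np.asarray(valence) - 0.5)
    return np.argsort(intensity)[::-1][:k]
```

Per-participant normalization removes individual rating-scale bias before averaging, which is one common motivation for normalizing subjective feedback.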
Main Authors: Yuanyuan Su, Wenchao Li, Ning Bi, Zhao Lv
Format: Article
Language: English
Published: Frontiers Media S.A., 2019-06-01
Series: Frontiers in Neurorobotics
Subjects: electroencephalograph (EEG); eye movements; human-robot interaction (HRI); adolescents; environmental emotion perception
Online Access: https://www.frontiersin.org/article/10.3389/fnbot.2019.00046/full
id: doaj-7ca6aae437db417e8bcc7526d7e1aa5d
record_format: Article
spelling:
- Record updated: 2020-11-24T23:52:10Z
- Language: eng
- Publisher: Frontiers Media S.A.
- Journal: Frontiers in Neurorobotics (ISSN 1662-5218), Volume 13, 2019-06-01
- DOI: 10.3389/fnbot.2019.00046
- Title: Adolescents Environmental Emotion Perception by Integrating EEG and Eye Movements
- Authors and affiliations:
  - Yuanyuan Su: Department of Design, Anhui University, Hefei, China; College of Design, Iowa State University, Ames, IA, United States
  - Wenchao Li: School of Computer Science and Technology, Anhui University, Hefei, China
  - Ning Bi: School of Computer Science, Georgia Institute of Technology, Atlanta, GA, United States
  - Zhao Lv: School of Computer Science and Technology, Anhui University, Hefei, China; Institute of Physical Science and Information Technology, Anhui University, Hefei, China
- Keywords: electroencephalograph (EEG); eye movements; human-robot interaction (HRI); adolescents; environmental emotion perception
- URL: https://www.frontiersin.org/article/10.3389/fnbot.2019.00046/full
collection: DOAJ
language: English
format: Article
sources: DOAJ
author: Yuanyuan Su, Wenchao Li, Ning Bi, Zhao Lv
author_sort: Yuanyuan Su
title: Adolescents Environmental Emotion Perception by Integrating EEG and Eye Movements
publisher: Frontiers Media S.A.
series: Frontiers in Neurorobotics
issn: 1662-5218
publishDate: 2019-06-01
description: Giving a robot the ability to perceive emotion in its environment can improve human-robot interaction (HRI), thereby facilitating more human-like communication. To achieve emotion recognition in different built environments for adolescents, we propose a multi-modal emotion intensity perception method using an integration of electroencephalography (EEG) and eye movement information. Specifically, we first develop a new stimulus video selection method based on computation of normalized arousal and valence scores according to subjective feedback from participants. Then, we establish a valence perception sub-model and an arousal sub-model by collecting and analyzing emotional EEG and eye movement signals, respectively. We employ this dual recognition method to perceive emotional intensities synchronously in two dimensions. In the laboratory environment, the best recognition accuracies of the modality fusion for the arousal and valence dimensions are 72.8 and 69.3%. The experimental results validate the feasibility of the proposed multi-modal emotion recognition method for environment emotion intensity perception. This promising tool not only achieves more accurate emotion perception for HRI systems but also provides an alternative approach to quantitatively assess environmental psychology.
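The modality-fusion step reported in the description (EEG and eye-movement sub-models combined per dimension) could take the form of weighted decision-level fusion, sketched below. The paper's actual fusion rule and weights are not reproduced here; `fuse_decisions` and `w_eeg` are hypothetical names introduced for illustration.

```python
import numpy as np

def fuse_decisions(p_eeg, p_eye, w_eeg=0.6):
    """Weighted decision-level fusion of two per-class probability vectors.

    p_eeg / p_eye: class-probability outputs of the EEG and eye-movement
    sub-models for one trial (same class order in both). w_eeg is an
    assumed fusion weight; it would normally be tuned on validation data.
    Returns the fused class index and the fused probability vector.
    """
    p_eeg = np.asarray(p_eeg, dtype=float)
    p_eye = np.asarray(p_eye, dtype=float)
    fused = w_eeg * p_eeg + (1.0 - w_eeg) * p_eye  # convex combination
    return int(np.argmax(fused)), fused
```

Running one fusion per dimension (one for arousal, one for valence) would yield the synchronous two-dimensional intensity output the description mentions.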
topic: electroencephalograph (EEG); eye movements; human-robot interaction (HRI); adolescents; environmental emotion perception
url: https://www.frontiersin.org/article/10.3389/fnbot.2019.00046/full