Comparing Neural Correlates of Human Emotions across Multiple Stimulus Presentation Paradigms

Most electroencephalography (EEG)-based emotion recognition systems rely on a single type of stimulus, such as videos, sounds, or images, to evoke emotions, and few studies have examined self-induced emotions. The question of whether different stimulus presentation paradigms for the same emotion produce any subject- and stimulus-independent neural correlates remains unanswered. Furthermore, although publicly available datasets are used in a large number of studies on EEG-based emotional state recognition, a central aim of this work is to classify emotions while subjects experience different stimulus presentation paradigms, which required new experiments. This paper presents a novel experimental study that recorded EEG data for three human emotional states (fear, neutral, and joy) evoked with four different stimulus presentation paradigms: emotional imagery, pictures, sounds, and audio–video movie clips. Features were extracted from the recorded EEG data with the common spatial pattern (CSP) method and classified with linear discriminant analysis (LDA). Experiments were conducted with twenty-five participants, and classification performance was evaluated across paradigms and spectral bands. With a few exceptions, all paradigms showed the best emotion recognition in the higher-frequency spectral ranges. Interestingly, joy was classified more reliably than fear. The average neural patterns for fear versus joy are presented as topographical maps based on the CSP spatial filters of averaged band-power changes for all four paradigms. Among the spectral bands, beta and alpha oscillations produced the largest number of significant results across the paradigms under consideration. Among brain regions, the frontal lobe produced the most significant results irrespective of paradigm and spectral band, and the temporal site also contributed statistically significant findings. To the best of our knowledge, no previous study of EEG emotion recognition has considered four different stimulus paradigms. This work contributes toward the design of EEG-based human emotion recognition systems that could work effectively in a range of real-time scenarios.
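The CSP-plus-LDA pipeline the abstract describes can be sketched roughly as follows. This is a minimal two-class illustration with NumPy/SciPy; the function names, the log-variance feature choice, and the simple Fisher-LDA formulation are assumptions for the sketch, not the authors' exact implementation:

```python
import numpy as np
from scipy.linalg import eigh

def csp_filters(X_a, X_b, n_filters=4):
    """Common spatial pattern filters for two classes of EEG trials.

    X_a, X_b: arrays of shape (n_trials, n_channels, n_samples).
    Returns a (n_filters, n_channels) matrix of spatial filters.
    """
    avg_cov = lambda X: np.mean([np.cov(trial) for trial in X], axis=0)
    C_a, C_b = avg_cov(X_a), avg_cov(X_b)
    # Generalized eigenproblem C_a w = lambda (C_a + C_b) w; eigenvalues near
    # 1 (or 0) mark directions where class a (or b) carries most variance.
    vals, vecs = eigh(C_a, C_a + C_b)
    half = n_filters // 2
    picks = np.r_[np.arange(half), np.arange(len(vals) - half, len(vals))]
    return vecs[:, picks].T

def csp_features(W, X):
    """Log of normalized variance (band power) after spatial filtering."""
    Z = np.einsum('fc,tcs->tfs', W, X)      # spatially filtered trials
    var = Z.var(axis=2)
    return np.log(var / var.sum(axis=1, keepdims=True))

def lda_fit(F_a, F_b):
    """Two-class Fisher LDA: weight vector and decision threshold."""
    mu_a, mu_b = F_a.mean(axis=0), F_b.mean(axis=0)
    S_w = np.cov(F_a.T) + np.cov(F_b.T)     # pooled within-class scatter
    w = np.linalg.solve(S_w, mu_a - mu_b)
    b = w @ (mu_a + mu_b) / 2               # midpoint decision threshold
    return w, b                              # predict class a when w @ f > b
```

In practice the trials would first be band-pass filtered into the spectral band of interest (e.g., alpha or beta), since CSP operates on band-limited signals; the three-class problem in the paper would additionally require a pairwise or one-vs-rest decomposition.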

Bibliographic Details
Main Authors: Naveen Masood, Humera Farooq
Format: Article
Language: English
Published: MDPI AG, 2021-05-01
Series: Brain Sciences
Subjects: classification; common spatial pattern (CSP); electroencephalography (EEG); emotional imagery; emotions; feature extraction
Online Access: https://www.mdpi.com/2076-3425/11/6/696
DOAJ Record ID: doaj-90d4ce551dea41b4a609fd7517a10c64
ISSN: 2076-3425
Volume/Issue: Brain Sciences, Volume 11, Issue 6, Article 696
DOI: 10.3390/brainsci11060696
Author Affiliations: Naveen Masood (Electrical Engineering Department, Bahria University, Karachi 75260, Pakistan); Humera Farooq (Computer Science Department, Bahria University, Karachi 44000, Pakistan)