Investigating the Use of Pretrained Convolutional Neural Network on Cross-Subject and Cross-Dataset EEG Emotion Recognition

The electroencephalogram (EEG) is highly attractive for emotion recognition studies because of its resistance to deceptive human actions. This is one of the most significant advantages of brain signals over visual or speech signals in the emotion recognition context. A major challenge in EEG-based emotion recognition is that EEG recordings exhibit varying distributions across different people, as well as for the same person at different times. This nonstationary nature of EEG limits its accuracy when subject independence is the priority. The aim of this study is to increase subject-independent recognition accuracy by exploiting pretrained state-of-the-art Convolutional Neural Network (CNN) architectures. Unlike similar studies that extract spectral band power features from the EEG readings, raw EEG data is used in this study after windowing, pre-adjustments and normalization. Removing manual feature extraction from the training pipeline avoids the risk of discarding hidden features in the raw data and lets the deep neural network uncover unknown features on its own. To improve classification accuracy further, a median filter is used to eliminate false detections along a prediction interval of emotions. This method yields a mean cross-subject accuracy of 86.56% and 78.34% on the Shanghai Jiao Tong University Emotion EEG Dataset (SEED) for two and three emotion classes, respectively. It also yields a mean cross-subject accuracy of 72.81% on the Database for Emotion Analysis using Physiological Signals (DEAP) and 81.8% on the Loughborough University Multimodal Emotion Dataset (LUMED) for two emotion classes. Furthermore, the recognition model trained on the SEED dataset was tested on the DEAP dataset, yielding a mean prediction accuracy of 58.1% across all subjects and emotion classes. The results show that, in terms of classification accuracy, the proposed approach is superior to, or on par with, the reference subject-independent EEG emotion recognition studies identified in the literature, and it has limited complexity because the need for feature extraction is eliminated.
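The abstract names four processing stages: windowing the raw EEG, normalization, per-window classification with a pretrained CNN, and median filtering of the predicted emotion labels. The Python sketch below only illustrates the shape of such a pipeline; the window length, overlap, the per-channel z-score normalization and the random stand-in classifier are illustrative assumptions, not values or components taken from the paper, and the pretrained-CNN stage is represented by a placeholder.

# Minimal sketch of the pipeline stages stated in the abstract, under assumed parameters:
# (1) window the raw EEG, (2) normalize, (3) classify each window (stand-in here),
# (4) median-filter the per-window label sequence to suppress spurious detections.
import numpy as np
from scipy.signal import medfilt

def window_eeg(eeg, win_len, step):
    """Split a (channels, samples) EEG recording into overlapping windows.

    Returns an array of shape (n_windows, channels, win_len).
    """
    n_channels, n_samples = eeg.shape
    starts = range(0, n_samples - win_len + 1, step)
    return np.stack([eeg[:, s:s + win_len] for s in starts])

def normalize(windows):
    """Z-score each window per channel (an assumed normalization choice)."""
    mean = windows.mean(axis=-1, keepdims=True)
    std = windows.std(axis=-1, keepdims=True) + 1e-8
    return (windows - mean) / std

def smooth_predictions(labels, kernel_size=5):
    """Median-filter the per-window emotion labels along the prediction interval."""
    return medfilt(labels.astype(float), kernel_size=kernel_size).astype(int)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    eeg = rng.standard_normal((32, 128 * 60))          # 32 channels, 60 s at 128 Hz (assumed)
    windows = normalize(window_eeg(eeg, win_len=128, step=64))

    # Stand-in for the pretrained-CNN classifier described in the abstract:
    # random two-class labels, used only so the smoothing step can be demonstrated.
    raw_labels = rng.integers(0, 2, size=len(windows))
    print(smooth_predictions(raw_labels)[:20])

The median-filter kernel size controls how many consecutive windows an isolated prediction must persist across before it survives smoothing; the odd kernel of 5 used here is an arbitrary choice for the sketch.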

Bibliographic Details
Main Authors: Yucel Cimtay, Erhan Ekmekcioglu
Format: Article
Language: English
Published: MDPI AG, 2020-04-01
Series: Sensors
ISSN: 1424-8220
DOI: 10.3390/s20072034
Affiliation: Institute for Digital Technologies, Loughborough University London, London E20 3BS, UK
Subjects: EEG; emotion recognition; pretrained models; convolutional neural network; dense layer; subject independency
Online Access: https://www.mdpi.com/1424-8220/20/7/2034