Emotion Recognition Using Deep Convolutional Neural Network with Large Scale Physiological Data

Bibliographic Details
Main Author: Sharma, Astha
Format: Others
Published: Scholar Commons 2018
Online Access:https://scholarcommons.usf.edu/etd/7570
https://scholarcommons.usf.edu/cgi/viewcontent.cgi?article=8767&context=etd
Description
Summary: Classification of emotions plays a very important role in affective computing and has real-world applications in fields as diverse as entertainment, medicine, defense, retail, and education. These applications include video games, virtual reality, pain recognition, lie detection, classification of Autism Spectrum Disorder (ASD), analysis of stress levels, and determining attention levels. This vast range of applications motivated us to study automatic emotion recognition, which can be performed using facial expression, speech, and physiological data. A person’s physiological signals, such as heart rate and blood pressure, are deeply linked with their emotional state and can be used to identify a variety of emotions; however, they are less frequently explored for emotion recognition than audiovisual signals such as facial expression and voice. In this thesis, we investigate a multimodal approach to emotion recognition using physiological signals, showing how these signals can be combined to accurately identify a wide range of emotions such as happiness, sadness, and pain. Our investigation uses deep convolutional neural networks, the current state of the art in supervised learning, on two publicly available databases, DEAP and BP4D+; we also compare gender-specific models of emotion. We achieved an average emotion recognition accuracy of 98.89% on BP4D+ and, on DEAP, 86.09% for valence, 90.61% for arousal, 90.48% for liking, and 90.95% for dominance. We also compare our results to the current state of the art, showing the superior performance of our method.
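
The abstract does not describe the network architecture itself, so as a rough illustration only, the sketch below shows the general kind of model involved: a 1-D convolutional network classifying windowed physiological signals. All layer sizes, the PhysioCNN name, and the 40-channel/128-sample input shape (loosely modeled on DEAP's 32 EEG plus 8 peripheral channels at 128 Hz) are hypothetical, not the thesis's actual configuration.

```python
import torch
import torch.nn as nn

class PhysioCNN(nn.Module):
    """Illustrative 1-D CNN over windowed physiological signals.

    Hypothetical layer sizes -- the abstract does not specify the
    thesis architecture. Input shape: (batch, n_channels, n_samples),
    e.g. 40 channels x 128 samples per one-second window.
    """
    def __init__(self, n_channels: int = 40, n_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 64, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(64, 128, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # global pooling -> fixed-size feature
        )
        self.classifier = nn.Linear(128, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.features(x).squeeze(-1)  # (batch, 128)
        return self.classifier(h)         # logits, e.g. low/high valence

# Example: one batch of 8 windows, 40 channels, 128 samples each
model = PhysioCNN()
logits = model(torch.randn(8, 40, 128))
print(logits.shape)  # torch.Size([8, 2])
```

A binary output head is shown because DEAP ratings for valence, arousal, liking, and dominance are commonly binarized into low/high classes, matching the per-dimension accuracies reported above; the thesis may use a different label scheme.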