Deep Neural Network for Visual Stimulus-Based Reaction Time Estimation Using the Periodogram of Single-Trial EEG

Bibliographic Details
Main Authors: Mohammad Samin Nur Chowdhury, Arindam Dutta, Matthew Kyle Robison, Chris Blais, Gene Arnold Brewer, Daniel Wesley Bliss
Format: Article
Language: English
Published: MDPI AG 2020-10-01
Series: Sensors
Online Access: https://www.mdpi.com/1424-8220/20/21/6090
Description
Summary: Multiplexed deep neural networks (DNNs) have yielded high-performance predictive models that are gaining popularity for decoding brain waves, extensively collected in the form of electroencephalogram (EEG) signals. In this paper, to the best of our knowledge, we introduce a first-ever DNN-based generalized approach to estimating reaction time (RT) using the periodogram representation of single-trial EEG in a visual stimulus-response experiment with 48 participants. We designed a Fully Connected Neural Network (FCNN) and a Convolutional Neural Network (CNN) to classify and predict RTs for each trial. Although deep neural networks are best known for classification applications, by cascading the FCNN/CNN with a Random Forest model we built a robust regression-based estimator of RT. With the FCNN model, the accuracies obtained for binary and 3-class classification were 93% and 76%, respectively, which further improved with the CNN (94% and 78%, respectively). The regression-based approach predicted RTs with correlation coefficients (CC) of 0.78 and 0.80 for the FCNN and CNN, respectively. Investigating further, we found that the left central region as well as the parietal and occipital lobes were crucial for predicting RT, with significant activity in the theta and alpha frequency bands.
ISSN:1424-8220
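
The pipeline the summary describes (periodogram features from single-trial EEG, a deep network, and a Random Forest cascade for RT regression) can be sketched in a few lines of Python. The snippet below is a minimal illustration under stated assumptions, not the authors' implementation: it uses synthetic data in place of the 48-participant dataset, scikit-learn's MLPRegressor as a stand-in for the paper's FCNN, an assumed 256 Hz sampling rate, and one plausible reading of the cascade in which the trained network's hidden-layer activations are fed to the forest.

```python
# Hedged sketch: periodogram features from single-trial EEG, a small fully
# connected network, and an FCNN -> Random Forest cascade for RT regression.
# Synthetic data stands in for the real dataset; the cascade detail (feeding
# hidden-layer activations to the forest) is an assumption, not the authors'
# documented design.
import numpy as np
from scipy.signal import periodogram
from sklearn.neural_network import MLPRegressor
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
fs = 256                                  # assumed sampling rate (Hz)
n_trials, n_channels, n_samples = 500, 32, 2 * fs

# Synthetic stand-in: EEG trials and reaction times (seconds).
eeg = rng.standard_normal((n_trials, n_channels, n_samples))
rt = rng.uniform(0.3, 1.2, size=n_trials)

# Periodogram of each channel; keep the theta/alpha range (4-13 Hz)
# and flatten into one feature vector per trial.
freqs, pxx = periodogram(eeg, fs=fs, axis=-1)
band = (freqs >= 4) & (freqs <= 13)
features = pxx[:, :, band].reshape(n_trials, -1)

X_tr, X_te, y_tr, y_te = train_test_split(features, rt, random_state=0)

# Stand-in FCNN (two hidden layers); the paper's exact architecture
# is not specified in this record.
fcnn = MLPRegressor(hidden_layer_sizes=(128, 32), max_iter=500,
                    random_state=0).fit(X_tr, y_tr)

def hidden_activations(mlp, X):
    """Forward pass through the hidden layers only (ReLU is the default)."""
    h = X
    for w, b in zip(mlp.coefs_[:-1], mlp.intercepts_[:-1]):
        h = np.maximum(h @ w + b, 0.0)
    return h

# Cascade: reuse the trained network's hidden representation as input
# to a Random Forest regressor predicting RT.
forest = RandomForestRegressor(n_estimators=200, random_state=0)
forest.fit(hidden_activations(fcnn, X_tr), y_tr)

pred = forest.predict(hidden_activations(fcnn, X_te))
cc = np.corrcoef(pred, y_te)[0, 1]
print(f"correlation coefficient on held-out trials: {cc:.2f}")
```

Restricting the periodogram to 4-13 Hz mirrors the theta and alpha bands the summary reports as most informative; the cascade lets the forest model nonlinear structure in the learned representation rather than in the raw spectra.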