Covert Intention to Answer “Yes” or “No” Can Be Decoded from Single-Trial Electroencephalograms (EEGs)

Interpersonal communication is based on questions and answers, and the simplest and most useful case is the binary “yes or no” question and answer. The purpose of this study is to show that it is possible to decode the intention to answer “yes” or “no” from multichannel single-trial electroencephalograms (EEGs), which were recorded while subjects covertly answered self-referential questions with either “yes” or “no.” The intention-decoding algorithm consists of a common spatial pattern (CSP) and a support vector machine (SVM), employed for feature extraction and pattern classification, respectively, after dividing the overall time-frequency range into subwindows of 200 ms × 2 Hz. The decoding accuracy using the information within each subwindow was investigated to find useful temporal and spectral ranges and was found to be highest for 800–1200 ms in the alpha band and 200–400 ms in the theta band. When features from multiple subwindows were used together, the accuracy increased significantly, up to ∼86%. The most useful features for the “yes/no” discrimination were concentrated in the right frontal region in the theta band and the right centroparietal region in the alpha band, which may reflect the violation of autobiographical facts and the higher cognitive load for “no” compared with “yes.” Our task requires subjects to answer self-referential questions just as in interpersonal conversation, without any self-regulation of brain signals or high cognitive effort, and the “yes” and “no” answers are decoded directly from brain activity. This implies that “mind reading” in a true sense is feasible. Beyond its contribution to a fundamental understanding of the neural mechanisms of human intention, the decoding of “yes” or “no” from brain activity may eventually lead to a natural brain-computer interface.
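The decoding pipeline described in the abstract (CSP for spatial feature extraction, an SVM for classification, applied within a time-frequency subwindow) can be sketched roughly as follows. This is a minimal illustration on synthetic data, not the authors' implementation: the channel count, trial count, window length, and number of CSP filter pairs are assumptions for the example.

```python
import numpy as np
from scipy.linalg import eigh
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic stand-in for one band-pass-filtered 200 ms x 2 Hz subwindow of
# epoched EEG: (trials, channels, samples). Values here are illustrative.
X = rng.standard_normal((80, 16, 50))
y = rng.integers(0, 2, 80)  # 0 = "no", 1 = "yes" (labels are hypothetical)

def csp_filters(X, y, n_pairs=3):
    """Common spatial patterns via a generalized eigendecomposition."""
    # Average per-trial channel covariance for each class.
    covs = [np.mean([np.cov(trial) for trial in X[y == c]], axis=0)
            for c in (0, 1)]
    # Solve cov0 w = lambda (cov0 + cov1) w; the extreme eigenvalues give
    # filters that maximize variance for one class and minimize it for the other.
    vals, vecs = eigh(covs[0], covs[0] + covs[1])
    pick = np.r_[np.arange(n_pairs), np.arange(len(vals) - n_pairs, len(vals))]
    return vecs[:, pick]  # (channels, 2 * n_pairs)

W = csp_filters(X, y)

def features(X, W):
    # Log-variance of the CSP-filtered signals, the standard CSP feature.
    Z = np.einsum('ck,tcs->tks', W, X)
    return np.log(Z.var(axis=2))

clf = SVC(kernel='linear').fit(features(X, W), y)
```

In the study, this per-subwindow step would be repeated across the time-frequency grid, and features from the most discriminative subwindows combined into a single classifier input.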


Bibliographic Details
Main Authors: Jeong Woo Choi, Kyung Hwan Kim (Department of Biomedical Engineering, Yonsei University, Wonju 26493, Republic of Korea)
Format: Article
Language: English
Published: Hindawi Limited, 2019-01-01
Series: Computational Intelligence and Neuroscience
Online Access: http://dx.doi.org/10.1155/2019/4259369
ISSN: 1687-5265, 1687-5273