Attention Optimization Method for EEG via the TGAM

Since the beginning of the 21st century, noninvasive brain-computer interfaces (BCIs) have developed rapidly, and brain-computer devices have gradually moved from the laboratory to the mass market. Among them, the TGAM (ThinkGear ASIC Module) and its encapsulated (built-in) algorithm have been adopted by many research teams and faculty members around the world. However, because of the module's limited development cost, the built-in algorithm performs poorly: its attention output fluctuates widely, its latency is high, and its accuracy is low. This paper proposes an attention optimization algorithm, based on the TGAM, for EEG data feedback. Experimental results demonstrate that the proposed algorithm, at the same or even lower latency and without changing the module's built-in algorithm, significantly improves the attention output, greatly improving the stability and accuracy of the data and achieving better results in practical applications.
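
The paper's optimization algorithm itself is not reproduced in this record, so the sketch below is only a minimal illustration of the kind of post-processing the abstract describes: reading attention values from the TGAM and stabilizing the fluctuating 0-100 output without touching the module's built-in algorithm. The packet layout (0xAA 0xAA sync, length byte, payload, inverted-sum checksum; row code 0x04 = attention, 0x02 = poor signal) follows NeuroSky's publicly documented ThinkGear stream protocol; the helper names (parse_packets, AttentionSmoother, _pkt) and the smoothing choice (signal-quality gating plus an exponential moving average) are illustrative assumptions, not the authors' method.

```python
# Sketch: parse TGAM ThinkGear packets and stabilize the attention stream.
# Packet layout per NeuroSky's published ThinkGear stream protocol:
#   0xAA 0xAA | plength | payload (plength bytes) | checksum
# where checksum = (~sum(payload)) & 0xFF. Single-byte payload rows used here:
#   0x02 <poor_signal>  (0 = clean electrode contact, 200 = no contact)
#   0x04 <attention>    (eSense attention, 0-100; 0 = unreliable)
# The smoothing step below is an illustrative assumption, not the
# optimization algorithm proposed in the paper.

SYNC = 0xAA

def parse_packets(stream: bytes):
    """Yield a {code: value} dict for each checksum-valid packet."""
    i = 0
    while i + 3 < len(stream):
        if stream[i] != SYNC or stream[i + 1] != SYNC:
            i += 1                      # not at a sync pair: slide forward
            continue
        plength = stream[i + 2]
        end = i + 3 + plength           # index of the checksum byte
        if plength > 169 or end >= len(stream):
            i += 1
            continue
        payload = stream[i + 3:end]
        if (~sum(payload)) & 0xFF != stream[end]:
            i += 1                      # bad checksum: resynchronize
            continue
        row, j = {}, 0
        while j + 1 < len(payload):     # single-byte rows only in this sketch
            row[payload[j]] = payload[j + 1]
            j += 2
        yield row
        i = end + 1

class AttentionSmoother:
    """Quality-gated exponential moving average over raw attention values."""
    def __init__(self, alpha=0.2, max_poor_signal=50):
        self.alpha = alpha
        self.max_poor_signal = max_poor_signal
        self.level = None

    def update(self, attention, poor_signal):
        # Discard samples the module itself flags as unreliable.
        if attention == 0 or poor_signal > self.max_poor_signal:
            return self.level
        self.level = (attention if self.level is None
                      else self.alpha * attention + (1 - self.alpha) * self.level)
        return self.level

# Two synthetic packets: poor_signal=0 with attention=60, then attention=90.
def _pkt(*payload):
    return bytes([SYNC, SYNC, len(payload), *payload, (~sum(payload)) & 0xFF])

stream = _pkt(0x02, 0, 0x04, 60) + _pkt(0x02, 0, 0x04, 90)
smoother = AttentionSmoother()
for row in parse_packets(stream):
    if 0x04 in row:
        print(smoother.update(row[0x04], row.get(0x02, 0)))  # 60, then 66.0
```

Gating on the module's own poor-signal flag keeps electrode-contact artifacts out of the average, and the EMA weight alpha trades smoothness against added delay, which is exactly the stability-versus-latency trade-off the abstract highlights.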

Bibliographic Details
Main Authors: Yu Wu (Glasgow College, University of Electronic Science and Technology of China, 611731, China); Ning Xie (Center of Future Media, School of Computer Science and Engineering, University of Electronic Science and Technology of China, 611731, China)
Format: Article
Language: English
Published: Hindawi Limited, 2020-01-01
Series: Computational and Mathematical Methods in Medicine
ISSN: 1748-670X, 1748-6718
Online Access: http://dx.doi.org/10.1155/2020/6427305