Exploring combinations of auditory and visual stimuli for gaze-independent brain-computer interfaces.
For Brain-Computer Interface (BCI) systems that are designed for users with severe impairments of the oculomotor system, an appropriate mode of presenting stimuli to the user is crucial. To investigate whether multi-sensory integration can be exploited in the gaze-independent event-related potential...
| Main Authors: | Xingwei An, Johannes Höhne, Dong Ming, Benjamin Blankertz |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Public Library of Science (PLoS), 2014-01-01 |
| Series: | PLoS ONE |
| Online Access: | http://europepmc.org/articles/PMC4211702?pdf=render |
Similar Items
- Correction: Exploring Combinations of Auditory and Visual Stimuli for Gaze-Independent Brain-Computer Interfaces.
  by: PLOS ONE Staff
  Published: (2016-01-01)
- Exploring combinations of different color and facial expression stimuli for gaze-independent BCIs
  by: Long Chen, et al.
  Published: (2016-01-01)
- Affective Stimuli for an Auditory P300 Brain-Computer Interface
  by: Akinari Onishi, et al.
  Published: (2017-09-01)
- Towards user-friendly spelling with an auditory brain-computer interface: the CharStreamer paradigm.
  by: Johannes Höhne, et al.
  Published: (2014-01-01)
- A new auditory multi-class brain-computer interface paradigm: spatial hearing as an informative cue.
  by: Martijn Schreuder, et al.
  Published: (2010-01-01)