Early modality-specific somatosensory cortical regions are modulated by attended visual stimuli: interaction of vision, touch and behavioral intent.

Bibliographic Details
Main Authors: W. Richard Staines, Christina Popovich, Jennifer K. Dionne, Meaghan S. Adams
Format: Article
Language: English
Published: Frontiers Media S.A. 2014-04-01
Series: Frontiers in Psychology
Subjects:
Online Access:http://journal.frontiersin.org/Journal/10.3389/fpsyg.2014.00351/full
Description
Summary: Bimodal interactions between relevant visual and tactile inputs can facilitate attentional modulation at early stages in somatosensory cortices to achieve goal-oriented behaviors. However, the specific contribution of each sensory system during attentional processing and, importantly, how these interact with the required behavioral motor goals remain unclear. Here we used EEG and event-related potentials (ERPs) to test the hypothesis that activity in modality-specific somatosensory cortical regions would be enhanced by task-relevant bimodal (visual-tactile) stimuli and that the degree of modulation would depend on the difficulty of the associated sensorimotor task demands. Tactile stimuli were discrete vibrations to the index finger and visual stimuli were horizontal bars on a computer screen, both with random amplitudes. Streams of unimodal (tactile) and crossmodal (visual and tactile) stimuli were presented in random order, and participants were instructed to attend to one stimulus type (unimodal or crossmodal). Responses involved either indicating the presence of an attended stimulus (detect) or integrating and summing the two stimulus amplitudes using a pressure-sensitive ball (grade). Force-amplitude associations were learned in a training session, and no feedback was provided during the task. ERPs were time-locked to tactile stimuli and extracted for early modality-specific components (P50, P100, N140). The P50 was enhanced for attended bimodal (visual-tactile) stimuli. This enhancement was maximal when the motor requirements involved integration of the two stimuli in the grade task and when the visual stimulus occurred 100 ms before the tactile stimulus.
These results suggest that visual information relevant for movement modulates early somatosensory processing, and that the motor behavioral context influences this modulation, likely through the interaction of top-down attentional and motor-preparatory systems with bottom-up crossmodal influences.
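The ERP procedure described in the summary (epoching EEG time-locked to tactile stimulus onsets, baseline-correcting, and averaging) can be sketched as follows. This is a minimal illustrative implementation, not the authors' analysis pipeline; the function name, parameters, and window choices are assumptions for the sketch (it also assumes a negative pre-stimulus baseline window, i.e. tmin < 0).

```python
import numpy as np

def extract_erp(eeg, onsets, sfreq, tmin=-0.1, tmax=0.3):
    """Average single-channel EEG epochs time-locked to stimulus onsets.

    eeg    : 1-D array of samples for one channel.
    onsets : stimulus onset times in seconds.
    sfreq  : sampling frequency in Hz.
    tmin   : epoch start relative to onset (negative = pre-stimulus).
    tmax   : epoch end relative to onset.

    Returns the baseline-corrected ERP (mean across epochs).
    """
    start = int(tmin * sfreq)  # negative offset in samples
    stop = int(tmax * sfreq)
    epochs = []
    for t in onsets:
        i = int(round(t * sfreq))
        # Skip onsets whose window falls outside the recording
        if i + start < 0 or i + stop > len(eeg):
            continue
        ep = eeg[i + start : i + stop].astype(float)
        # Baseline correction: subtract the mean of the pre-stimulus interval
        ep = ep - ep[:-start].mean()
        epochs.append(ep)
    return np.mean(epochs, axis=0)
```

Components such as the P50 would then be measured from the averaged waveform, e.g. as the peak amplitude within a latency window around 50 ms post-stimulus.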
ISSN: 1664-1078