Event-Related Potentials Reflect Speech-Relevant Somatosensory-Auditory Interactions

An interaction between orofacial somatosensation and the perception of speech was demonstrated in recent psychophysical studies (Ito et al. 2009; Ito and Ostry 2009). To explore further the neural mechanisms of the speech-related somatosensory-auditory interaction, we assessed to what extent multisensory evoked potentials reflect multisensory interaction during speech perception. We also examined the dynamic modulation of multisensory integration resulting from relative timing differences between the onsets of the two sensory stimuli. We recorded event-related potentials from 64 scalp sites in response to somatosensory stimulation alone, auditory stimulation alone, and combined somatosensory and auditory stimulation. In the multisensory condition, the timing of the two stimuli was either simultaneous or offset by 90 ms (lead and lag). We found evidence of multisensory interaction, with the amplitude of the multisensory evoked potential reliably different from the sum of the two unisensory potentials around the first peak of the multisensory response (100–200 ms). The magnitude of the evoked potential difference varied as a function of the relative timing between the stimuli in the interval from 170 to 200 ms following somatosensory stimulation. The results demonstrate clear multisensory convergence and suggest a dynamic modulation of multisensory interaction during speech.


Bibliographic Details
Main Authors: Takayuki Ito, Vincent L Gracco, David J Ostry
Format: Article
Language: English
Published: SAGE Publishing 2011-10-01
Series:i-Perception
Online Access:https://doi.org/10.1068/ic803
ISSN: 2041-6695
Author Affiliations: Takayuki Ito (Haskins Laboratories), Vincent L Gracco (McGill University), David J Ostry (McGill University)