Temporal structure in audiovisual sensory selection.
In natural environments, sensory information is embedded in temporally contiguous streams of events. This is typically the case when seeing and listening to a speaker or when engaged in scene analysis. In such contexts, two mechanisms are needed to single out and build a reliable representation of an event (or object): the temporal parsing of information and the selection of relevant information in the stream. It has previously been shown that rhythmic events naturally build temporal expectations that improve sensory processing at predictable points in time. Here, we asked to what extent temporal regularities can improve the detection and identification of events across sensory modalities. To do so, we used a dynamic visual conjunction search task accompanied by auditory cues that were either synchronized or desynchronized with the color change of the target (a horizontal or vertical bar). Sounds synchronized with the visual target improved search efficiency at temporal rates below 1.4 Hz but did not affect efficiency above that stimulation rate. Desynchronized auditory cues consistently impaired visual search below 3.3 Hz. Our results are interpreted in the context of the Dynamic Attending Theory: specifically, we suggest that a cognitive operation structures events in time irrespective of the sensory modality of input. Our results further support and specify recent neurophysiological findings by showing strong temporal selectivity for audiovisual integration in the auditory-driven improvement of visual search efficiency.
Main Authors: Anne Kösem, Virginie van Wassenhove
Format: Article
Language: English
Published: Public Library of Science (PLoS), 2012-01-01
Series: PLoS ONE
Online Access: http://europepmc.org/articles/PMC3400621?pdf=render
DOI: 10.1371/journal.pone.0040936
ISSN: 1932-6203