Audition and vision share spatial attentional resources, yet attentional load does not disrupt audiovisual integration

Bibliographic Details
Main Authors: Basil Wahn, Peter König
Format: Article
Language: English
Published: Frontiers Media S.A. 2015-07-01
Series: Frontiers in Psychology
Subjects:
Online Access: http://journal.frontiersin.org/Journal/10.3389/fpsyg.2015.01084/full
Description
Summary: Humans continuously receive and integrate information from several sensory modalities. However, attentional resources limit the amount of information that can be processed. It is not yet clear how attentional resources and multisensory processing are interrelated. Specifically, the following questions arise: 1) Are there distinct spatial attentional resources for each sensory modality? and 2) Does attentional load affect multisensory integration? We investigated these questions using a dual task paradigm: participants performed two spatial tasks (a multiple object tracking task and a localization task), either separately (single task condition) or simultaneously (dual task condition). In the multiple object tracking task, participants visually tracked a small subset of several randomly moving objects. In the localization task, participants received either visual, auditory, or redundant visual and auditory location cues. In the dual task condition, we found a substantial decrease in participants’ performance relative to the single task condition. Importantly, participants performed equally well in the dual task condition regardless of the location cues’ modality. This result suggests that receiving spatial information from different modalities does not facilitate performance, indicating shared spatial attentional resources for the auditory and visual modalities. Furthermore, we found that participants integrated redundant multisensory information similarly even when they experienced additional attentional load in the dual task condition. Overall, these findings suggest that 1) visual and auditory spatial attentional resources are shared and 2) audiovisual integration of spatial information occurs in a pre-attentive processing stage.
ISSN:1664-1078