Auditory Stimulus Detection Partially Depends on Visuospatial Attentional Resources

Humans’ ability to detect relevant sensory information while engaged in a demanding task is crucial in daily life. Yet, limited attentional resources restrict information processing. To date, it is still debated whether there are distinct pools of attentional resources for each sensory modality and to what extent the process of multisensory integration depends on attentional resources. We addressed these two questions using a dual-task paradigm. Specifically, participants performed a multiple object tracking task and a detection task either separately or simultaneously. In the detection task, participants were required to detect visual, auditory, or audiovisual stimuli at varying stimulus intensities that were adjusted using a staircase procedure. We found that the tasks significantly interfered. However, the interference was about 50% lower when the tasks were performed in separate sensory modalities rather than in the same sensory modality, suggesting that attentional resources are partly shared. Moreover, we found that perceptual sensitivities were significantly improved for audiovisual stimuli relative to unisensory stimuli regardless of whether attentional resources were diverted to the multiple object tracking task or not. Overall, the present study supports the view that attentional resource allocation in multisensory processing is task-dependent and suggests that multisensory benefits are not dependent on attentional resources.


Bibliographic Details
Main Authors: Basil Wahn, Supriya Murali, Scott Sinnett, Peter König
Format: Article
Language: English
Published: SAGE Publishing 2017-01-01
Series: i-Perception
Online Access:https://doi.org/10.1177/2041669516688026
ISSN: 2041-6695