Looking at the ventriloquist: visual outcome of eye movements calibrates sound localization.

A general problem in learning is how the brain determines what lesson to learn (and what lessons not to learn). For example, sound localization is a behavior that is partially learned with the aid of vision. This process requires correctly matching a visual location to that of a sound. This is an intrinsically circular problem when sound location is itself uncertain and the visual scene is rife with possible visual matches. Here, we develop a simple paradigm using visual guidance of sound localization to gain insight into how the brain confronts this type of circularity. We tested two competing hypotheses: (1) the brain guides sound-location learning based on the synchrony or simultaneity of auditory-visual stimuli, potentially involving a Hebbian associative mechanism; or (2) the brain uses a 'guess and check' heuristic in which visual feedback obtained after an eye movement to a sound alters future performance, perhaps by recruiting the brain's reward-related circuitry. We assessed the effects of exposure to visual stimuli spatially mismatched from sounds on performance of an interleaved auditory-only saccade task. We found that when humans and monkeys were provided the visual stimulus asynchronously with the sound, but as feedback to an auditory-guided saccade, they shifted their subsequent auditory-only performance toward the direction of the visual cue by 1.3-1.7 degrees, or 22-28% of the original 6-degree visual-auditory mismatch. In contrast, when the visual stimulus was presented synchronously with the sound but extinguished too quickly to provide this feedback, there was little change in subsequent auditory-only performance. Our results suggest that the outcome of our own actions is vital to localizing sounds correctly. Contrary to previous expectations, visual calibration of auditory space does not appear to require visual-auditory associations based on synchrony/simultaneity.
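As a quick check of the effect size quoted above (a back-of-the-envelope calculation from the numbers in the abstract, not from the paper's underlying data), the reported 1.3-1.7 degree shift corresponds to the stated fraction of the induced 6-degree visual-auditory mismatch:

\[
\frac{1.3^\circ}{6^\circ} \approx 0.22, \qquad \frac{1.7^\circ}{6^\circ} \approx 0.28
\]

That is, subjects compensated for roughly a quarter of the discrepancy following feedback-style exposure.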

Bibliographic Details
Main Authors: Daniel S Pages, Jennifer M Groh
Format: Article
Language: English
Published: Public Library of Science (PLoS), 2013-01-01
Series: PLoS ONE, Vol. 8, Iss. 8, e72562 (2013)
ISSN: 1932-6203
DOI: 10.1371/journal.pone.0072562
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/pmid/24009691/?tool=EBI