Multisensory control of multimodal behavior: do the legs know what the tongue is doing?

Bibliographic Details
Main Authors: Jesse D Cushman, Daniel B Aharoni, Bernard Willers, Pascal Ravassard, Ashley Kees, Cliff Vuong, Briana Popeney, Katsushi Arisaka, Mayank R Mehta
Format: Article
Language: English
Published: Public Library of Science (PLoS) 2013-01-01
Series: PLoS ONE
Online Access: http://europepmc.org/articles/PMC3817119?pdf=render
Description
Summary: Understanding adaptive behavior requires precisely controlled presentation of multisensory stimuli combined with simultaneous measurement of multiple behavioral modalities. Hence, we developed a virtual reality apparatus that allows simultaneous measurement of reward checking, a commonly used measure in associative learning paradigms, and navigational behavior, along with precisely controlled presentation of visual, auditory, and reward stimuli. Rats performed a virtual spatial navigation task analogous to the Morris maze, in which only distal visual or auditory cues provided spatial information. Spatial navigation and reward-checking maps showed experience-dependent learning and were in register for distal visual cues. However, they showed a dissociation, whereby distal auditory cues failed to support spatial navigation but did support spatially localized reward checking. These findings indicate that rats can navigate in virtual space using only distal visual cues, without significant vestibular or other sensory inputs. Furthermore, they reveal a simultaneous dissociation between two reward-driven behaviors.
ISSN: 1932-6203