Synchronization and Redundancy: Implications for Robustness of Neural Learning and Decision Making

Learning and decision making in the brain are key processes critical to survival, and yet are processes implemented by nonideal biological building blocks that can impose significant error. We explore quantitatively how the brain might cope with this inherent source of error by taking advantage of two ubiquitous mechanisms, redundancy and synchronization. In particular we consider a neural process whose goal is to learn a decision function by implementing a nonlinear gradient dynamics. The dynamics, however, are assumed to be corrupted by perturbations modeling the error, which might be incurred due to limitations of the biology, intrinsic neuronal noise, and imperfect measurements. We show that error, and the associated uncertainty surrounding a learned solution, can be controlled in large part by trading off synchronization strength among multiple redundant neural systems against the noise amplitude. The impact of the coupling between such redundant systems is quantified by the spectrum of the network Laplacian, and we discuss the role of network topology in synchronization and in reducing the effect of noise. We discuss a range of situations in which the mechanisms we model arise in brain science and draw attention to experimental evidence suggesting that cortical circuits capable of implementing the computations of interest here can be found on several scales. Finally, simulations comparing theoretical bounds to the relevant empirical quantities show that the theoretical estimates we derive can be tight.
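
The following is a minimal, hypothetical numerical sketch (Python/NumPy), not code from the article: it simulates several redundant copies of a noisy gradient flow on a simple quadratic loss, diffusively coupled through the Laplacian of a ring graph, and prints how the spread of the copies around their average changes with the coupling gain. The function names (`ring_laplacian`, `simulate`) and the parameter choices (coupling gain `k`, noise amplitude `sigma`, ring topology) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def ring_laplacian(n):
    """Graph Laplacian of an undirected ring on n nodes."""
    A = np.zeros((n, n))
    for i in range(n):
        A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0
    return np.diag(A.sum(axis=1)) - A

def simulate(n=10, k=0.0, sigma=0.5, dt=1e-3, steps=20000, target=2.0):
    """Euler-Maruyama integration of n coupled noisy gradient systems:
    dx_i = -(x_i - target) dt - k * (L x)_i dt + sigma dW_i."""
    L = ring_laplacian(n)
    x = rng.normal(size=n)                    # random initial conditions
    for _ in range(steps):
        grad = x - target                     # gradient of 0.5 * (x - target)^2
        noise = sigma * np.sqrt(dt) * rng.normal(size=n)
        x = x + dt * (-grad - k * (L @ x)) + noise
    return x

# Spread of the redundant copies shrinks as the coupling gain grows.
for k in (0.0, 1.0, 10.0):
    x = simulate(k=k)
    print(f"k={k:5.1f}  mean={x.mean():+.3f}  spread across copies={x.std():.3f}")

# The Laplacian spectrum (notably the smallest nonzero eigenvalue) quantifies
# how effectively a given topology synchronizes the copies.
print("ring Laplacian spectrum:",
      np.round(np.sort(np.linalg.eigvalsh(ring_laplacian(10))), 3))
```

In this toy setting, stronger diffusive coupling damps the disagreement modes of the network faster than the noise can excite them, so the copies stay closer to their common average while still converging to the minimizer, mirroring in a crude way the trade-off between synchronization strength and noise amplitude described in the abstract.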

Bibliographic Details
Main Authors: Slotine, Jean-Jacques E. (Author), Bouvrie, Jacob Vincent (Author)
Other Authors: Massachusetts Institute of Technology. Department of Mechanical Engineering (Contributor), Massachusetts Institute of Technology. Nonlinear Systems Laboratory (Contributor)
Format: Article
Language:English
Published: MIT Press, 2012-01-20T20:24:49Z.
Subjects:
Online Access: Get fulltext
LEADER 02544 am a22002413u 4500
001 68625
042 |a dc 
100 1 0 |a Slotine, Jean-Jacques E.  |e author 
100 1 0 |a Massachusetts Institute of Technology. Department of Mechanical Engineering  |e contributor 
100 1 0 |a Massachusetts Institute of Technology. Nonlinear Systems Laboratory  |e contributor 
100 1 0 |a Slotine, Jean-Jacques E.  |e contributor 
700 1 0 |a Bouvrie, Jacob Vincent  |e author 
245 0 0 |a Synchronization and Redundancy: Implications for Robustness of Neural Learning and Decision Making 
260 |b MIT Press,   |c 2012-01-20T20:24:49Z. 
856 |z Get fulltext  |u http://hdl.handle.net/1721.1/68625 
520 |a Learning and decision making in the brain are key processes critical to survival, and yet are processes implemented by nonideal biological building blocks that can impose significant error. We explore quantitatively how the brain might cope with this inherent source of error by taking advantage of two ubiquitous mechanisms, redundancy and synchronization. In particular we consider a neural process whose goal is to learn a decision function by implementing a nonlinear gradient dynamics. The dynamics, however, are assumed to be corrupted by perturbations modeling the error, which might be incurred due to limitations of the biology, intrinsic neuronal noise, and imperfect measurements. We show that error, and the associated uncertainty surrounding a learned solution, can be controlled in large part by trading off synchronization strength among multiple redundant neural systems against the noise amplitude. The impact of the coupling between such redundant systems is quantified by the spectrum of the network Laplacian, and we discuss the role of network topology in synchronization and in reducing the effect of noise. We discuss a range of situations in which the mechanisms we model arise in brain science and draw attention to experimental evidence suggesting that cortical circuits capable of implementing the computations of interest here can be found on several scales. Finally, simulations comparing theoretical bounds to the relevant empirical quantities show that the theoretical estimates we derive can be tight. 
520 |a National Science Foundation (U.S.) (contract IIS-08-03293) 
520 |a United States. Office of Naval Research (contract N000140710625) 
520 |a Alfred P. Sloan Foundation (grant BR-4834) 
546 |a en_US 
655 7 |a Article 
773 |t Neural Computation