Neurons as Canonical Correlation Analyzers

Full description

Normative models of neural computation offer simplified yet lucid mathematical descriptions of murky biological phenomena. Previously, online Principal Component Analysis (PCA) was used to model a network of single-compartment neurons, accounting for weighted summation of upstream neural activity in the soma and for Hebbian/anti-Hebbian synaptic learning rules. However, synaptic plasticity in biological neurons often depends on the integration of synaptic currents over a dendritic compartment rather than on the total current in the soma. Motivated by this observation, we model a network of pyramidal neurons using online Canonical Correlation Analysis (CCA). Given two related datasets, represented by distal and proximal dendritic inputs, CCA projects them onto the subspace that maximizes the correlation between their projections. First, adopting a normative approach and starting from a single-channel CCA objective function, we derive an online gradient-based optimization algorithm whose steps can be interpreted as the operation of a pyramidal neuron. To model networks of pyramidal neurons, we introduce a novel multi-channel CCA objective function and derive from it an online gradient-based optimization algorithm whose steps can be interpreted as the operation of a pyramidal neuron network, including its architecture, dynamics, and synaptic learning rules. Next, we model a neuron with more than two dendritic compartments by deriving its operation from a known objective function for multi-view CCA. Finally, we confirm the functionality of our networks via numerical simulations. Overall, our work presents a simplified but informative abstraction of learning in a pyramidal neuron network and demonstrates how such networks can integrate multiple sources of inputs.
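As a rough, generic illustration of the CCA objective named in the title (a textbook batch computation, not the online algorithm derived in the article): after centering and whitening two data matrices, the canonical correlations are the singular values of the cross-covariance of the whitened views. A minimal sketch on synthetic data, where the two "views" stand in for distal and proximal inputs sharing one latent signal (all variable names here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two "views" sharing a common latent signal z (stand-ins for the
# distal and proximal dendritic inputs in the abstract).
n = 2000
z = rng.standard_normal(n)
X = np.column_stack([z, rng.standard_normal(n)]) + 0.1 * rng.standard_normal((n, 2))
Y = np.column_stack([rng.standard_normal(n), z]) + 0.1 * rng.standard_normal((n, 2))

def top_canonical_correlation(X, Y):
    """Largest canonical correlation between two centered data matrices."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    # The left singular vectors give a whitened basis for each view;
    # singular values of the cross-product of the whitened views are
    # exactly the canonical correlations.
    Ux = np.linalg.svd(X, full_matrices=False)[0]
    Uy = np.linalg.svd(Y, full_matrices=False)[0]
    return np.linalg.svd(Ux.T @ Uy, compute_uv=False)[0]

rho = top_canonical_correlation(X, Y)
print(rho)  # close to 1.0, since the views share the latent z
```

This batch computation is only the objective being optimized; the article's contribution is deriving online, neurally interpretable algorithms for it.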

Bibliographic Details
Main Authors: Cengiz Pehlevan (John A. Paulson School of Engineering and Applied Sciences and Center for Brain Science, Harvard University, Cambridge, MA, United States), Xinyuan Zhao (Center for Neural Science, New York University, New York, NY, United States), Anirvan M. Sengupta (Rutgers, The State University of New Jersey, New Brunswick, NJ, United States), Dmitri Chklovskii (Center for Computational Biology, Flatiron Institute, New York, NY, United States; Langone Medical Center, New York University, New York, NY, United States)
Format: Article
Language: English
Published: Frontiers Media S.A., 2020-06-01
Series: Frontiers in Computational Neuroscience (ISSN 1662-5188)
Subjects: neural networks; Canonical Correlation Analysis (CCA); Hebbian plasticity; pyramidal neuron; biologically plausible learning
Online Access: https://www.frontiersin.org/article/10.3389/fncom.2020.00055/full
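The article derives online, gradient-based learning rules from CCA objectives. As a hedged sketch only (this is not the rule derived in the paper): for inputs that are already approximately white, the top CCA directions coincide with the top singular vector pair of the cross-covariance, which a cross-Hebbian update with an Oja-style decay can track one sample at a time, using only quantities local to each weight vector:

```python
import numpy as np

rng = np.random.default_rng(1)
eta = 0.01                      # learning rate (illustrative value)
wx = rng.standard_normal(2)     # weight vector for the first view
wy = rng.standard_normal(2)     # weight vector for the second view

for _ in range(20000):
    z = rng.standard_normal()   # shared latent driving both views
    x = np.array([z, rng.standard_normal()]) + 0.1 * rng.standard_normal(2)
    y = np.array([rng.standard_normal(), z]) + 0.1 * rng.standard_normal(2)
    a, b = wx @ x, wy @ y       # the two projections
    # Cross-Hebbian ascent on E[a*b]; the Oja-style decay term keeps
    # each weight vector bounded near unit norm. Each update uses only
    # that compartment's input and the two scalar projections.
    wx += eta * b * (x - a * wx)
    wy += eta * a * (y - b * wy)

# Evaluate on fresh samples: the learned projections should be
# strongly correlated, since both views contain the latent z.
n = 5000
zt = rng.standard_normal(n)
Xt = np.column_stack([zt, rng.standard_normal(n)]) + 0.1 * rng.standard_normal((n, 2))
Yt = np.column_stack([rng.standard_normal(n), zt]) + 0.1 * rng.standard_normal((n, 2))
corr = np.corrcoef(Xt @ wx, Yt @ wy)[0, 1]
print(corr)
```

The whitening assumption is what the paper's multi-compartment derivations avoid; this sketch is only meant to convey the flavor of an online, locally computable correlation-maximizing rule.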