A state space approach for piecewise-linear recurrent neural networks for identifying computational dynamics from neural measurements.

The computational and cognitive properties of neural systems are often thought to be implemented in terms of their (stochastic) network dynamics. Hence, recovering the system dynamics from experimentally observed neuronal time series, like multiple single-unit recordings or neuroimaging data, is an...

Full description

Bibliographic Details
Main Author: Daniel Durstewitz
Format: Article
Language: English
Published: Public Library of Science (PLoS) 2017-06-01
Series: PLoS Computational Biology
Online Access: http://europepmc.org/articles/PMC5456035?pdf=render
doi 10.1371/journal.pcbi.1005542
collection DOAJ
issn 1553-734X, 1553-7358
description The computational and cognitive properties of neural systems are often thought to be implemented in terms of their (stochastic) network dynamics. Hence, recovering the system dynamics from experimentally observed neuronal time series, like multiple single-unit recordings or neuroimaging data, is an important step toward understanding its computations. Ideally, one would not only seek a (lower-dimensional) state space representation of the dynamics, but would wish to have access to its statistical properties and their generative equations for in-depth analysis. Recurrent neural networks (RNNs) are a computationally powerful and dynamically universal formal framework which has been extensively studied from both the computational and the dynamical systems perspective. Here we develop a semi-analytical maximum-likelihood estimation scheme for piecewise-linear RNNs (PLRNNs) within the statistical framework of state space models, which accounts for noise in both the underlying latent dynamics and the observation process. The Expectation-Maximization algorithm is used to infer the latent state distribution, through a global Laplace approximation, and the PLRNN parameters iteratively. After validating the procedure on toy examples, and using inference through particle filters for comparison, the approach is applied to multiple single-unit recordings from the rodent anterior cingulate cortex (ACC) obtained during performance of a classical working memory task, delayed alternation. Models estimated from kernel-smoothed spike time data were able to capture the essential computational dynamics underlying task performance, including stimulus-selective delay activity. The estimated models were rarely multi-stable, however, but rather were tuned to exhibit slow dynamics in the vicinity of a bifurcation point. 
In summary, the present work advances a semi-analytical (thus reasonably fast) maximum-likelihood estimation framework for PLRNNs that may enable researchers to recover relevant aspects of the nonlinear dynamics underlying observed neuronal time series, and to link these directly to computational properties.
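The PLRNN state space model described in the abstract can be sketched as a generative simulation. This is a minimal illustration, not the paper's exact formulation: it assumes a ReLU-type piecewise-linear nonlinearity, a diagonal linear dynamics matrix, off-diagonal coupling weights, and Gaussian process and observation noise; all variable names and parameter values here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

M, N, T = 3, 10, 200                       # latent dims, observed units, time steps
A = np.diag(rng.uniform(0.5, 0.9, M))      # diagonal linear (auto-regressive) dynamics
W = 0.1 * rng.standard_normal((M, M))      # coupling through the nonlinearity
np.fill_diagonal(W, 0.0)                   # keep coupling off-diagonal
h = 0.1 * rng.standard_normal(M)           # constant bias
B = rng.standard_normal((N, M))            # latent-to-observation weights
sig_z, sig_x = 0.05, 0.1                   # process / observation noise std devs

phi = lambda z: np.maximum(z, 0.0)         # piecewise-linear (ReLU) element-wise map

Z = np.zeros((T, M))                       # latent trajectory
X = np.zeros((T, N))                       # simulated observations
for t in range(1, T):
    Z[t] = A @ Z[t - 1] + W @ phi(Z[t - 1]) + h + sig_z * rng.standard_normal(M)
    X[t] = B @ phi(Z[t]) + sig_x * rng.standard_normal(N)
```

In the estimation problem the paper addresses, only `X` would be observed; the latent states `Z` and the parameters `A`, `W`, `h`, `B` are what the EM scheme with a global Laplace approximation infers.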
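The abstract also mentions that models were fit to kernel-smoothed spike time data. A common way to obtain such a continuous signal from spike times is Gaussian kernel smoothing; the sketch below is a generic illustration of that preprocessing step (the function name, kernel width, and grid are illustrative assumptions, not taken from the paper).

```python
import numpy as np

def smooth_spikes(spike_times, t_grid, sigma=0.02):
    """Estimate a smooth firing-rate signal by placing a Gaussian
    kernel of width sigma (in seconds) on each spike time and
    summing the kernels on the evaluation grid t_grid."""
    d = t_grid[:, None] - np.asarray(spike_times)[None, :]
    k = np.exp(-0.5 * (d / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))
    return k.sum(axis=1)   # one rate value per grid point

t_grid = np.arange(0.0, 1.0, 0.001)                    # 1 s at 1 ms resolution
rate = smooth_spikes([0.2, 0.25, 0.7], t_grid)         # three example spikes
```

Because each kernel integrates to one, the smoothed signal integrates to approximately the number of spikes, which is a useful sanity check on the kernel normalization.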