Weight statistics controls dynamics in recurrent neural networks.

Recurrent neural networks are complex non-linear systems, capable of ongoing activity in the absence of driving inputs. The dynamical properties of these systems, in particular their long-time attractor states, are determined on the microscopic level by the connection strengths w_ij between the individual neurons. However, little is known about the extent to which network dynamics is tunable on a more coarse-grained level by the statistical features of the weight matrix. In this work, we investigate the dynamics of recurrent networks of Boltzmann neurons. In particular, we study the impact of three statistical parameters: density (the fraction of non-zero connections), balance (the ratio of excitatory to inhibitory connections), and symmetry (the fraction of neuron pairs with w_ij = w_ji). By computing a 'phase diagram' of network dynamics, we find that balance is the essential control parameter: its gradual increase from negative to positive values drives the system from oscillatory behavior into a chaotic regime, and eventually into stationary fixed points. Only directly at the border of the chaotic regime do the neural networks display rich but regular dynamics, thus enabling actual information processing. These results suggest that the brain, too, is fine-tuned to the 'edge of chaos' by assuring a proper balance between excitatory and inhibitory neural connections.
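The three weight-matrix parameters named in the abstract can be illustrated with a minimal simulation sketch. This is not the authors' code: the mapping from balance to the probability of an excitatory sign, the 0/1 neuron states, and the sigmoid update with inverse temperature `beta` are all assumptions chosen to match the abstract's description of stochastic ("Boltzmann") neurons.

```python
# Sketch: random weight matrix parameterized by density, balance, and symmetry,
# driving a small network of stochastic binary ("Boltzmann") neurons.
import numpy as np

def make_weights(n, density, balance, symmetry, rng):
    """density: fraction of non-zero connections;
    balance in [-1, 1]: shifts the excitatory/inhibitory ratio (assumed mapping:
    P(excitatory) = (1 + balance) / 2);
    symmetry: fraction of connected pairs with w_ij = w_ji."""
    w = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < density:
                sign = 1.0 if rng.random() < (1 + balance) / 2 else -1.0
                w[i, j] = sign * rng.random()
                if rng.random() < symmetry:
                    w[j, i] = w[i, j]  # symmetric pair: w_ij = w_ji
                else:
                    sign2 = 1.0 if rng.random() < (1 + balance) / 2 else -1.0
                    w[j, i] = sign2 * rng.random()  # independent reverse weight
    return w

def step(w, state, rng, beta=2.0):
    """One parallel update: each neuron fires with sigmoid probability."""
    h = w @ state                           # net input to each neuron
    p = 1.0 / (1.0 + np.exp(-beta * h))     # Boltzmann firing probability
    return (rng.random(len(state)) < p).astype(float)

rng = np.random.default_rng(0)
w = make_weights(50, density=0.5, balance=0.0, symmetry=0.5, rng=rng)
state = (rng.random(50) < 0.5).astype(float)
for _ in range(100):
    state = step(w, state, rng)
```

Sweeping `balance` from negative to positive values in such a sketch is the kind of experiment the abstract's phase diagram summarizes: inhibition-dominated matrices tend toward oscillations, excitation-dominated ones toward frozen fixed points.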


Bibliographic Details
Main Authors: Patrick Krauss, Marc Schuster, Verena Dietrich, Achim Schilling, Holger Schulze, Claus Metzner
Format: Article
Language: English
Published: Public Library of Science (PLoS), 2019-01-01
Series: PLoS ONE
Online Access: https://doi.org/10.1371/journal.pone.0214541
ISSN: 1932-6203