Enhanced storage capacity with errors in scale-free Hopfield neural networks: An analytical study.

The Hopfield model is a pioneering neural network model with associative memory retrieval. The analytical solution of the model in the mean-field limit revealed that memories can be retrieved without any error up to a finite storage capacity of O(N), where N is the system size. Beyond the threshold, the...

Full description

Bibliographic Details
Main Authors: Do-Hyun Kim, Jinha Park, Byungnam Kahng
Format: Article
Language: English
Published: Public Library of Science (PLoS), 2017-01-01
Series: PLoS ONE
Online Access: http://europepmc.org/articles/PMC5659639?pdf=render
id doaj-ca4f0857eefc45e7adb7e6111a457a6a
doi 10.1371/journal.pone.0184683
collection DOAJ
issn 1932-6203
description The Hopfield model is a pioneering neural network model with associative memory retrieval. The analytical solution of the model in the mean-field limit revealed that memories can be retrieved without any error up to a finite storage capacity of O(N), where N is the system size. Beyond the threshold, they are completely lost. Since the introduction of the Hopfield model, the theory of neural networks has been further developed toward realistic neural networks using analog neurons, spiking neurons, etc. Nevertheless, those advances are based on fully connected networks, which are inconsistent with recent experimental findings that the number of connections of each neuron seems to be heterogeneous, following a heavy-tailed distribution. Motivated by this observation, we consider the Hopfield model on scale-free networks and obtain a pattern of associative memory retrieval different from that obtained on the fully connected network: the storage capacity becomes tremendously enhanced, but with some error in the memory retrieval, which appears as the heterogeneity of the connections is increased. Moreover, the error rates are also obtained on several real neural networks and are indeed similar to those on scale-free model networks.
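The retrieval dynamics the abstract refers to can be illustrated with a minimal sketch of the standard Hopfield model on a fully connected network (not the authors' scale-free variant). The sketch assumes the usual Hebbian weight rule and sign-threshold updates; the pattern count and noise level are illustrative choices, kept well below the classical capacity so retrieval succeeds:

```python
import numpy as np

rng = np.random.default_rng(0)

N, P = 200, 5  # N neurons, P stored patterns; P/N is well below the classical capacity ~0.138
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian weights: W_ij = (1/N) * sum_mu xi_i^mu * xi_j^mu, with no self-coupling
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0.0)

def recall(state, steps=20):
    """Iterate synchronous sign-updates until a fixed point or the step limit."""
    for _ in range(steps):
        new = np.sign(W @ state)
        new[new == 0] = 1  # break ties deterministically
        if np.array_equal(new, state):
            break
        state = new
    return state

# Corrupt 10% of the first pattern's bits, then try to retrieve it
cue = patterns[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
cue[flip] *= -1
retrieved = recall(cue)

# Overlap m = (1/N) * retrieved . pattern; m = 1 means error-free retrieval
overlap = (retrieved @ patterns[0]) / N
```

At this low load the overlap returns to (or very near) 1, matching the error-free regime described in the abstract; the paper's point is that on scale-free topologies many more patterns can be stored, at the cost of a nonzero retrieval error.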