Transferring learning from external to internal weights in echo-state networks with sparse connectivity.

Modifying weights within a recurrent network to improve performance on a task has proven to be difficult. Echo-state networks in which modification is restricted to the weights of connections onto network outputs provide an easier alternative, but at the expense of modifying the typically sparse architecture of the network by including feedback from the output back into the network. We derive methods for using the values of the output weights from a trained echo-state network to set recurrent weights within the network. The result of this "transfer of learning" is a recurrent network that performs the task without requiring the output feedback present in the original network. We also discuss a hybrid version in which online learning is applied to both output and recurrent weights. Both approaches provide efficient ways of training recurrent networks to perform complex tasks. Through an analysis of the conditions required to make transfer of learning work, we define the concept of a "self-sensing" network state, and we compare and contrast this with compressed sensing.
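The identity behind the abstract's "transfer of learning" can be illustrated with a minimal NumPy sketch. This is not the authors' code: the variable names and the dense rank-one fold below are illustrative assumptions. With output z = w·x fed back through a vector u, the feedback term u(w·x) equals (u wᵀ)x, so it can be absorbed into the recurrent matrix; the paper's actual contribution is achieving an equivalent transfer while preserving the network's sparse connectivity, which the naive fold below does not.

```python
# Illustrative sketch (not the paper's method): folding echo-state output
# feedback into the recurrent weights. Assumes discrete-time rate dynamics
#   x_{t+1} = tanh(J x_t + u z_t),  z_t = w @ x_t,
# where u is the feedback vector and w the trained readout weights.
import numpy as np

rng = np.random.default_rng(0)
N = 50
J = rng.normal(scale=1.0 / np.sqrt(N), size=(N, N))  # recurrent weights
u = rng.normal(size=N)       # output-to-network feedback weights
w = rng.normal(size=N) / N   # trained readout weights (hypothetical values)

def step_with_feedback(x):
    z = w @ x                        # scalar network output
    return np.tanh(J @ x + u * z)    # feedback enters the dynamics

# Transfer: absorb the rank-one feedback loop u (w @ x) = (u w^T) x
# into the recurrent matrix, removing the explicit output feedback.
J_transfer = J + np.outer(u, w)

def step_transferred(x):
    return np.tanh(J_transfer @ x)   # same dynamics, no feedback path

x0 = rng.normal(size=N)
assert np.allclose(step_with_feedback(x0), step_transferred(x0))
```

Note that np.outer(u, w) is dense even when J is sparse, which is precisely why the paper needs the "self-sensing" analysis rather than this direct fold.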

Bibliographic Details
Main Authors: David Sussillo, L F Abbott
Format: Article
Language: English
Published: Public Library of Science (PLoS), 2012-01-01
Series: PLoS ONE
Online Access: http://europepmc.org/articles/PMC3360031?pdf=render
ISSN: 1932-6203
DOI: 10.1371/journal.pone.0037372
Citation: PLoS ONE 7(5): e37372 (2012)