On the correlation between reservoir metrics and performance for time series classification under the influence of synaptic plasticity.

Reservoir computing provides a simpler paradigm for training recurrent networks: the recurrent connections are initialised and adapted separately from a supervised linear readout. This, however, creates a problem. Because the recurrent weights and topology are no longer adapted to the task, the burden falls on the reservoir designer to construct an effective network that happens to produce state vectors that can be mapped linearly onto the desired outputs. Guidance in forming a reservoir can come from established metrics that link theoretical properties of the reservoir computing paradigm to quantitative measures for evaluating the effectiveness of a given design. We provide a comprehensive empirical study of four metrics: class separation, kernel quality, Lyapunov's exponent and spectral radius. Each metric is compared over a number of repeated runs, for different reservoir computing set-ups that include three types of network topology and three mechanisms of weight adaptation through synaptic plasticity. Each combination of these methods is tested on two time-series classification problems. We find that the two metrics that correlate most strongly with classification performance are Lyapunov's exponent and kernel quality, and the comparisons indicate that these two metrics measure a similar property of the reservoir dynamics. We also find that class separation and spectral radius are both less reliable and less effective in predicting performance.
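As a rough illustration of the paradigm described in the abstract, the sketch below builds a small echo-state-style reservoir in NumPy, rescales its recurrent weights to a chosen spectral radius, estimates kernel quality as the numerical rank of the collected state matrix, and trains only a ridge-regression readout. All names, parameter values and the toy sine task are illustrative assumptions for this sketch; they are not the authors' experimental setup, which involves spiking reservoirs and synaptic plasticity mechanisms.

import numpy as np

rng = np.random.default_rng(0)

# Reservoir dimensions (illustrative values, not those used in the paper)
n_inputs, n_reservoir, n_outputs = 1, 100, 1

# Random input and recurrent weights; the recurrent matrix is rescaled so that
# its spectral radius (largest absolute eigenvalue) equals a chosen target.
W_in = rng.uniform(-0.5, 0.5, size=(n_reservoir, n_inputs))
W = rng.uniform(-0.5, 0.5, size=(n_reservoir, n_reservoir))
target_spectral_radius = 0.9
W *= target_spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))

def run_reservoir(u):
    """Drive the reservoir with an input sequence u (T x n_inputs) and
    return the state matrix X (T x n_reservoir)."""
    x = np.zeros(n_reservoir)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in @ u_t)
        states.append(x.copy())
    return np.array(states)

# Toy task: map a sine wave onto its one-step-ahead value.
t = np.linspace(0, 8 * np.pi, 400)
u = np.sin(t).reshape(-1, 1)
y = np.roll(u, -1, axis=0)

X = run_reservoir(u)

# Kernel quality estimated as the numerical rank of the state matrix:
# a richer reservoir produces more linearly independent state directions.
kernel_quality = np.linalg.matrix_rank(X, tol=1e-6)
print("kernel quality (state matrix rank):", kernel_quality)

# Supervised linear readout fitted by ridge regression; only W_out is learned,
# while the recurrent weights stay fixed -- the core simplification of
# reservoir computing that the abstract refers to.
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_reservoir), X.T @ y)
print("readout training MSE:", np.mean((X @ W_out - y) ** 2))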


Bibliographic Details
Main Authors: Joseph Chrol-Cannon, Yaochu Jin
Format: Article
Language: English
Published: Public Library of Science (PLoS), 2014-01-01
Series: PLoS ONE
ISSN: 1932-6203
Online Access: http://europepmc.org/articles/PMC4092026?pdf=render