Fusing Swarm Intelligence and Self-Assembly for Optimizing Echo State Networks
Optimizing a neural network’s topology is a difficult problem for at least two reasons: the topology space is discrete, and the quality of any given topology must be assessed by assigning many different sets of weights to its connections. These two characteristics tend to cause very “rough” objective functions. Here we demonstrate how self-assembly (SA) and particle swarm optimization (PSO) can be integrated to provide a novel and effective means of concurrently optimizing a neural network’s weights and topology. Combining SA and PSO addresses two key challenges. First, it creates a more integrated representation of neural network weights and topology so that we have just a single, continuous search domain that permits “smoother” objective functions. Second, it extends the traditional focus of self-assembly, from the growth of predefined target structures, to functional self-assembly, in which growth is driven by optimality criteria defined in terms of the performance of emerging structures on predefined computational problems. Our model incorporates a new way of viewing PSO that involves a population of growing, interacting networks, as opposed to particles. The effectiveness of our method for optimizing echo state network weights and topologies is demonstrated through its performance on a number of challenging benchmark problems.
Main Authors: | Charles E. Martin, James A. Reggia |
---|---|
Format: | Article |
Language: | English |
Published: | Hindawi Limited, 2015-01-01 |
Series: | Computational Intelligence and Neuroscience |
Online Access: | http://dx.doi.org/10.1155/2015/642429 |
id | doaj-baa0ed5552c54ef49d78767fb85b3139 |
---|---|
record_format | Article |
affiliations | Charles E. Martin: HRL Laboratories, LLC, 3011 Malibu Canyon Road, Malibu, CA 90265, USA; James A. Reggia: Department of Computer Science, University of Maryland, College Park, MD 20742, USA |
collection | DOAJ |
language | English |
format | Article |
author | Charles E. Martin, James A. Reggia |
title | Fusing Swarm Intelligence and Self-Assembly for Optimizing Echo State Networks |
publisher | Hindawi Limited |
series | Computational Intelligence and Neuroscience |
issn | 1687-5265, 1687-5273 |
publishDate | 2015-01-01 |
description | Optimizing a neural network’s topology is a difficult problem for at least two reasons: the topology space is discrete, and the quality of any given topology must be assessed by assigning many different sets of weights to its connections. These two characteristics tend to cause very “rough” objective functions. Here we demonstrate how self-assembly (SA) and particle swarm optimization (PSO) can be integrated to provide a novel and effective means of concurrently optimizing a neural network’s weights and topology. Combining SA and PSO addresses two key challenges. First, it creates a more integrated representation of neural network weights and topology so that we have just a single, continuous search domain that permits “smoother” objective functions. Second, it extends the traditional focus of self-assembly, from the growth of predefined target structures, to functional self-assembly, in which growth is driven by optimality criteria defined in terms of the performance of emerging structures on predefined computational problems. Our model incorporates a new way of viewing PSO that involves a population of growing, interacting networks, as opposed to particles. The effectiveness of our method for optimizing echo state network weights and topologies is demonstrated through its performance on a number of challenging benchmark problems. |
url | http://dx.doi.org/10.1155/2015/642429 |
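The abstract builds on standard particle swarm optimization over a continuous search domain. For readers unfamiliar with PSO, the textbook form of the algorithm can be sketched as below; this is a generic illustration, not the paper's method (the paper replaces particles with growing, interacting networks), and the hyperparameters `w`, `c1`, `c2` and the toy sphere objective are assumptions chosen for the example.

```python
import numpy as np

def pso(objective, dim, n_particles=20, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimization over a continuous domain.

    Each particle keeps a position and velocity; velocities are pulled toward
    the particle's own best-known position (cognitive term) and the swarm's
    best-known position (social term).
    """
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-5.0, 5.0, size=(n_particles, dim))
    vel = np.zeros((n_particles, dim))
    pbest = pos.copy()
    pbest_val = np.array([objective(p) for p in pos])
    gbest = pbest[np.argmin(pbest_val)].copy()
    gbest_val = float(pbest_val.min())
    for _ in range(iters):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        # Standard PSO velocity and position updates
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        vals = np.array([objective(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved] = pos[improved]
        pbest_val[improved] = vals[improved]
        if vals.min() < gbest_val:
            gbest_val = float(vals.min())
            gbest = pos[np.argmin(vals)].copy()
    return gbest, gbest_val

# Toy usage: minimize the sphere function f(x) = sum(x_i^2), optimum 0 at the origin.
best, best_val = pso(lambda x: float(np.sum(x * x)), dim=5)
print(best_val)
```

The paper's contribution is precisely that this continuous update can be made to drive network growth, so that topology search happens in one smooth domain rather than a discrete one.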
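The networks being optimized are echo state networks, which follow the standard reservoir-computing recipe: a fixed random recurrent reservoir whose weights are never trained, plus a linear readout fitted by regression. A minimal illustrative sketch follows; the reservoir size, spectral radius, washout length, and the toy sine prediction task are assumptions for the example, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n_res = 100

# Fixed random input and reservoir weights (never trained in an ESN)
W_in = rng.uniform(-0.5, 0.5, size=n_res)
W = rng.uniform(-0.5, 0.5, size=(n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))  # rescale to spectral radius 0.9

# Toy task: predict sin(t + 0.1) from sin(t)
t = np.arange(0.0, 60.0, 0.1)
u, y = np.sin(t), np.sin(t + 0.1)

# Drive the reservoir and collect its states
x = np.zeros(n_res)
states = []
for ut in u:
    x = np.tanh(W_in * ut + W @ x)
    states.append(x.copy())
X = np.array(states)[50:]  # discard a washout period
Y = y[50:]

# Only the linear readout is trained, here via ridge regression
W_out = np.linalg.solve(X.T @ X + 1e-8 * np.eye(n_res), X.T @ Y)
pred = X @ W_out
rmse = float(np.sqrt(np.mean((pred - Y) ** 2)))
print(rmse)
```

Because only the readout is learned, the reservoir's topology and weight scaling dominate performance, which is what makes the joint topology-and-weight search described in the abstract worthwhile.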