Evolving CNN-LSTM Models for Time Series Prediction Using Enhanced Grey Wolf Optimizer

In this research, we propose an enhanced Grey Wolf Optimizer (GWO) for designing evolving Convolutional Neural Network-Long Short-Term Memory (CNN-LSTM) networks for time series analysis. To overcome the tendency of the classical GWO algorithm to stagnate at local optima and converge slowly, the proposed variant incorporates four distinctive search mechanisms: a nonlinear exploration scheme for dynamic search territory adjustment, a chaotic leadership dispatching strategy among the dominant wolves, a rectified spiral local exploitation action, and probability distribution-based leader enhancement. The evolving CNN-LSTM models are subsequently devised using the proposed GWO variant, where the network topology and learning hyperparameters are optimized for time series prediction and classification tasks. Evaluated on a number of benchmark problems, the proposed GWO-optimized CNN-LSTM models produce statistically significant improvements over several classical search methods and advanced GWO and Particle Swarm Optimization variants. Compared with the baseline methods, the CNN-LSTM networks devised by the proposed GWO variant offer better representational capacity, capturing the vital feature interactions as well as the sophisticated dependencies in complex temporal contexts when undertaking time-series tasks.
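For context on the search procedure the abstract refers to, the sketch below is a minimal NumPy rendition of the classical GWO position update that the proposed variant builds on. It is not the authors' code: the four enhanced mechanisms and the CNN-LSTM fitness evaluation are only indicated in comments, and the hyperparameter encoding mentioned there is an assumption for illustration.

import numpy as np

def gwo(fitness, bounds, n_wolves=20, n_iters=100, seed=0):
    # Minimise `fitness` over the box given by `bounds` (array of shape [dim, 2]).
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    dim = bounds.shape[0]
    wolves = rng.uniform(lo, hi, size=(n_wolves, dim))
    for t in range(n_iters):
        scores = np.array([fitness(w) for w in wolves])
        alpha, beta, delta = wolves[np.argsort(scores)[:3]]  # three dominant wolves
        # Classical GWO decreases `a` linearly from 2 to 0; the paper replaces
        # this with a nonlinear exploration scheme.
        a = 2.0 - 2.0 * t / n_iters
        for i in range(n_wolves):
            candidates = []
            for leader in (alpha, beta, delta):
                r1, r2 = rng.random(dim), rng.random(dim)
                A = 2.0 * a * r1 - a        # balances exploration vs. exploitation
                C = 2.0 * r2
                D = np.abs(C * leader - wolves[i])
                candidates.append(leader - A * D)
            # Classical GWO averages the three leader-guided moves; the enhanced
            # variant additionally applies chaotic leader dispatching, a rectified
            # spiral local search and distribution-based leader refinement.
            wolves[i] = np.clip(np.mean(candidates, axis=0), lo, hi)
    scores = np.array([fitness(w) for w in wolves])
    best = int(np.argmin(scores))
    return wolves[best], float(scores[best])

if __name__ == "__main__":
    # Toy stand-in for the real objective: in the paper each position would be
    # decoded into CNN-LSTM settings (e.g. filter count, kernel size, LSTM units,
    # learning rate -- an assumed encoding) and scored by validation error.
    sphere = lambda x: float(np.sum(x ** 2))
    bounds = np.array([[-5.0, 5.0]] * 4)
    best_pos, best_score = gwo(sphere, bounds)
    print(best_pos, best_score)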

Bibliographic Details
Main Authors: Hailun Xie, Li Zhang, Chee Peng Lim
Format: Article
Language: English
Published: IEEE, 2020-01-01
Series: IEEE Access
Subjects: Evolutionary computation; Grey Wolf optimizer; time series prediction; deep neural network
Online Access: https://ieeexplore.ieee.org/document/9186058/
ISSN: 2169-3536
Volume/Pages: IEEE Access, vol. 8, pp. 161519-161541
DOI: 10.1109/ACCESS.2020.3021527
Author ORCIDs: Hailun Xie (https://orcid.org/0000-0001-6356-002X); Li Zhang (https://orcid.org/0000-0001-6674-692X); Chee Peng Lim (https://orcid.org/0000-0003-4191-9083)
Affiliations: Department of Computer and Information Sciences, Faculty of Engineering and Environment, Computational Intelligence Research Group, Northumbria University, Newcastle upon Tyne, U.K. (Xie, Zhang); Institute for Intelligent Systems Research and Innovation, Deakin University, Melbourne, VIC, Australia (Lim)
Indexed in: DOAJ