Convergence and Robustness Analysis of the Exponential-Type Varying Gain Recurrent Neural Network for Solving Matrix-Type Linear Time-Varying Equation

Bibliographic Details
Main Authors: Zhijun Zhang, Zheng Fu, Lunan Zheng, Min Gan
Format: Article
Language: English
Published: IEEE 2018-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/8481681/
Description
Summary: To solve matrix-type linear time-varying equations more efficiently, a novel exponential-type varying gain recurrent neural network (EVG-RNN) is proposed in this paper. Distinguished from the traditional fixed-parameter gain recurrent neural network (FG-RNN), the proposed EVG-RNN is derived from a vector- or matrix-based unbounded error function by a varying-parameter neural dynamic approach. With four different kinds of activation functions, the super-exponential convergence of EVG-RNN is proved theoretically in detail, and its error convergence rate is shown to be much faster than that of FG-RNN. In addition, it is proved mathematically that the computation errors of EVG-RNN converge to zero and that the network can suppress external interference. Finally, a series of computer simulations verifies and illustrates the superior convergence and robustness of EVG-RNN compared with FG-RNN and FTZNN when solving the identical linear time-varying equation.
ISSN:2169-3536
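
The abstract describes a varying-parameter neural dynamic (zeroing-type) design: an unbounded error function E(t) = A(t)X(t) - B(t) is driven to zero by the dynamics dE/dt = -gamma(t) * Phi(E(t)) with a time-varying gain gamma(t). The Python sketch below illustrates only this general design principle, not the authors' implementation; the exponential gain gamma(t) = exp(lambda*t), the linear activation Phi, the example matrices A(t) and B(t), and the forward-Euler integration are assumptions made for illustration.

```python
# Illustrative sketch (not the authors' code): a varying-gain zeroing-type
# RNN for the matrix equation A(t) X(t) = B(t), built from the standard
# design formula dE/dt = -gamma(t) * Phi(E) with E(t) = A(t) X(t) - B(t).
# gamma(t) = exp(lam * t) and the identity activation Phi are assumptions.
import numpy as np

def simulate(T=2.0, dt=1e-4, lam=2.0):
    t_grid = np.arange(0.0, T, dt)

    # Time-varying coefficient matrices (chosen for illustration only).
    A = lambda t: np.array([[2.0 + np.sin(t), 0.2],
                            [0.2, 2.0 + np.cos(t)]])
    B = lambda t: np.array([[np.cos(t), np.sin(t)],
                            [-np.sin(t), np.cos(t)]])
    dA = lambda t: np.array([[np.cos(t), 0.0],
                             [0.0, -np.sin(t)]])
    dB = lambda t: np.array([[-np.sin(t), np.cos(t)],
                             [-np.cos(t), -np.sin(t)]])

    X = np.zeros((2, 2))          # arbitrary initial state
    residuals = []
    for t in t_grid:
        gamma = np.exp(lam * t)   # exponential-type varying gain
        E = A(t) @ X - B(t)       # unbounded error function
        # Implicit dynamics: A*dX = dB - dA*X - gamma*Phi(E), Phi = identity.
        dX = np.linalg.solve(A(t), dB(t) - dA(t) @ X - gamma * E)
        X = X + dt * dX           # forward-Euler integration step
        residuals.append(np.linalg.norm(E))
    return X, residuals

if __name__ == "__main__":
    X_final, res = simulate()
    print("final residual ||A(t)X - B(t)||_F:", res[-1])
```

With a fixed gain the residual of such a design decays exponentially; replacing the constant gain by a gain that itself grows with time is the mechanism by which the abstract's super-exponential convergence claim is obtained.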