Convergence of Batch Split-Complex Backpropagation Algorithm for Complex-Valued Neural Networks
The batch split-complex backpropagation (BSCBP) algorithm for training complex-valued neural networks is considered. For a constant learning rate, it is proved that the error function of the BSCBP algorithm is monotone during the training iteration process, and that the gradient of the error function tends to...
| Main Authors: | Huisheng Zhang, Chao Zhang, Wei Wu |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Hindawi Limited, 2009-01-01 |
| Series: | Discrete Dynamics in Nature and Society |
| Online Access: | http://dx.doi.org/10.1155/2009/329173 |
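The abstract above concerns batch gradient descent with a split-complex activation, i.e. a real activation applied separately to the real and imaginary parts of the net input. The following is a minimal illustrative sketch of that setting for a single complex-valued neuron; the toy data, network size, activation choice (tanh), and learning rate are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

# Hedged sketch: batch gradient descent for one complex-valued neuron with a
# split-complex activation (tanh applied to Re and Im parts separately).
# All sizes and data below are illustrative assumptions.

rng = np.random.default_rng(0)

def split_tanh(z):
    # split-complex activation: acts on Re(z) and Im(z) independently
    return np.tanh(z.real) + 1j * np.tanh(z.imag)

def split_tanh_deriv(z):
    # derivatives of the real and imaginary channels
    return 1.0 - np.tanh(z.real) ** 2, 1.0 - np.tanh(z.imag) ** 2

J, n = 20, 3                                               # batch size, input dim
Z = rng.standard_normal((J, n)) + 1j * rng.standard_normal((J, n))
d = rng.standard_normal(J) + 1j * rng.standard_normal(J)   # complex targets
w = rng.standard_normal(n) + 1j * rng.standard_normal(n)   # complex weights
eta = 0.001                                                # small constant rate

def batch_error(w):
    y = split_tanh(Z @ w)
    return 0.5 * np.sum(np.abs(y - d) ** 2)

errors = [batch_error(w)]
for _ in range(300):
    u = Z @ w                      # net inputs for the whole batch
    e = split_tanh(u) - d          # complex output errors
    gR, gI = split_tanh_deriv(u)
    # split-complex delta: real and imaginary error channels scaled separately
    delta = e.real * gR + 1j * (e.imag * gI)
    # one batch update; equivalent to gradient descent on (Re w, Im w)
    w = w - eta * (Z.conj().T @ delta)
    errors.append(batch_error(w))

# for a sufficiently small constant eta the batch error is nonincreasing,
# mirroring the monotonicity property the abstract states for BSCBP
monotone = all(b <= a + 1e-10 for a, b in zip(errors, errors[1:]))
```

With a sufficiently small constant learning rate, the recorded batch error sequence is monotonically nonincreasing, which is the qualitative behavior the paper's monotonicity theorem describes.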
Similar Items

- Convergence of an Online Split-Complex Gradient Algorithm for Complex-Valued Neural Networks
  by: Huisheng Zhang, et al.
  Published: (2010-01-01)
- The study of convergency analysis for backpropagation neural network
  by: Mong-Tao-Tsai, et al.
  Published: (2005)
- The Net-Input Error Backpropagation and Neuron Splitting for Neural Network
  by: YU YA-TING, et al.
  Published: (2003)
- Backpropagation algorithms and Reservoir Computing in Recurrent Neural Networks for the forecasting of complex spatiotemporal dynamics
  by: Vlachas, P.R., et al.
  Published: (2020)
- Face Recognition Using Complex Valued Backpropagation
  by: Zumrotun Nafisah, et al.
  Published: (2018-06-01)