Convergence of Batch Split-Complex Backpropagation Algorithm for Complex-Valued Neural Networks

The batch split-complex backpropagation (BSCBP) algorithm for training complex-valued neural networks is considered. For a constant learning rate, it is proved that the error function of the BSCBP algorithm decreases monotonically during the training iteration process and that the gradient of the error function tends to zero. Under an additional mild condition, the weight sequence itself is also proved to be convergent. A numerical example is given to support the theoretical analysis.
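The split-complex formulation treats the real and imaginary parts of the complex weights as separate real parameters, applies a real-valued activation to each part, and updates all weights once per pass over the whole training batch. Below is a minimal, hypothetical NumPy sketch of this kind of batch update with a constant learning rate; the single-layer architecture, split tanh activation, toy data, and hyperparameters (eta, n_epochs) are illustrative assumptions and are not taken from the paper.

```python
import numpy as np

# Sketch of batch split-complex backpropagation (BSCBP) for a single-layer
# complex-valued network with the split activation
#   y = tanh(Re(W x)) + i * tanh(Im(W x)).
# Architecture, data, and hyperparameters are illustrative assumptions.

rng = np.random.default_rng(0)

n_in, n_out, n_samples = 3, 2, 50
eta = 0.05        # constant learning rate, as assumed in the convergence analysis
n_epochs = 200

# Toy complex-valued training data generated by a random teacher network.
X = rng.standard_normal((n_in, n_samples)) + 1j * rng.standard_normal((n_in, n_samples))
W_true = rng.standard_normal((n_out, n_in)) + 1j * rng.standard_normal((n_out, n_in))
D = np.tanh((W_true @ X).real) + 1j * np.tanh((W_true @ X).imag)

# Split the complex weight matrix into real and imaginary parts (the "split" in BSCBP).
A = 0.1 * rng.standard_normal((n_out, n_in))   # Re(W)
B = 0.1 * rng.standard_normal((n_out, n_in))   # Im(W)

for epoch in range(n_epochs):
    # Forward pass over the whole batch.
    XR, XI = X.real, X.imag
    U = A @ XR - B @ XI          # Re(W x)
    V = A @ XI + B @ XR          # Im(W x)
    YR, YI = np.tanh(U), np.tanh(V)

    # Real-valued error function E = 1/2 * sum |y - d|^2.
    ER, EI = YR - D.real, YI - D.imag
    E = 0.5 * np.sum(ER**2 + EI**2)

    # Backward pass: gradients w.r.t. the real and imaginary weight matrices.
    dU = ER * (1.0 - YR**2)      # dE/dU, using tanh'(u) = 1 - tanh(u)^2
    dV = EI * (1.0 - YI**2)      # dE/dV
    gA = dU @ XR.T + dV @ XI.T
    gB = -dU @ XI.T + dV @ XR.T

    # One batch update per epoch with a constant learning rate.
    A -= eta * gA
    B -= eta * gB

    if epoch % 50 == 0:
        print(f"epoch {epoch:3d}  E = {E:.4f}")
```

With a sufficiently small constant learning rate, the printed batch error typically decreases from epoch to epoch, which is the monotonicity behavior the paper's convergence analysis formalizes.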

Bibliographic Details
Main Authors: Huisheng Zhang, Chao Zhang, Wei Wu (Applied Mathematics Department, Dalian University of Technology, Dalian 116024, China)
Format: Article
Language: English
Published: Hindawi Limited, 2009-01-01
Series: Discrete Dynamics in Nature and Society
ISSN: 1026-0226, 1607-887X
Online Access: http://dx.doi.org/10.1155/2009/329173