New Gradient-Weighted Adaptive Gradient Methods With Dynamic Constraints

Existing adaptive gradient descent optimization algorithms such as adaptive gradient (Adagrad), adaptive moment estimation (Adam), and root mean square prop (RMSprop) increase the convergence speed by dynamically adjusting the learning rate. However, in some application scenarios, the generalization...
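The abstract is truncated in this record, but as context for the adaptive methods it names, here is a minimal sketch of the standard Adam update rule (the textbook algorithm, not the paper's proposed gradient-weighted variant with dynamic constraints). The function name `adam_step`, its signature, and the toy problem are illustrative assumptions; the update itself is the widely published Adam rule, showing how the per-parameter step size is adapted from running estimates of the gradient's first and second moments.

```python
import numpy as np

def adam_step(params, grads, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: per-parameter learning rates from running moment estimates."""
    m = beta1 * m + (1 - beta1) * grads          # biased first-moment estimate (mean of gradients)
    v = beta2 * v + (1 - beta2) * grads**2       # biased second-moment estimate (uncentered variance)
    m_hat = m / (1 - beta1**t)                   # bias corrections for the zero initialization
    v_hat = v / (1 - beta2**t)
    # Effective step shrinks for parameters with large or noisy gradients.
    params = params - lr * m_hat / (np.sqrt(v_hat) + eps)
    return params, m, v

# Toy usage (hypothetical example): minimize f(w) = ||w||^2, whose gradient is 2w.
w = np.array([1.0, -2.0])
m, v = np.zeros_like(w), np.zeros_like(w)
for t in range(1, 201):
    w, m, v = adam_step(w, 2.0 * w, m, v, t, lr=0.05)
print(w)  # approaches [0, 0]
```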


Bibliographic Details
Main Authors: Dong Liang, Fanfan Ma, Wenyan Li
Format: Article
Language: English
Published: IEEE 2020-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/9117128/
