New Gradient-Weighted Adaptive Gradient Methods With Dynamic Constraints
Existing adaptive gradient descent optimization algorithms, such as adaptive gradient (Adagrad), adaptive moment estimation (Adam), and root mean square propagation (RMSprop), increase the convergence speed by dynamically adjusting the learning rate. However, in some application scenarios, the generalization...
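The abstract refers to adaptive optimizers that rescale the step size per parameter from running gradient statistics. Below is a minimal NumPy sketch of that idea in the style of Adam; the toy quadratic objective and hyperparameter values are illustrative assumptions, not taken from the article.

```python
# Minimal sketch of a per-parameter adaptive learning-rate update (Adam-style).
# Hyperparameters and the toy objective are illustrative assumptions.
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: moving averages of the gradient (m) and of its
    square (v) rescale the step size separately for every parameter."""
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad**2
    m_hat = m / (1 - beta1**t)   # bias correction for the first moment
    v_hat = v / (1 - beta2**t)   # bias correction for the second moment
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Toy usage: minimize f(theta) = ||theta||^2, whose gradient is 2 * theta.
theta = np.array([1.0, -2.0])
m = np.zeros_like(theta)
v = np.zeros_like(theta)
for t in range(1, 201):
    grad = 2 * theta
    theta, m, v = adam_step(theta, grad, m, v, t)
print(theta)  # entries approach zero as the iterates converge
```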
| Main Authors: | Dong Liang, Fanfan Ma, Wenyan Li |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2020-01-01 |
| Series: | IEEE Access |
| Subjects: | |
| Online Access: | https://ieeexplore.ieee.org/document/9117128/ |
Similar Items
- A study of gradient based particle swarm optimisers
  by: Barla-Szabo, Daniel
  Published: (2013)
- AG-SGD: Angle-Based Stochastic Gradient Descent
  by: Chongya Song, et al.
  Published: (2021-01-01)
- New conjugacy condition with pair-conjugate gradient methods for unconstrained optimization
  by: Abbas Al-Bayati, et al.
  Published: (2009-09-01)
- A Bounded Scheduling Method for Adaptive Gradient Methods
  by: Mingxing Tang, et al.
  Published: (2019-09-01)
- Tooth-Marked Tongue Recognition Using Gradient-Weighted Class Activation Maps
  by: Yue Sun, et al.
  Published: (2019-02-01)