DisSAGD: A Distributed Parameter Update Scheme Based on Variance Reduction
Machine learning models trained with SGD often converge slowly and unstably because estimating the gradient from a random sample introduces significant variance. To speed up convergence and improve stability, a distributed SGD algorithm based on variance reduction, named DisSAGD, is proposed...
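The abstract only sketches the idea, so the snippet below illustrates the general variance-reduction technique it refers to with a single-machine, SVRG-style update on a toy least-squares problem. This is a minimal sketch under those assumptions, not the authors' DisSAGD scheme; the objective, function names, and step sizes are illustrative choices.

```python
import numpy as np

# Illustrative SVRG-style variance-reduced SGD on least squares
# f(w) = 1/(2n) * ||Xw - y||^2. Not the DisSAGD distributed update rule.

def grad_i(w, X, y, i):
    """Gradient of the i-th sample's squared-error loss."""
    return (X[i] @ w - y[i]) * X[i]

def full_grad(w, X, y):
    """Full-batch gradient, recomputed once per outer epoch."""
    return X.T @ (X @ w - y) / len(y)

def svrg(X, y, lr=0.05, epochs=20, inner_steps=None):
    n, d = X.shape
    inner_steps = inner_steps or n
    w = np.zeros(d)
    for _ in range(epochs):
        w_snapshot = w.copy()
        mu = full_grad(w_snapshot, X, y)  # anchor gradient at the snapshot
        for _ in range(inner_steps):
            i = np.random.randint(n)
            # Variance-reduced estimator: stochastic gradient corrected by the
            # snapshot's stochastic gradient and the full snapshot gradient.
            g = grad_i(w, X, y, i) - grad_i(w_snapshot, X, y, i) + mu
            w -= lr * g
    return w

# Toy usage: recover w_true from noisy linear measurements.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
w_true = rng.normal(size=5)
y = X @ w_true + 0.01 * rng.normal(size=200)
print(np.linalg.norm(svrg(X, y) - w_true))
```

The correction term keeps the gradient estimate unbiased while its variance shrinks as the iterate approaches the snapshot, which is the property the abstract credits for faster, more stable convergence.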
| Main Authors: | Haijie Pan, Lirong Zheng |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2021-07-01 |
| Series: | Sensors |
| Subjects: | |
| Online Access: | https://www.mdpi.com/1424-8220/21/15/5124 |
Similar Items
- Analysis of the Variance Reduction in SVRG and a New Acceleration Method
  by: Erxue Min, et al.
  Published: (2018-01-01)
- Optimization for Supervised Machine Learning: Randomized Algorithms for Data and Parameters
  by: Hanzely, Filip
  Published: (2020)
- Refined Mode-Clustering via the Gradient of Slope
  by: Kunhui Zhang, et al.
  Published: (2021-06-01)
- Analysis of inconsistent source sampling in Monte Carlo weight-window variance reduction methods
  by: David P. Griesheimer, et al.
  Published: (2017-09-01)
- Accelerating the Training Process of Convolutional Neural Networks for Image Classification by Dropping Training Samples Out
  by: Naisen Yang, et al.
  Published: (2020-01-01)