DisSAGD: A Distributed Parameter Update Scheme Based on Variance Reduction

Machine learning models trained with SGD often converge slowly and unstably because stochastic gradient estimates computed from sampled data have high variance. To speed up convergence and improve stability, a distributed SGD algorithm based on variance reduction, named DisSAGD, is proposed...
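The abstract's central idea, reducing the variance of stochastic gradient estimates, can be illustrated with an SVRG-style update. This is a minimal single-machine sketch of the general variance-reduction technique, not the authors' distributed DisSAGD scheme; all function names and hyperparameters below are illustrative.

```python
# Minimal sketch of SVRG-style variance-reduced SGD on a least-squares
# problem. Illustrative only; not the DisSAGD algorithm itself.
import numpy as np

def svrg(A, b, lr=0.05, epochs=20, seed=0):
    rng = np.random.default_rng(seed)
    n, d = A.shape
    w = np.zeros(d)
    for _ in range(epochs):
        w_snap = w.copy()
        # Full gradient at the snapshot, computed once per epoch.
        mu = A.T @ (A @ w_snap - b) / n
        for _ in range(n):
            i = rng.integers(n)
            gi = A[i] * (A[i] @ w - b[i])            # stochastic gradient at w
            gi_snap = A[i] * (A[i] @ w_snap - b[i])  # same sample at snapshot
            # Variance-reduced estimate: unbiased, with variance that
            # shrinks as w approaches the snapshot.
            w -= lr * (gi - gi_snap + mu)
    return w

rng = np.random.default_rng(1)
A = rng.standard_normal((50, 5))
w_true = rng.standard_normal(5)
b = A @ w_true
w = svrg(A, b)
```

The correction term `gi - gi_snap + mu` keeps the update unbiased while its variance vanishes near the optimum, which is what allows a constant step size where plain SGD would need a decaying one.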


Bibliographic Details
Main Authors: Haijie Pan, Lirong Zheng
Format: Article
Language: English
Published: MDPI AG 2021-07-01
Series: Sensors
Online Access: https://www.mdpi.com/1424-8220/21/15/5124
