Communication Optimization Schemes for Accelerating Distributed Deep Learning Systems

In a distributed deep learning system, a parameter server and workers must communicate to exchange gradients and parameters, and the communication cost grows as the number of workers increases. This paper presents a communication data optimization scheme to mitigate the decrease in throughput du...


Bibliographic Details
Main Authors: Jaehwan Lee, Hyeonseong Choi, Hyeonwoo Jeong, Baekhyeon Noh, Ji Sun Shin
Format: Article
Language: English
Published: MDPI AG 2020-12-01
Series: Applied Sciences
Online Access: https://www.mdpi.com/2076-3417/10/24/8846