A comparative analysis of dynamic averaging techniques in federated learning
A dissertation submitted in fulfilment of the requirements for the degree of Master of Science in the School of Computer Science and Applied Mathematics, Faculty of Science, University of the Witwatersrand, 2020.
Main Author: Reddy, Sashlin
Format: Thesis (application/pdf)
Language: English (en)
Published: 2021
Online Access: https://hdl.handle.net/10539/31059
Description:
Due to advancements in mobile technology and growing user privacy concerns, federated learning has emerged as a popular machine learning (ML) method for pushing the training of statistical models to the edge. Federated learning trains a shared model under the coordination of a centralized server across a federation of participating clients. In practice, federated learning methods must overcome large network delays and bandwidth limits. To ease this communication bottleneck, recent works propose reducing the communication frequency in ways that have negligible impact on model accuracy (also referred to as model performance). Naive methods simply reduce the number of communication rounds. However, communication can be invested more efficiently through dynamic communication protocols, an approach termed dynamic averaging. Few works have addressed such protocols, and fewer still base the dynamic averaging protocol on the diversity of the data and the loss. In this work, we introduce dynamic averaging frameworks based on both the diversity of the data and the loss encountered by each client. This removes the assumption that each client participates equally and addresses the properties of federated learning. Results show that the overall communication overhead is reduced with a negligible decrease in accuracy.
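As a rough sketch of the protocol family the abstract describes, the example below runs a FedAvg-style loop in which each client uploads its model only when its local loss is still changing by more than a threshold. The linear model, the threshold `tau`, the `should_sync` trigger, and the synthetic data are illustrative assumptions, not the dissertation's actual design.

```python
# Minimal sketch of loss-triggered dynamic averaging in a FedAvg-style loop.
# Assumptions (not from the dissertation): linear-regression clients, a fixed
# loss-change threshold `tau`, and synthetic data; all names are illustrative.
import numpy as np

rng = np.random.default_rng(0)

class Client:
    def __init__(self, X, y):
        self.X, self.y = X, y
        self.w = np.zeros(X.shape[1])
        self.prev_loss = np.inf

    def loss(self):
        return float(np.mean((self.X @ self.w - self.y) ** 2))

    def local_step(self, lr=0.05):
        # One gradient step on the local mean-squared error.
        grad = 2 * self.X.T @ (self.X @ self.w - self.y) / len(self.y)
        self.w -= lr * grad

    def should_sync(self, tau):
        # Dynamic-averaging trigger: communicate only while the local loss
        # is still changing by more than tau, i.e. while local training is
        # still moving this client's model meaningfully.
        cur = self.loss()
        changed = abs(self.prev_loss - cur) > tau
        self.prev_loss = cur
        return changed

# Synthetic federation: each client draws data from a slightly different task,
# standing in for the non-identical client distributions the abstract mentions.
true_w = rng.normal(size=5)
clients = []
for _ in range(10):
    X = rng.normal(size=(100, 5))
    y = X @ (true_w + 0.1 * rng.normal(size=5)) + 0.01 * rng.normal(size=100)
    clients.append(Client(X, y))

global_w = np.zeros(5)
tau = 1e-3
messages = 0
for round_ in range(50):
    # Server broadcast, local training, then loss-triggered upload.
    updates = []
    for c in clients:
        c.w = global_w.copy()
        for _ in range(5):
            c.local_step()
        if c.should_sync(tau):
            updates.append(c.w)
            messages += 1
    if updates:  # average only the clients that chose to communicate
        global_w = np.mean(updates, axis=0)

avg_loss = np.mean([c.loss() for c in clients])
print(f"messages sent: {messages} / {50 * len(clients)}, mean loss: {avg_loss:.4f}")
```

Averaging only the clients that chose to communicate is what trades a small accuracy loss for fewer messages; a diversity-based variant of this sketch would replace the loss test in `should_sync` with a measure of how far the client's data or model drifts from the global average.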