Fast and Privacy-Preserving Federated Joint Estimator of Multi-sUGMs
Learning multiple related graphs from many distributed, privacy-sensitive data sources is an important and common task in neuroscience applications. By analyzing the commonalities and differences of the brain connectomes estimated from fMRI data across multiple hospitals, medical researchers can comprehensively investigate diagnostic evidence and better understand the causes of certain brain diseases. Previous sparse Undirected Graphical Model (sUGM) methods either cannot make full use of heterogeneous data while preserving privacy or lack the ability to handle nonparanormal data, which are highly non-independent and identically distributed (non-i.i.d.). This paper proposes a novel and efficient approach, FEDJEM (federated joint estimator of multiple sUGMs), which trains multiple sUGMs over a large network of local devices coordinated by a global center. To process datasets with different nonparanormal distributions efficiently, the proposed federated algorithm fully exploits the computing power of both the local devices and the cloud center, while federated updates ensure that personal data remain local, thus preserving privacy. We also implement a general federated learning framework for multi-task learning based on our method. We apply our method to multiple simulated datasets to evaluate its speed and accuracy against relevant baselines, and we accordingly develop a strategy to balance its computation and communication costs. Finally, we predict several informative groups of connectomes from a real-world dataset.
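The record gives only this high-level description, not FEDJEM's actual update rules. As a rough illustration of the pattern the abstract describes (each site computes a rank-based nonparanormal correlation locally, takes a few penalized steps on its precision matrix, and a center aggregates only parameter estimates, never raw data), the following minimal NumPy sketch may help. All function names and parameters (`rank_correlation`, `local_update`, `federated_round`, `lam`, `step`) are illustrative assumptions; the soft-thresholding update and FedAvg-style averaging are generic techniques standing in for the paper's estimator, not FEDJEM itself.

```python
# Illustrative sketch only -- NOT the FEDJEM algorithm from the paper.
# Each site keeps its data, estimates a rank-based (nonparanormal) correlation
# locally, takes a few penalized local steps on its precision matrix, and the
# server averages only the returned parameter estimates (FedAvg-style).
import numpy as np


def rank_correlation(X):
    """Spearman rank correlation with the sine transform that, under a
    nonparanormal model, estimates the latent Gaussian correlation."""
    ranks = np.argsort(np.argsort(X, axis=0), axis=0).astype(float)
    rho = np.corrcoef(ranks, rowvar=False)
    return 2.0 * np.sin(np.pi / 6.0 * rho)


def local_update(theta, S, lam=0.1, step=0.05, iters=20):
    """A few ISTA-style steps on tr(S @ Theta) - logdet(Theta) + lam * ||off-diag(Theta)||_1."""
    for _ in range(iters):
        grad = S - np.linalg.inv(theta)                 # gradient of the smooth part
        theta = theta - step * grad
        shrunk = np.sign(theta) * np.maximum(np.abs(theta) - step * lam, 0.0)
        np.fill_diagonal(shrunk, np.diag(theta))        # do not penalize the diagonal
        w, V = np.linalg.eigh((shrunk + shrunk.T) / 2.0)
        theta = V @ np.diag(np.clip(w, 1e-3, None)) @ V.T   # stay positive definite
    return theta


def federated_round(shared_theta, client_corrs, lam=0.1):
    """One round: every client starts from the shared estimate, updates locally,
    and the server averages the returned precision matrices (parameters only)."""
    local_thetas = [local_update(shared_theta.copy(), S, lam) for S in client_corrs]
    return np.mean(local_thetas, axis=0), local_thetas


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    p, n_sites = 10, 3
    # Stand-ins for per-hospital fMRI feature matrices; raw data never leaves a site.
    sites = [rng.standard_normal((200, p)) for _ in range(n_sites)]
    corrs = [rank_correlation(X) for X in sites]        # computed on-device
    theta = np.eye(p)                                   # shared initial estimate
    for _ in range(5):
        theta, per_site = federated_round(theta, corrs)
    print("shared sparse precision estimate:\n", np.round(theta, 2))
```

In a real deployment each `local_update` call would run on a hospital's own machine and only the precision matrices would cross the network; FEDJEM's actual penalties, update rules, and communication schedule are described in the paper itself.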
Main Authors: | Xiao Tan, Tianyi Ma, Tongtong Su |
---|---|
Format: | Article |
Language: | English |
Published: | IEEE, 2021-01-01 |
Series: | IEEE Access |
Subjects: | Federated learning, multi-task learning, graphical model |
Online Access: | https://ieeexplore.ieee.org/document/9493203/ |
id | doaj-da7cc0b0cb24437bbfaf4e3d955ec5f6
record_format | Article
spelling | DOAJ record doaj-da7cc0b0cb24437bbfaf4e3d955ec5f6 (updated 2021-07-29T23:00:28Z), English. IEEE, IEEE Access, ISSN 2169-3536, published 2021-01-01, vol. 9, pp. 104079-104092, DOI 10.1109/ACCESS.2021.3099400, IEEE article no. 9493203. "Fast and Privacy-Preserving Federated Joint Estimator of Multi-sUGMs" by Xiao Tan (https://orcid.org/0000-0002-3874-9557), Tianyi Ma (https://orcid.org/0000-0001-6057-9825), and Tongtong Su, all with the School of Computer Science and Engineering, Southeast University, Nanjing, China. Online access: https://ieeexplore.ieee.org/document/9493203/. Keywords: Federated learning, multi-task learning, graphical model.
collection | DOAJ
language | English
format | Article
sources | DOAJ
author | Xiao Tan; Tianyi Ma; Tongtong Su
author_sort | Xiao Tan
title | Fast and Privacy-Preserving Federated Joint Estimator of Multi-sUGMs
publisher | IEEE
series | IEEE Access
issn | 2169-3536
publishDate | 2021-01-01
description | Learning multiple related graphs from many distributed, privacy-sensitive data sources is an important and common task in neuroscience applications. By analyzing the commonalities and differences of the brain connectomes estimated from fMRI data across multiple hospitals, medical researchers can comprehensively investigate diagnostic evidence and better understand the causes of certain brain diseases. Previous sparse Undirected Graphical Model (sUGM) methods either cannot make full use of heterogeneous data while preserving privacy or lack the ability to handle nonparanormal data, which are highly non-independent and identically distributed (non-i.i.d.). This paper proposes a novel and efficient approach, FEDJEM (federated joint estimator of multiple sUGMs), which trains multiple sUGMs over a large network of local devices coordinated by a global center. To process datasets with different nonparanormal distributions efficiently, the proposed federated algorithm fully exploits the computing power of both the local devices and the cloud center, while federated updates ensure that personal data remain local, thus preserving privacy. We also implement a general federated learning framework for multi-task learning based on our method. We apply our method to multiple simulated datasets to evaluate its speed and accuracy against relevant baselines, and we accordingly develop a strategy to balance its computation and communication costs. Finally, we predict several informative groups of connectomes from a real-world dataset.
topic | Federated learning; multi-task learning; graphical model
url | https://ieeexplore.ieee.org/document/9493203/
_version_ | 1721248042442555392