Learning Over Multitask Graphs—Part I: Stability Analysis
This paper formulates a multitask optimization problem where agents in the network have individual objectives to meet, or individual parameter vectors to estimate, subject to a smoothness condition over the graph. The smoothness condition softens the transition in the tasks among adjacent nodes and allows incorporating information about the graph structure into the solution of the inference problem.
Main Authors: | Roula Nassif, Stefan Vlaski, Cedric Richard, Ali H. Sayed |
---|---|
Format: | Article |
Language: | English |
Published: | IEEE, 2020-01-01 |
Series: | IEEE Open Journal of Signal Processing |
Subjects: | Multitask distributed inference; diffusion strategy; smoothness prior; graph Laplacian regularization; gradient noise; stability analysis |
Online Access: | https://ieeexplore.ieee.org/document/9075197/ |
id | doaj-a0cd13ef360445e2a0bcc361cb2c396c
---|---|
record_format | Article
spelling | Nassif, R.; Vlaski, S.; Richard, C.; Sayed, A. H., "Learning Over Multitask Graphs—Part I: Stability Analysis," IEEE Open Journal of Signal Processing, ISSN 2644-1322, vol. 1, pp. 28-45, 2020-01-01, DOI 10.1109/OJSP.2020.2989038, IEEE article 9075197 (record doaj-a0cd13ef360445e2a0bcc361cb2c396c, indexed 2021-03-29T18:07:58Z, language: eng). Authors: Roula Nassif (https://orcid.org/0000-0001-9663-8559), Stefan Vlaski (https://orcid.org/0000-0002-0616-3076), Cedric Richard (https://orcid.org/0000-0003-2890-141X), Ali H. Sayed (https://orcid.org/0000-0002-5125-5519). Affiliations: Institute of Electrical Engineering, École Polytechnique Fédérale de Lausanne (EPFL), Lausanne, Switzerland (Nassif, Vlaski, Sayed); Université de Nice Sophia-Antipolis, Nice, France (Richard). Online access: https://ieeexplore.ieee.org/document/9075197/
collection | DOAJ
language | English
format | Article
sources | DOAJ
author | Roula Nassif; Stefan Vlaski; Cedric Richard; Ali H. Sayed
title | Learning Over Multitask Graphs—Part I: Stability Analysis
publisher | IEEE
series | IEEE Open Journal of Signal Processing
issn | 2644-1322
publishDate | 2020-01-01
description | This paper formulates a multitask optimization problem where agents in the network have individual objectives to meet, or individual parameter vectors to estimate, subject to a smoothness condition over the graph. The smoothness condition softens the transition in the tasks among adjacent nodes and allows incorporating information about the graph structure into the solution of the inference problem. A diffusion strategy is devised that responds to streaming data and employs stochastic approximations in place of actual gradient vectors, which are generally unavailable. The approach relies on minimizing a global cost consisting of the aggregate sum of individual costs regularized by a term that promotes smoothness. We show in this Part I of the work, under conditions on the step-size parameter, that the adaptive strategy induces a contraction mapping and leads to small estimation errors on the order of the small step-size. The results in the accompanying Part II will reveal explicitly the influence of the network topology and the regularization strength on the network performance and will provide insights into the design of effective multitask strategies for distributed inference over networks.
topic | Multitask distributed inference; diffusion strategy; smoothness prior; graph Laplacian regularization; gradient noise; stability analysis
url | https://ieeexplore.ieee.org/document/9075197/
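The abstract in the description field above outlines a diffusion strategy that minimizes an aggregate of individual agent costs plus a smoothness-promoting (graph-Laplacian) regularizer, using stochastic approximations in place of true gradients. The following is a minimal sketch only, not the paper's exact recursion: the adjacency matrix `A`, step size `mu`, regularization strength `eta`, and the stochastic-gradient callables `grad_fns` are hypothetical names introduced here purely for illustration.

```python
import numpy as np

def multitask_diffusion_step(W, A, grad_fns, mu=0.01, eta=1.0):
    """One illustrative adapt-then-combine iteration (a sketch, not the authors' exact algorithm).

    W        : (N, M) array; row k holds agent k's current estimate w_k
    A        : (N, N) symmetric adjacency (weight) matrix of the graph
    grad_fns : list of N callables; grad_fns[k](w) returns a stochastic
               approximation of the gradient of agent k's individual cost
    mu       : small step size (the stability analysis assumes mu sufficiently small)
    eta      : regularization strength promoting smoothness across neighbors
    """
    N = W.shape[0]
    # Adaptation: each agent takes a stochastic-gradient step on its own cost.
    psi = np.vstack([W[k] - mu * grad_fns[k](W[k]) for k in range(N)])
    # Combination: w_k <- psi_k - mu * eta * sum_l a_{kl} * (psi_k - psi_l),
    # i.e., each agent is pulled toward its neighbors' intermediate estimates.
    deg = A.sum(axis=1)  # weighted node degrees
    return (1.0 - mu * eta * deg)[:, None] * psi + mu * eta * (A @ psi)
```

Splitting the update into an adaptation step and a combination step mirrors standard diffusion-type strategies; for a sufficiently small `mu`, repeated application trades off each agent's local cost against smoothness of the estimates over the graph.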