Interaction is Necessary for Distributed Learning with Privacy or Communication Constraints
Local differential privacy (LDP) is a model where users send privatized data to an untrusted central server whose goal is to solve some data analysis task. In the non-interactive version of this model the protocol consists of a single round in which a server sends requests to all users and then receives...
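For context, the non-interactive model described in the abstract can be illustrated with the simplest single-round LDP mechanism, randomized response. The sketch below is not the paper's protocol; it is a minimal Python illustration (function and parameter names chosen here for exposition) of the one-round structure: each user privatizes a single bit locally, and the server aggregates the responses once, with no further interaction.

```python
import numpy as np

def randomize_bit(bit: int, epsilon: float, rng: np.random.Generator) -> int:
    """Randomized response: keep the true bit with probability
    e^eps / (e^eps + 1), otherwise flip it. Run locally by each user,
    so the server never sees the raw bit."""
    p_truth = np.exp(epsilon) / (np.exp(epsilon) + 1.0)
    return bit if rng.random() < p_truth else 1 - bit

def estimate_mean(reports: np.ndarray, epsilon: float) -> float:
    """Debias the noisy reports to estimate the true fraction of 1s."""
    p = np.exp(epsilon) / (np.exp(epsilon) + 1.0)
    return (reports.mean() - (1.0 - p)) / (2.0 * p - 1.0)

# Single non-interactive round: every user responds once, independently.
rng = np.random.default_rng(0)
true_bits = rng.integers(0, 2, size=10_000)  # users' private bits (toy data)
reports = np.array([randomize_bit(b, epsilon=1.0, rng=rng) for b in true_bits])
print("true mean:", true_bits.mean(), "estimate:", estimate_mean(reports, 1.0))
```

An interactive protocol, by contrast, would let the server adapt later queries to earlier responses; the paper studies when that extra interaction is necessary.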
Main Authors: | Yuval Dagan, Vitaly Feldman |
---|---|
Format: | Article |
Language: | English |
Published: | Labor Dynamics Institute, 2021-09-01 |
Series: | The Journal of Privacy and Confidentiality |
Subjects: | |
Online Access: | https://journalprivacyconfidentiality.org/index.php/jpc/article/view/781 |
Similar Items
- Compressive learning with privacy guarantees
  by: Chatalic, A., et al.
  Published: (2022)
- Differential privacy for the vast majority
  by: Kartal, H.B., et al.
  Published: (2019)
- Not All Attributes are Created Equal: dX-Private Mechanisms for Linear Queries
  by: Kamalaruban Parameswaran, et al.
  Published: (2020-01-01)
- Differential Privacy Preservation in Deep Learning: Challenges, Opportunities and Solutions
  by: Jingwen Zhao, et al.
  Published: (2019-01-01)
- Local Differential Privacy for Evolving Data
  by: Matthew Joseph, et al.
  Published: (2020-01-01)