Privacy-preserving distributed learning in the analog domain

M Soleymani, H Mahdavifar, AS Avestimehr - arXiv preprint arXiv …, 2020 - arxiv.org
We consider the critical problem of distributed learning over data while keeping it private
from the computational servers. The state-of-the-art approaches to this problem rely on …

Analog secret sharing with applications to private distributed learning

M Soleymani, H Mahdavifar… - IEEE Transactions on …, 2022 - ieeexplore.ieee.org
We consider the critical problems of distributed computing and learning over data while
keeping it private from the computational servers. The state-of-the-art approaches to this …

CodedPrivateML: A fast and privacy-preserving framework for distributed machine learning

J So, B Güler, AS Avestimehr - IEEE Journal on Selected Areas …, 2021 - ieeexplore.ieee.org
How to train a machine learning model while keeping the data private and secure? We
present CodedPrivateML, a fast and scalable approach to this critical problem …

Breaking the communication-privacy-accuracy trilemma

WN Chen, P Kairouz, A Özgür - Advances in Neural …, 2020 - proceedings.neurips.cc
Two major challenges in distributed learning and estimation are 1) preserving the privacy of
the local samples; and 2) communicating them efficiently to a central server, while achieving …

Differential secrecy for distributed data and applications to robust differentially secure vector summation

K Talwar - arXiv preprint arXiv:2202.10618, 2022 - arxiv.org
Computing the noisy sum of real-valued vectors is an important primitive in differentially
private learning and statistics. In private federated learning applications, these vectors are …

Breaking the communication-privacy-accuracy trilemma

WN Chen, P Kairouz, A Özgür - IEEE Transactions on …, 2022 - ieeexplore.ieee.org
Two major challenges in distributed learning and estimation are 1) preserving the privacy of
the local samples; and 2) communicating them efficiently to a central server, while achieving …

Layered Randomized Quantization for Communication-Efficient and Privacy-Preserving Distributed Learning

G Yan, T Li, T Lan, K Wu, L Song - arXiv preprint arXiv:2312.07060, 2023 - arxiv.org
Next-generation wireless networks, such as edge intelligence and wireless distributed
learning, face two critical challenges: communication efficiency and privacy protection. In …

Differentially private cross-silo federated learning

MA Heikkilä, A Koskela, K Shimizu, S Kaski… - arXiv preprint arXiv …, 2020 - arxiv.org
Strict privacy is of paramount importance in distributed machine learning. Federated
learning, with the main idea of communicating only what is needed for learning, has been …

Distributed learning without distress: Privacy-preserving empirical risk minimization

B Jayaraman, L Wang, D Evans… - Advances in Neural …, 2018 - proceedings.neurips.cc
Distributed learning allows a group of independent data owners to collaboratively learn a
model over their data sets without exposing their private data. We present a distributed …

Differentially private distributed online learning

C Li, P Zhou, L Xiong, Q Wang… - IEEE Transactions on …, 2018 - ieeexplore.ieee.org
In the big data era, the generation of data presents some new characteristics, including wide
distribution, high velocity, high dimensionality, and privacy concern. To address these …