User-level differentially private learning via correlated sampling

B Ghazi, R Kumar… - Advances in Neural …, 2021 - proceedings.neurips.cc
Most works in learning with differential privacy (DP) have focused on the setting where each
user has a single sample. In this work, we consider the setting where each user holds m …

Instance-optimal mean estimation under differential privacy

Z Huang, Y Liang, K Yi - Advances in Neural Information …, 2021 - proceedings.neurips.cc
Mean estimation under differential privacy is a fundamental problem, but worst-case optimal
mechanisms do not offer meaningful utility guarantees in practice when the global sensitivity …
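
For context on the baseline this paper improves on: worst-case DP mean estimation clips each value to an a-priori range and adds Laplace noise calibrated to the global sensitivity of the clipped mean. The sketch below is only that illustrative baseline, with the range [lo, hi], epsilon, and data assumed for the example; it is not the paper's instance-optimal mechanism.

```python
import numpy as np

def dp_mean_laplace(x, lo, hi, epsilon, rng=None):
    """Worst-case DP mean: clip each value to [lo, hi], average, then add
    Laplace noise scaled to the global sensitivity (hi - lo) / n.
    Illustrative baseline, not the paper's instance-optimal method."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.clip(np.asarray(x, dtype=float), lo, hi)
    n = len(x)
    sensitivity = (hi - lo) / n  # one record can move the clipped mean by at most this
    noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon)
    return x.mean() + noise

# The noise scale depends only on the a-priori range [lo, hi], which is why
# worst-case mechanisms can be loose when the data are far more concentrated
# than that range suggests.
print(dp_mean_laplace([0.1, 0.2, 0.15, 0.12], lo=0.0, hi=1.0, epsilon=1.0))
```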

Samplable anonymous aggregation for private federated data analysis

K Talwar, S Wang, A McMillan, V Feldman… - Proceedings of the …, 2024 - dl.acm.org
We revisit the problem of designing scalable protocols for private statistics and private
federated learning when each device holds its private data. Locally differentially private …

Locally private k-means in one round

A Chang, B Ghazi, R Kumar… - … on machine learning, 2021 - proceedings.mlr.press
We provide an approximation algorithm for k-means clustering in the one-round (aka
non-interactive) local model of differential privacy (DP). Our algorithm …

Private non-convex federated learning without a trusted server

A Lowy, A Ghafelebashi… - … Conference on Artificial …, 2023 - proceedings.mlr.press
We study federated learning (FL) with non-convex loss functions and data from people who
do not trust the server or other silos. In this setting, each silo (e.g., hospital) must protect the …

PPML-Omics: a privacy-preserving federated machine learning method protects patients' privacy in omic data

J Zhou, S Chen, Y Wu, H Li, B Zhang, L Zhou, Y Hu… - Science …, 2024 - science.org
Modern machine learning models applied to various tasks in omic data analysis give rise to
threats of privacy leakage for the patients involved in those datasets. Here, we propose a …

Optimal Unbiased Randomizers for Regression with Label Differential Privacy

A Badanidiyuru Varadaraja, B Ghazi… - Advances in …, 2024 - proceedings.neurips.cc
We propose a new family of label randomizers for training regression models under the
constraint of label differential privacy (DP). In particular, we leverage the trade-offs between …
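
As a point of reference, the simplest unbiased label randomizer is the Laplace mechanism applied directly to the label: the added noise has zero mean, so the released label is unbiased while satisfying epsilon label-DP. The sketch below shows only this baseline under assumed parameters (label_range, epsilon); the paper's optimal randomizers shape the noise distribution more carefully.

```python
import numpy as np

def laplace_label_randomizer(y, epsilon, label_range):
    """Illustrative (not the paper's) unbiased label randomizer: add
    zero-mean Laplace noise calibrated to the label range, so the
    released label satisfies epsilon label-DP and E[noisy_y] = y."""
    scale = label_range / epsilon  # sensitivity of a single label
    noise = np.random.default_rng().laplace(0.0, scale, size=np.shape(y))
    return np.asarray(y, dtype=float) + noise

# A regression model trained on noisy_y with squared loss targets the same
# minimizer in expectation, because the randomizer is unbiased.
noisy_y = laplace_label_randomizer([2.5, 3.0, 1.2], epsilon=1.0, label_range=5.0)
```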

Network shuffling: Privacy amplification via random walks

SP Liew, T Takahashi, S Takagi, F Kato, Y Cao… - Proceedings of the …, 2022 - dl.acm.org
Recently, it has been shown that shuffling can amplify the central differential privacy guarantees of
data randomized under local differential privacy. Within this setup, a centralized, trusted …
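
The amplification setup referenced here starts from reports that are already locally randomized and then severs the link between each report and its sender. Below is a minimal sketch of that baseline shuffle model (binary randomized response, a uniform shuffle, an assumed epsilon); the paper's contribution, replacing the central shuffler with random walks over a network, is not shown.

```python
import math
import random

def randomized_response(bit, epsilon):
    """Local DP: report the true bit with probability e^eps / (e^eps + 1),
    otherwise flip it."""
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    return bit if random.random() < p_truth else 1 - bit

def shuffle_reports(reports):
    """Shuffler: uniformly permute the reports, severing the link between a
    report and the user who sent it. Amplification-by-shuffling results show
    the shuffled output satisfies a much smaller central-DP epsilon."""
    shuffled = list(reports)
    random.shuffle(shuffled)
    return shuffled

true_bits = [1, 0, 1, 1, 0, 0, 1]
local_reports = [randomized_response(b, epsilon=2.0) for b in true_bits]
central_view = shuffle_reports(local_reports)  # the analyst sees only this multiset
```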

Differentially private federated learning with an adaptive noise mechanism

R Xue, K Xue, B Zhu, X Luo, T Zhang… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
Federated Learning (FL) enables multiple distributed clients to collaboratively train a model
on their own datasets. To avoid potential privacy threats in FL, researchers propose the DP …
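
The standard way to protect a client update in DP federated learning is to clip its L2 norm and add Gaussian noise; adaptive-noise schemes such as this paper's vary the noise scale across rounds. The sketch below shows only the fixed-scale baseline, with clip_norm and noise_multiplier as assumed illustrative parameters rather than the paper's adaptive mechanism.

```python
import numpy as np

def dp_client_update(update, clip_norm, noise_multiplier, rng=None):
    """Baseline DP protection of a client's model update: clip the update to
    an L2 bound, then add Gaussian noise with std noise_multiplier * clip_norm.
    An adaptive scheme would adjust this scale over training rounds."""
    rng = np.random.default_rng() if rng is None else rng
    update = np.asarray(update, dtype=float)
    norm = np.linalg.norm(update)
    clipped = update * min(1.0, clip_norm / (norm + 1e-12))
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=clipped.shape)
    return clipped + noise

# The server averages these noisy updates; privacy accounting (e.g. a moments
# accountant) converts noise_multiplier and participation rates into an
# (epsilon, delta) guarantee.
protected = dp_client_update([0.3, -1.2, 0.7], clip_norm=1.0, noise_multiplier=1.1)
```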

Shuffle differential private data aggregation for random population

S Wang, X Luo, Y Qian, Y Zhu, K Chen… - … on Parallel and …, 2023 - ieeexplore.ieee.org
Bridging the advantages of differential privacy in both the centralized model (i.e., high accuracy)
and the local model (i.e., minimum trust), the shuffle privacy model has potential applications in …