B Knott, S Venkataraman, A Hannun… - Advances in …, 2021 - proceedings.neurips.cc
Secure multi-party computation (MPC) allows parties to perform computations on data while keeping that data private. This capability has great potential for machine-learning …
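The snippet above only gestures at how MPC keeps data private, so here is a toy illustration assuming plain additive secret sharing over a prime field; the three-party setup, values, and names are illustrative and not taken from the cited work:

```python
# A toy illustration of the MPC idea: each party splits its private input into
# random additive shares modulo a prime, and the sum can be reconstructed from
# per-party share sums without any party revealing its raw input.
# Generic sketch, not the protocol of any particular MPC library.
import random

random.seed(0)
PRIME = 2**61 - 1
inputs = [42, 17, 99]                      # each party's private value (illustrative)
n = len(inputs)

# Each party splits its input into n additive shares modulo PRIME.
shares = []
for x in inputs:
    parts = [random.randrange(PRIME) for _ in range(n - 1)]
    parts.append((x - sum(parts)) % PRIME)
    shares.append(parts)

# Party j locally adds the j-th share of every input; no party sees raw inputs.
partial_sums = [sum(shares[i][j] for i in range(n)) % PRIME for j in range(n)]
print(sum(partial_sums) % PRIME)           # reconstructs 42 + 17 + 99 = 158
```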
Differential privacy is a de facto privacy framework that has seen adoption in practice via a number of mature software platforms. Implementation of differentially private (DP) …
We consider training models on private data that are distributed across user devices. To ensure privacy, we add on-device noise and use secure aggregation so that only the noisy …
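As a rough illustration of the distributed setup described above, the sketch below assumes each device clips its update and adds Gaussian noise locally, and models secure aggregation as an idealized oracle that reveals only the sum; all constants and function names are illustrative, not from the cited paper:

```python
# Minimal simulation of distributed DP: on-device noise plus an idealized
# secure aggregation step that only reveals the sum of noisy updates.
import numpy as np

rng = np.random.default_rng(0)
NUM_CLIENTS, DIM = 100, 10
NOISE_STD = 0.3          # per-device noise; total noise std is NOISE_STD * sqrt(NUM_CLIENTS)
CLIP_NORM = 1.0          # each device clips its update before adding noise

def local_update(x):
    """Clip the raw update and add on-device Gaussian noise."""
    x = x * min(1.0, CLIP_NORM / (np.linalg.norm(x) + 1e-12))
    return x + rng.normal(0.0, NOISE_STD, size=x.shape)

raw_updates = [rng.normal(size=DIM) for _ in range(NUM_CLIENTS)]
# Idealized secure aggregation: the server learns only this sum,
# never any individual noisy update.
aggregate = sum(local_update(u) for u in raw_updates)
print(aggregate / NUM_CLIENTS)
```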
We introduce the multi-dimensional Skellam mechanism, a discrete differential privacy mechanism based on the difference of two independent Poisson random variables. To …
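Since the snippet defines the mechanism as the difference of two independent Poisson random variables, a minimal sampling sketch may help; the parameter `mu` and the quantized sum below are illustrative assumptions, not values from the paper:

```python
# Sketch of multi-dimensional Skellam noise: each coordinate is the difference
# of two independent Poisson(mu) draws, giving mean 0 and variance 2 * mu.
import numpy as np

rng = np.random.default_rng(0)

def skellam_noise(mu, size):
    """Skellam(mu, mu) noise: difference of two independent Poisson(mu) variables."""
    return rng.poisson(mu, size) - rng.poisson(mu, size)

# Integer-valued (e.g. quantized) sum of client updates, privatized with Skellam noise.
true_sum = np.array([120, -45, 7, 0, 33])
noisy_sum = true_sum + skellam_noise(mu=50.0, size=true_sum.shape)
print(noisy_sum)
```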
P Kairouz, B McMahan, S Song… - International …, 2021 - proceedings.mlr.press
We consider training models with differential privacy (DP) using mini-batch gradients. The existing state-of-the-art, Differentially Private Stochastic Gradient Descent (DP-SGD) …
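The snippet names DP-SGD without room to spell it out; the following numpy sketch of a single DP-SGD step (per-example gradient clipping followed by Gaussian noise on the summed gradients) is a generic illustration with made-up hyperparameters, not the cited paper's method:

```python
# One DP-SGD step on a linear least-squares model: clip each per-example
# gradient to CLIP_NORM, sum, add Gaussian noise, then average and update.
import numpy as np

rng = np.random.default_rng(0)
CLIP_NORM, NOISE_MULT, LR = 1.0, 1.1, 0.1   # illustrative hyperparameters

def dp_sgd_step(w, X, y):
    grads = [(x @ w - t) * x for x, t in zip(X, y)]          # per-example gradients
    clipped = [g * min(1.0, CLIP_NORM / (np.linalg.norm(g) + 1e-12)) for g in grads]
    noise = rng.normal(0.0, NOISE_MULT * CLIP_NORM, size=w.shape)
    return w - LR * (np.sum(clipped, axis=0) + noise) / len(X)

w = np.zeros(3)
X, y = rng.normal(size=(32, 3)), rng.normal(size=32)
w = dp_sgd_step(w, X, y)
print(w)
```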
JM Abowd, R Ashmead… - Harvard Data …, 2022 - assets.pubpub.org
The Census TopDown Algorithm (TDA) is a disclosure avoidance system using differential privacy for privacy-loss accounting. The algorithm ingests the final, edited version …
Recent work of Erlingsson, Feldman, Mironov, Raghunathan, Talwar, and Thakurta [1] demonstrates that random shuffling amplifies differential privacy guarantees of locally …
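To make the shuffle-model setting concrete, here is a toy simulation assuming binary randomized response as the local randomizer and a trusted shuffler that only permutes the reports; the epsilon value and variable names are illustrative:

```python
# Shuffle-model toy: users apply local randomized response, a shuffler permutes
# the reports, and the analyzer debiases the shuffled counts.
import math, random

random.seed(0)
EPS_LOCAL = 1.0
P_TRUTH = math.exp(EPS_LOCAL) / (math.exp(EPS_LOCAL) + 1)   # keep the true bit w.p. P_TRUTH

def randomized_response(bit):
    return bit if random.random() < P_TRUTH else 1 - bit

true_bits = [random.randint(0, 1) for _ in range(1000)]
reports = [randomized_response(b) for b in true_bits]
random.shuffle(reports)            # the shuffler removes the link to user identities
# Debias the shuffled counts to estimate the true fraction of ones.
est = (sum(reports) / len(reports) - (1 - P_TRUTH)) / (2 * P_TRUTH - 1)
print(round(est, 3), sum(true_bits) / len(true_bits))
```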
For many differentially private algorithms, such as the prominent noisy stochastic gradient descent (DP-SGD), the analysis needed to bound the privacy leakage of a single training …
We consider the problem of training a $d$-dimensional model with distributed differential privacy (DP) where secure aggregation (SecAgg) is used to ensure that the server only sees …
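Below is a toy sketch of how secure aggregation can reveal only the sum of the clients' $d$-dimensional updates, assuming a simplified pairwise-masking scheme with no dropout handling or key agreement; the dimensions, modulus, and names are illustrative:

```python
# Mask-based secure aggregation sketch: each pair of clients shares a random
# mask that cancels in the sum, so the server recovers only the modular sum.
import numpy as np

rng = np.random.default_rng(0)
NUM_CLIENTS, D, MODULUS = 4, 5, 2**16

updates = rng.integers(0, 100, size=(NUM_CLIENTS, D))        # quantized client vectors
masked = updates.copy()
for i in range(NUM_CLIENTS):
    for j in range(i + 1, NUM_CLIENTS):
        mask = rng.integers(0, MODULUS, size=D)
        masked[i] = (masked[i] + mask) % MODULUS              # client i adds the mask
        masked[j] = (masked[j] - mask) % MODULUS              # client j subtracts it

server_view = masked % MODULUS                                # individual rows look uniform
secure_sum = server_view.sum(axis=0) % MODULUS                # pairwise masks cancel in the sum
print(np.array_equal(secure_sum, updates.sum(axis=0) % MODULUS))
```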