We consider training models on private data that are distributed across user devices. To ensure privacy, we add on-device noise and use secure aggregation so that only the noisy …
We consider the problem of training a $d$-dimensional model with distributed differential privacy (DP) where secure aggregation (SecAgg) is used to ensure that the server only sees …
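The core idea behind the two snippets above, distributed DP with secure aggregation, can be sketched as follows. This is a minimal illustration, not any paper's actual protocol: each client splits the target Gaussian noise variance across the cohort so that the *sum* of updates carries central-DP-level noise, while the server (modeled here by a plain sum standing in for SecAgg) never sees an individual clear-text update. The function names and parameters are my own.

```python
import numpy as np

def client_update(grad, sigma_total, num_clients, rng):
    """Perturb one client's d-dimensional update with Gaussian noise.

    Splitting the variance across clients (scale sigma_total/sqrt(n)
    per client) means the aggregate carries N(0, sigma_total^2) noise
    per coordinate, matching what a central curator would add.
    """
    per_client_sigma = sigma_total / np.sqrt(num_clients)
    return grad + rng.normal(0.0, per_client_sigma, size=grad.shape)

def secure_aggregate(noisy_updates):
    # Stand-in for SecAgg: in a real deployment the server learns
    # only this sum, via masking/cryptography, not the summands.
    return np.sum(noisy_updates, axis=0)
```

In a real SecAgg protocol the summation itself is done under pairwise masks or cryptographic shares; the statistical point, that per-client noise of scale σ/√n yields aggregate noise of scale σ, is what the sketch demonstrates.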
Differential privacy has become the de facto standard for defining and preserving privacy. It has had great success in scenarios of local data privacy and …
A Cheu - arXiv preprint arXiv:2107.11839, 2021 - arxiv.org
Differential privacy is often studied in one of two models. In the central model, a single analyzer has the responsibility of performing a privacy-preserving computation on data. But …
X Ma, X Sun, Y Wu, Z Liu, X Chen… - IEEE Transactions on …, 2022 - ieeexplore.ieee.org
Federated learning is a collaborative machine learning framework in which a global model is trained by different organizations under privacy restrictions. Promising as it is, privacy …
Federated Learning (FL) is a promising machine learning paradigm that enables the analyzer to train a model without collecting users' raw data. To ensure users' privacy …
It is well-known that general secure multi-party computation can in principle be applied to implement differentially private mechanisms over distributed data with utility matching the …
B Ghazi, R Kumar… - Advances in Neural …, 2021 - proceedings.neurips.cc
Most works in learning with differential privacy (DP) have focused on the setting where each user has a single sample. In this work, we consider the setting where each user holds $m$ …
We consider the computation of sparse, (ε, δ)-differentially private (DP) histograms in the two-server model of secure multi-party computation (MPC), which has recently gained …
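The sparse DP histogram problem in the last snippet can be illustrated with a single-curator sketch of the standard noise-and-threshold idea: add Laplace noise to each occupied bucket and release only buckets whose noisy count clears a threshold, so the output stays sparse. This is an assumption-laden simplification, the two-server MPC protocols compute an analogous release without either server seeing raw counts, and the names and parameters here are hypothetical.

```python
import numpy as np

def dp_sparse_histogram(counts, eps, threshold, rng):
    """Laplace-noise-plus-threshold sketch of a sparse DP histogram.

    counts: dict mapping bucket -> raw count (only occupied buckets).
    Only buckets whose noisy count exceeds `threshold` are released,
    which keeps the output sparse and suppresses rare buckets.
    """
    released = {}
    for bucket, c in counts.items():
        noisy = c + rng.laplace(0.0, 1.0 / eps)  # sensitivity-1 counts
        if noisy > threshold:
            released[bucket] = noisy
    return released
```

Heavy buckets survive thresholding with overwhelming probability, while buckets touched by few users are suppressed; choosing the threshold as a function of ε and δ is what makes the omission of small buckets itself privacy-safe.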