A Lowy, M Razaviyayn - arXiv preprint arXiv:2106.09779, 2021 - arxiv.org
This paper studies federated learning (FL) -- especially cross-silo FL -- with data from people who do not trust the server or other silos. In this setting, each silo (e.g., a hospital) has data from …
We examine the robustness and privacy of Bayesian inference, under assumptions on the prior, and with no modifications to the Bayesian framework. First, we generalise the concept …
We consider three different variants of differential privacy (DP), namely approximate DP, Rényi DP (RDP), and hypothesis test DP. In the first part, we develop a machinery for …
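As a minimal illustration of the first of these variants (approximate DP) -- a generic sketch, not drawn from the paper itself -- the Gaussian mechanism releases a statistic with (ε, δ)-DP by adding noise calibrated to its ℓ2-sensitivity; all function and parameter names below are ours:

```python
import numpy as np

def gaussian_mechanism(value, l2_sensitivity, epsilon, delta, rng=None):
    """Release `value` with (epsilon, delta)-DP via Gaussian noise.

    Uses the classical calibration
        sigma = l2_sensitivity * sqrt(2 * ln(1.25 / delta)) / epsilon,
    which is valid for epsilon <= 1. Illustrative sketch, not the paper's method.
    """
    rng = rng or np.random.default_rng()
    sigma = l2_sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    return value + rng.normal(0.0, sigma, size=np.shape(value))

# Example: privately release the mean of 1000 values in [0, 1].
data = np.random.default_rng(0).random(1000)
# Replacing one record changes the mean by at most 1/n, so that is the sensitivity.
noisy_mean = gaussian_mechanism(data.mean(), l2_sensitivity=1 / len(data),
                                epsilon=1.0, delta=1e-5)
```

The same mechanism also satisfies Rényi DP, with parameters expressible in closed form in sigma, which is one reason RDP is a convenient accounting language for Gaussian-noise algorithms.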
A Triastcyn, B Faltings - International Conference on …, 2020 - proceedings.mlr.press
Traditional differential privacy is independent of the data distribution. However, this is not well-matched with the modern machine learning context, where models are trained on …
N Papernot, A Thakurta, S Song, S Chien… - Proceedings of the …, 2021 - ojs.aaai.org
Because learning sometimes involves sensitive data, machine learning algorithms have been extended to offer differential privacy for training data. In practice, this has been mostly …
JT Wang, S Mahloujifar, S Wang… - Advances in Neural …, 2022 - proceedings.neurips.cc
Propose-Test-Release (PTR) is a differential privacy framework that works with the local sensitivity of functions instead of their global sensitivity. This framework is typically …
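For orientation, the classical PTR recipe can be sketched generically: propose a bound β on local sensitivity, privately test that the dataset is far from any dataset violating the bound, and only then release with noise scaled to β. This is a sketch of the textbook pattern under our own naming, not code from the paper:

```python
import math
import random

def propose_test_release(f_value, dist_to_unstable, beta, epsilon, delta, rng=None):
    """Generic PTR skeleton (illustrative; names are ours, not the paper's).

    f_value:          f(D), the exact query answer on the dataset.
    dist_to_unstable: how many records must change before the local
                      sensitivity of f could exceed the proposed bound `beta`.
    Returns a Laplace(beta/epsilon)-noised answer, or None ("refuse to answer")
    if the noisy stability test fails; overall this is (2*epsilon, delta)-DP.
    """
    rng = rng or random.Random()
    # Test step: noise the distance (Laplace(1/eps) as a difference of two
    # exponentials) and require it to clear the threshold ln(1/delta)/epsilon.
    noisy_dist = dist_to_unstable + rng.expovariate(epsilon) - rng.expovariate(epsilon)
    if noisy_dist <= math.log(1.0 / delta) / epsilon:
        return None
    # Release step: noise scaled to the *proposed* local bound beta,
    # which can be far smaller than the global sensitivity.
    return f_value + (beta / epsilon) * (rng.expovariate(1.0) - rng.expovariate(1.0))

# Usage: a stable dataset (distance 1000 from instability) passes the test.
answer = propose_test_release(5.0, dist_to_unstable=1000, beta=0.1,
                              epsilon=1.0, delta=1e-6, rng=random.Random(0))
```

The payoff is in the release step: when the data are stable, noise is calibrated to the small proposed bound rather than the worst-case global sensitivity.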
Most works in learning with differential privacy (DP) have focused on the setting where each user has a single sample. In this work, we consider the setting where each user holds $m$ …
In this paper we prove that the sample complexity of properly learning a class of Littlestone dimension d with approximate differential privacy is Õ(d^6), ignoring privacy and accuracy …
Federated learning (FL) is a common and practical framework for learning a machine learning model in a decentralized fashion. A primary motivation behind this decentralized approach is data …