Federated learning aims to collaboratively train a strong global model by accessing users' locally trained models rather than their raw data. A crucial step is therefore to aggregate local …
Remarkable improvements in the privacy-utility tradeoff of models trained on benchmark language and vision tasks have been widely reported when the model is pretrained on …
We explore the feasibility of using Generative Adversarial Networks (GANs) to automatically learn generative models that produce synthetic packet- and flow-header traces for networking …
Differentially private SGD (DP-SGD) is one of the most popular methods for solving differentially private empirical risk minimization (ERM). Due to its noisy perturbation on each …
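The noisy perturbation this snippet refers to is DP-SGD's core aggregation step: clip each per-example gradient to a fixed L2 norm, average, and add Gaussian noise. A minimal sketch of that step (the function name and signature here are illustrative, not from any of the cited papers):

```python
import numpy as np

def dp_sgd_step(per_example_grads, clip_norm, noise_multiplier, rng):
    """One DP-SGD aggregation step: clip each per-example gradient to
    L2 norm `clip_norm`, average, then add Gaussian noise with standard
    deviation `noise_multiplier * clip_norm / n`."""
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        # Scale down only if the gradient exceeds the clipping norm.
        clipped.append(g * min(1.0, clip_norm / max(norm, 1e-12)))
    n = len(per_example_grads)
    mean = np.mean(clipped, axis=0)
    noise = rng.normal(0.0, noise_multiplier * clip_norm / n, size=mean.shape)
    return mean + noise
```

With `noise_multiplier = 0` the step reduces to plain clipped-gradient averaging, which is a convenient way to sanity-check the clipping logic.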
Federated learning (FL) enables clients to collaborate with a server to train a machine learning model. To ensure privacy, the server performs secure aggregation of updates from …
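Secure aggregation lets the server recover only the sum of client updates, not any individual update. A common construction uses pairwise additive masks that cancel in the sum. Below is a minimal sketch of that masking idea; `seed_fn` stands in for a real pairwise key agreement (e.g. Diffie-Hellman), and all names are illustrative assumptions:

```python
import numpy as np

def masked_update(update, client_id, all_ids, seed_fn):
    """Add pairwise masks to a client's model update. For each peer j,
    both clients derive the same mask from a shared seed; the smaller
    id adds it and the larger id subtracts it, so the masks cancel
    when the server sums all masked updates."""
    masked = update.astype(float).copy()
    for j in all_ids:
        if j == client_id:
            continue
        shared = np.random.default_rng(seed_fn(min(client_id, j), max(client_id, j)))
        mask = shared.normal(size=update.shape)
        masked += mask if client_id < j else -mask
    return masked
```

Summing the masked updates from all clients yields the same result as summing the raw updates, while any single masked update looks random to the server.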
In this paper, we revisit the problem of using in-distribution public data to improve the privacy/utility trade-offs for differentially private (DP) model training. (Here, public data refers …
We study the problem of private distribution learning with access to public data. In this setup, which we refer to as *public-private learning*, the learner is given public and private …
We initiate the study of differentially private (DP) estimation with access to a small amount of public data. For private estimation of $d$-dimensional Gaussians, we assume that the …
In many statistical problems, incorporating priors can significantly improve performance. However, the use of prior knowledge in differentially private query release has remained …