The value of collaboration in convex machine learning with differential privacy

N Wu, F Farokhi, D Smith… - 2020 IEEE Symposium on …, 2020 - ieeexplore.ieee.org
In this paper, we apply machine learning to distributed private data owned by multiple data
owners, entities with access to non-overlapping training datasets. We use noisy …

Differentially-private deep learning from an optimization perspective

L Xiang, J Yang, B Li - IEEE INFOCOM 2019-IEEE Conference …, 2019 - ieeexplore.ieee.org
With the amount of user data crowdsourced for data mining dramatically increasing, there is
an urgent need to protect the privacy of individuals. Differential privacy mechanisms are …

Differential privacy for deep and federated learning: A survey

A El Ouadrhiri, A Abdelhadi - IEEE Access, 2022 - ieeexplore.ieee.org
Users' privacy is vulnerable at all stages of the deep learning process. Sensitive information
of users may be disclosed during data collection, during training, or even after releasing the …

Distributed learning without distress: Privacy-preserving empirical risk minimization

B Jayaraman, L Wang, D Evans… - Advances in Neural …, 2018 - proceedings.neurips.cc
Distributed learning allows a group of independent data owners to collaboratively learn a
model over their data sets without exposing their private data. We present a distributed …
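
As background for the kind of single-party building block that such distributed private ERM protocols compose, here is a hedged sketch of output perturbation for regularized ERM (in the style of Chaudhuri et al.). It is not claimed to be this paper's distributed construction; the optimizer, constants, and Gaussian-mechanism calibration are illustrative assumptions.

```python
# Illustrative single-party sketch: output perturbation for L2-regularized
# logistic regression. Not the distributed protocol of the paper above.
# Assumes labels y in {-1, +1} and features normalized so ||x|| <= 1, making
# the per-example loss 1-Lipschitz in w and the exact minimizer's L2
# sensitivity 2 / (n * lam).
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def private_erm_output_perturbation(X, y, lam, eps, delta, rng,
                                    steps=500, lr=0.5):
    """Fit regularized logistic regression by full-batch gradient descent,
    then release the weights with Gaussian noise calibrated (classic Gaussian
    mechanism) to the minimizer's sensitivity."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(steps):
        grad = X.T @ (-y * sigmoid(-y * (X @ w))) / n + lam * w
        w -= lr * grad
    sensitivity = 2.0 / (n * lam)
    sigma = sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / eps
    return w + rng.normal(0.0, sigma, size=d)
```

The sensitivity bound holds for the exact minimizer, so in practice the inner optimization must be run close to convergence before the noisy release.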

Neither private nor fair: Impact of data imbalance on utility and fairness in differential privacy

T Farrand, F Mireshghallah, S Singh… - Proceedings of the 2020 …, 2020 - dl.acm.org
Deployment of deep learning in different fields and industries is growing day by day due to
its performance, which relies on the availability of data and compute. Data is often crowd …

A generic framework for privacy preserving deep learning

T Ryffel, A Trask, M Dahl, B Wagner, J Mancuso… - arXiv preprint arXiv …, 2018 - arxiv.org
We detail a new framework for privacy preserving deep learning and discuss its assets. The
framework puts a premium on ownership and secure processing of data and introduces a …

Adversary instantiation: Lower bounds for differentially private machine learning

M Nasr, S Song, A Thakurta… - … IEEE Symposium on …, 2021 - ieeexplore.ieee.org
Differentially private (DP) machine learning allows us to train models on private data while
limiting data leakage. DP formalizes this data leakage through a cryptographic game, where …
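
For reference, the guarantee that such an adversarial game instantiates and lower-bounds is the standard (ε, δ)-differential-privacy inequality; this is textbook background rather than text drawn from the paper:

```latex
% Standard (epsilon, delta)-differential privacy for a randomized mechanism M;
% textbook background, not a quotation from the paper above.
\Pr[\mathcal{M}(D) \in S] \;\le\; e^{\varepsilon}\,\Pr[\mathcal{M}(D') \in S] + \delta
\qquad \text{for all adjacent datasets } D, D' \text{ and all } S \subseteq \operatorname{Range}(\mathcal{M}).
```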

AdaClip: Adaptive clipping for private SGD

V Pichapati, AT Suresh, FX Yu, SJ Reddi… - arXiv preprint arXiv …, 2019 - arxiv.org
Privacy preserving machine learning algorithms are crucial for learning models over user
data to protect sensitive information. Motivated by this, differentially private stochastic …
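
The snippet cuts off before the mechanism itself. As a hedged illustration of what adaptive clipping can look like in private SGD, the sketch below tracks a scalar clipping threshold against a target quantile of per-example gradient norms; this simplified rule and its parameter names are assumptions for illustration, not the coordinate-wise AdaClip transform of the paper.

```python
# Illustrative only: a scalar clipping threshold adapted toward a target
# quantile of per-example gradient norms. A simplified stand-in, not the
# coordinate-wise AdaClip scheme of the paper above.
import numpy as np

def clip_to_norm(grad, clip_norm):
    """Scale a gradient down so its L2 norm is at most clip_norm."""
    return grad * min(1.0, clip_norm / (np.linalg.norm(grad) + 1e-12))

def update_clip_norm(clip_norm, grad_norms, target_quantile=0.5, eta=0.2):
    """Geometric update: shrink the threshold when more than the target
    fraction of gradient norms already fall below it, grow it otherwise."""
    frac_below = float(np.mean(np.asarray(grad_norms) <= clip_norm))
    return clip_norm * np.exp(-eta * (frac_below - target_quantile))
```

In a fully differentially private pipeline, the fraction-below-threshold statistic would itself have to be released with noise; that accounting is omitted here.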

Differential privacy and machine learning: a survey and review

Z Ji, ZC Lipton, C Elkan - arXiv preprint arXiv:1412.7584, 2014 - arxiv.org
The objective of machine learning is to extract useful information from data, while privacy is
preserved by concealing information. Thus it seems hard to reconcile these competing …

Deep learning with differential privacy

M Abadi, A Chu, I Goodfellow, HB McMahan… - Proceedings of the …, 2016 - dl.acm.org
Machine learning techniques based on neural networks are achieving remarkable results in
a wide variety of domains. Often, the training of models requires large, representative …
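
Because the snippet stops before the algorithm, here is a minimal sketch of the clip-and-noise step that DP-SGD revolves around: clip each example's gradient to an L2 bound, sum, add Gaussian noise scaled to that bound, and average. The NumPy framing and the constants are illustrative assumptions rather than the paper's implementation, and the paper's moments-accountant bookkeeping of the cumulative (ε, δ) budget is omitted.

```python
# Minimal DP-SGD-style update sketch: per-example clipping plus Gaussian noise.
# Constants and the NumPy framing are assumptions for illustration.
import numpy as np

def dp_sgd_step(w, per_example_grads, rng, clip_norm=1.0,
                noise_multiplier=1.1, lr=0.1):
    """Clip each per-example gradient to L2 norm clip_norm, add noise with
    standard deviation noise_multiplier * clip_norm to the sum, then average."""
    clipped = [g * min(1.0, clip_norm / (np.linalg.norm(g) + 1e-12))
               for g in per_example_grads]
    noisy_sum = np.sum(clipped, axis=0) + rng.normal(
        0.0, noise_multiplier * clip_norm, size=w.shape)
    return w - lr * noisy_sum / len(per_example_grads)
```

A typical call passes the per-example gradients of one sampled minibatch along with a generator such as np.random.default_rng(), and repeats the step while an external accountant tracks the privacy budget.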