Differentially private synthetic data via foundation model APIs 1: Images

Z Lin, S Gopi, J Kulkarni, H Nori, S Yekhanin - arXiv preprint arXiv …, 2023 - arxiv.org
Generating differentially private (DP) synthetic data that closely resembles the original
private data is a scalable way to mitigate privacy concerns in the current data-driven world …

PrivImage: Differentially Private Synthetic Image Generation using Diffusion Models with Semantic-Aware Pretraining

K Li, C Gong, Z Li, Y Zhao, X Hou, T Wang - 33rd USENIX Security …, 2024 - usenix.org
Differential Privacy (DP) image data synthesis, which leverages the DP technique to
generate synthetic data to replace the sensitive data, allowing organizations to share and …

Differentially private latent diffusion models

S Lyu, MF Liu, M Vinaroz, M Park - arXiv preprint arXiv:2305.15759, 2023 - arxiv.org
Diffusion models (DMs) are widely used for generating high-quality high-dimensional
images in a non-differentially private manner. To address this challenge, recent papers …

Differentially private neural tangent kernels for privacy-preserving data generation

Y Yang, K Adamczewski, DJ Sutherland, X Li… - arXiv preprint arXiv …, 2023 - arxiv.org
Maximum mean discrepancy (MMD) is a particularly useful distance metric for differentially
private data generation: when used with finite-dimensional features it allows us to …
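The snippet above notes that MMD with finite-dimensional features is convenient for DP data generation. The reason, sketched below under assumptions (random Fourier features as the finite-dimensional map; all function names are hypothetical, not the paper's code), is that the private data enters only through a single mean embedding vector, which can be released once with Gaussian noise and then reused to train a generator:

```python
import numpy as np

def fourier_features(x, freqs):
    # Random Fourier features: a finite-dimensional feature map
    # approximating a shift-invariant kernel (illustrative choice).
    proj = x @ freqs                                # shape (n, D)
    feats = np.concatenate([np.cos(proj), np.sin(proj)], axis=1)
    return feats / np.sqrt(freqs.shape[1])          # each row has L2 norm 1

def dp_mean_embedding(private_x, freqs, sigma, rng):
    # Release the mean feature embedding of the private data once.
    # Each row of the embedding has norm <= 1, so under add/remove
    # adjacency the mean has L2 sensitivity ~ 1/n; Gaussian noise of
    # scale sigma/n gives a DP release (sigma set by the (eps, delta)
    # accounting, omitted here).
    n = private_x.shape[0]
    mean = fourier_features(private_x, freqs).mean(axis=0)
    return mean + rng.normal(0.0, sigma / n, size=mean.shape)

def mmd_sq(released_mean, synthetic_x, freqs):
    # Squared MMD between the noisy private embedding and the synthetic
    # data's embedding; a generator can minimize this without ever
    # touching the private data again.
    diff = released_mean - fourier_features(synthetic_x, freqs).mean(axis=0)
    return float(diff @ diff)
```

Because the noisy embedding is computed once, every subsequent generator update is post-processing and consumes no additional privacy budget.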

Meticulously selecting 1% of the dataset for pre-training! Generating differentially private images data with semantics query

K Li, C Gong, Z Li, Y Zhao, X Hou, T Wang - arXiv preprint arXiv …, 2023 - arxiv.org
Differential Privacy (DP) image data synthesis, which leverages the DP technique to
generate synthetic data to replace the sensitive data, allowing organizations to share and …

dp-promise: Differentially Private Diffusion Probabilistic Models for Image Synthesis

H Wang, S Pang, Z Lu, Y Rao, Y Zhou, M Xue - 2024 - usenix.org
Utilizing sensitive images (e.g., human faces) for training DL models raises privacy concerns.
One straightforward solution is to replace the private images with synthetic ones generated …

Differentially private gradient flow based on the sliced Wasserstein distance for non-parametric generative modeling

I Sebag, MS Pydi, JY Franceschi… - arXiv preprint arXiv …, 2023 - arxiv.org
Safeguarding privacy in sensitive training data is paramount, particularly in the context of
generative modeling. This is done through either differentially private stochastic gradient …

DP-RDM: Adapting Diffusion Models to Private Domains Without Fine-Tuning

J Lebensold, M Sanjabi, P Astolfi… - arXiv preprint arXiv …, 2024 - arxiv.org
Text-to-image diffusion models have been shown to suffer from sample-level memorization,
possibly reproducing near-perfect replicas of images that they are trained on, which may be …

DPAF: Image Synthesis via Differentially Private Aggregation in Forward Phase

CH Lin, CY Hsu, CM Yu, Y Cao, CY Huang - arXiv preprint arXiv …, 2023 - arxiv.org
Differentially private synthetic data is a promising alternative for sensitive data release. Many
differentially private generative models have been proposed in the literature. Unfortunately …

Clip Body and Tail Separately: High Probability Guarantees for DPSGD with Heavy Tails

H Sha, Y Cao, Y Liu, Y Wu, R Liu, H Chen - arXiv preprint arXiv …, 2024 - arxiv.org
Differentially Private Stochastic Gradient Descent (DPSGD) is widely utilized to preserve
training data privacy in deep learning, which first clips the gradients to a predefined norm …
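The clip-then-noise step that this snippet refers to is the standard DPSGD update (not the paper's heavy-tail body/tail variant, whose details are truncated here). A minimal sketch, with hypothetical names and per-sample gradients assumed to be already computed:

```python
import numpy as np

def dp_sgd_step(per_sample_grads, clip_norm, noise_multiplier, rng):
    # Standard DPSGD aggregation: clip each per-sample gradient to
    # L2 norm at most clip_norm, sum, add Gaussian noise with std
    # noise_multiplier * clip_norm, then average over the batch.
    norms = np.linalg.norm(per_sample_grads, axis=1, keepdims=True)
    factors = np.minimum(1.0, clip_norm / np.maximum(norms, 1e-12))
    clipped = per_sample_grads * factors
    noise = rng.normal(0.0, noise_multiplier * clip_norm,
                       size=per_sample_grads.shape[1])
    return (clipped.sum(axis=0) + noise) / len(per_sample_grads)
```

Clipping bounds each example's contribution (the sensitivity of the sum) so the Gaussian noise scale can be calibrated; the tension the paper targets is that a single clip_norm handles heavy-tailed gradient distributions poorly, biasing the few large "tail" gradients.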