Robust estimators in high-dimensions without the computational intractability

I Diakonikolas, G Kamath, D Kane, J Li, A Moitra… - SIAM Journal on …, 2019 - SIAM
We study high-dimensional distribution learning in an agnostic setting where an adversary is
allowed to arbitrarily corrupt an ε-fraction of the samples. Such questions have a rich history …

Being robust (in high dimensions) can be practical

I Diakonikolas, G Kamath, DM Kane… - International …, 2017 - proceedings.mlr.press
Robust estimation is much more challenging in high dimensions than it is in one dimension:
Most techniques either lead to intractable optimization problems or estimators that can …

Statistical query lower bounds for robust estimation of high-dimensional Gaussians and Gaussian mixtures

I Diakonikolas, DM Kane… - 2017 IEEE 58th Annual …, 2017 - ieeexplore.ieee.org
We describe a general technique that yields the first Statistical Query lower bounds for a
range of fundamental high-dimensional learning problems involving Gaussian distributions …

Mixture models, robustness, and sum of squares proofs

SB Hopkins, J Li - Proceedings of the 50th Annual ACM SIGACT …, 2018 - dl.acm.org
We use the Sum of Squares method to develop new efficient algorithms for learning well-
separated mixtures of Gaussians and robust mean estimation, both in high dimensions, that …

Hadamard response: Estimating distributions privately, efficiently, and with little communication

J Acharya, Z Sun, H Zhang - The 22nd International …, 2019 - proceedings.mlr.press
We study the problem of estimating k-ary distributions under ε-local differential
privacy. n samples are distributed across users who send privatized versions of their …

Differentially private release and learning of threshold functions

M Bun, K Nissim, U Stemmer… - 2015 IEEE 56th Annual …, 2015 - ieeexplore.ieee.org
We prove new upper and lower bounds on the sample complexity of (ε, δ)-differentially
private algorithms for releasing approximate answers to threshold functions. A threshold …

Unraveling the smoothness properties of diffusion models: A gaussian mixture perspective

Y Liang, Z Shi, Z Song, Y Zhou - arXiv preprint arXiv:2405.16418, 2024 - arxiv.org
Diffusion models have made rapid progress in generating high-quality samples across
various domains. However, a theoretical understanding of the Lipschitz continuity and …

Private mean estimation of heavy-tailed distributions

G Kamath, V Singhal, J Ullman - Conference on Learning …, 2020 - proceedings.mlr.press
We give new upper and lower bounds on the minimax sample complexity of differentially
private mean estimation of distributions with bounded k-th moments. Roughly speaking …

Robustly learning mixtures of k arbitrary Gaussians

A Bakshi, I Diakonikolas, H Jia, DM Kane… - Proceedings of the 54th …, 2022 - dl.acm.org
We give a polynomial-time algorithm for the problem of robustly estimating a mixture of k
arbitrary Gaussians in ℝ^d, for any fixed k, in the presence of a constant fraction of arbitrary …

Robustly learning a gaussian: Getting optimal error, efficiently

I Diakonikolas, G Kamath, DM Kane, J Li, A Moitra… - Proceedings of the …, 2018 - SIAM
We study the fundamental problem of learning the parameters of a high-dimensional
Gaussian in the presence of noise—where an ε-fraction of our samples were chosen by an …