We examine the relationship between learnability and robust learnability for the problem of distribution learning. We show that learnability implies robust learnability if the adversary …
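The excerpt stops before stating the adversary model. For orientation only, one common formalization of robust distribution learning (an assumption here, not necessarily the definition used in this work) is: a class $\mathcal{Q}$ is $\alpha$-robustly learnable if, whenever the data-generating distribution $p$ satisfies $\min_{q \in \mathcal{Q}} d_{\mathrm{TV}}(p, q) \le \eta$, a learner given sufficiently many i.i.d. samples from $p$ outputs $\hat{q}$ with $d_{\mathrm{TV}}(\hat{q}, p) \le \alpha\eta + \epsilon$ with probability at least $1 - \delta$; ordinary (realizable) learnability is the special case $\eta = 0$.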
We study the problem of robustly estimating the parameter $p$ of an Erdős-Rényi random graph on $n$ nodes, where a $\gamma$ fraction of nodes may be adversarially corrupted …
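The excerpt ends before describing the estimator. As a point of reference, here is a minimal Python sketch of a naive prune-then-average baseline: simulate $G(n, p)$, let a crude adversary rewire a $\gamma$ fraction of nodes, discard the highest-degree nodes, and read off the edge density of what remains. This defends only against the particular attack simulated here; it is not the estimator or the adversary model analyzed in the paper.

```python
import numpy as np

def sample_er(n, p, rng):
    """Adjacency matrix of an Erdos-Renyi random graph G(n, p)."""
    upper = np.triu(rng.random((n, n)) < p, k=1)
    return upper | upper.T

def corrupt_nodes(adj, gamma, rng):
    """A crude node adversary: a gamma fraction of nodes attach to every other node."""
    adj = adj.copy()
    n = adj.shape[0]
    bad = rng.choice(n, size=int(np.ceil(gamma * n)), replace=False)
    adj[bad, :] = True
    adj[:, bad] = True
    np.fill_diagonal(adj, False)
    return adj

def prune_then_density(adj, gamma):
    """Drop the ceil(gamma * n) highest-degree nodes, then estimate p as the
    edge density of the subgraph induced by the surviving nodes."""
    n = adj.shape[0]
    k = int(np.ceil(gamma * n))
    keep = np.argsort(adj.sum(axis=1))[: n - k]   # lowest-degree nodes survive
    sub = adj[np.ix_(keep, keep)]
    m = len(keep)
    return sub.sum() / (m * (m - 1))              # sub.sum() counts each edge twice

rng = np.random.default_rng(0)
adj = corrupt_nodes(sample_er(n=1500, p=0.05, rng=rng), gamma=0.1, rng=rng)
print(prune_then_density(adj, gamma=0.1))         # roughly 0.05 against this crude attack
```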
S Bhatt, G Fang, P Li… - … Conference on Machine …, 2022 - proceedings.mlr.press
We present a new finite-sample analysis of Catoni's M-estimator under adversarial contamination, where an adversary is allowed to corrupt a fraction of the samples arbitrarily …
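For context, the sketch below implements the basic Catoni M-estimator of the mean: the estimate is the root $\hat{\theta}$ of $\sum_i \psi\big((x_i - \hat{\theta})/s\big) = 0$ with $\psi(x) = \mathrm{sign}(x)\log(1 + |x| + x^2/2)$. The choice of scale $s$ (fixed by hand here) and the toy contamination are illustrative assumptions; the finite-sample analysis under adversarial contamination is the paper's contribution, not this snippet's.

```python
import numpy as np
from scipy.optimize import brentq

def catoni_psi(x):
    """Catoni's influence function: sign(x) * log(1 + |x| + x^2 / 2)."""
    return np.sign(x) * np.log1p(np.abs(x) + 0.5 * x ** 2)

def catoni_mean(samples, scale):
    """Catoni's M-estimator of the mean: the unique root theta of
    sum_i psi((x_i - theta) / scale) = 0 (the score is decreasing in theta)."""
    x = np.asarray(samples, dtype=float)

    def score(theta):
        return catoni_psi((x - theta) / scale).sum()

    # The score changes sign between the extremes, so bisection-type root finding works.
    return brentq(score, x.min() - 1.0, x.max() + 1.0)

rng = np.random.default_rng(1)
x = rng.standard_t(df=3, size=1000)   # heavy-tailed samples with true mean 0
x[:20] = 1e3                          # a 2% fraction corrupted by an adversary
print(np.mean(x))                     # dragged to roughly 20 by the corruption
print(catoni_mean(x, scale=5.0))      # pulled far less than the empirical mean
```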
G Kamath - arXiv preprint arXiv:2412.02670, 2024 - arxiv.org
The last decade has seen a number of advances in computationally efficient algorithms for statistical methods subject to robustness constraints. An estimator may be robust in a …
Y Luo, C Gao - arXiv preprint arXiv:2410.22647, 2024 - arxiv.org
This paper studies the construction of adaptive confidence intervals under Huber's contamination model when the contamination proportion is unknown. For the robust …
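For reference, Huber's $\epsilon$-contamination model posits samples drawn from $(1 - \epsilon)P_\theta + \epsilon Q$ with $Q$ arbitrary. The toy simulation below only contrasts how the sample mean and the sample median react to contamination of unknown proportion; the point-mass $Q$, the parameter values, and the median as the location estimate are illustrative assumptions, not the paper's adaptive confidence-interval construction.

```python
import numpy as np

def huber_sample(n, theta, eps, rng):
    """Draw n points from the Huber model (1 - eps) * N(theta, 1) + eps * Q,
    where the contaminating distribution Q is taken to be a point mass at +50."""
    x = rng.normal(theta, 1.0, size=n)
    x[rng.random(n) < eps] = 50.0
    return x

rng = np.random.default_rng(2)
x = huber_sample(n=5000, theta=0.0, eps=0.05, rng=rng)
print(np.mean(x))    # biased by roughly eps * 50 = 2.5
print(np.median(x))  # stays within O(eps) of theta = 0
```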
Approximating distributions from their samples is a canonical statistical-learning problem. One of its most powerful and successful modalities approximates every distribution to an …
Modern applications, including natural language processing, sensor networks, collaborative filtering, and federated learning, necessitate data collection from diverse sources. However …
Machine learning and statistical algorithms are now implemented at a large scale in almost every aspect of our society, significantly impacting our daily lives through their performance …
The fundamental theorem of statistical learning establishes the equivalence between various notions of both agnostic and realizable Probably Approximately Correct (PAC) …
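For reference, one standard quantitative form of this equivalence for binary classification (as stated in standard texts; the exact formulation used in the excerpted work may differ): a hypothesis class $\mathcal{H}$ is agnostically PAC learnable iff it is PAC learnable iff it has finite VC dimension $d$, in which case the agnostic sample complexity is $\Theta\!\left(\frac{d + \log(1/\delta)}{\epsilon^{2}}\right)$ and the realizable sample complexity is $O\!\left(\frac{d\log(1/\epsilon) + \log(1/\delta)}{\epsilon}\right)$, with empirical risk minimization achieving these rates.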