In this work, we aim to characterize the statistical complexity of realizable regression both in the PAC learning setting and the online learning setting. Previous work had established the …
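For reference, the realizable assumption this snippet refers to is standardly stated as follows (a generic formulation; the paper's exact loss may differ):

\[
\exists\, h^\star \in \mathcal{H} \;:\; \mathbb{E}_{(x,y)\sim \mathcal{D}}\big[\ell(h^\star(x), y)\big] = 0,
\]

e.g. with the absolute loss $\ell(\hat y, y) = |\hat y - y|$; the goal is then to bound how many samples (or online mistakes) are needed to find $h \in \mathcal{H}$ with small expected loss.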
Deriving generalization bounds for stable algorithms is a classical question in learning theory, with roots in the early works of Vapnik and Chervonenkis (1974) and Rogers …
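For reference, the stability notion at stake is uniform stability (standard definition, stated here for a learning rule $A$ and a loss $\ell$ over examples $z \in \mathcal{Z}$): $A$ is $\gamma$-uniformly stable if for every pair of datasets $S, S' \in \mathcal{Z}^n$ differing in a single example,

\[
\sup_{z \in \mathcal{Z}} \big|\, \ell(A(S), z) - \ell(A(S'), z) \,\big| \;\le\; \gamma .
\]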
In this work, we investigate the expressiveness of the "conditional mutual information" (CMI) framework of Steinke and Zakynthinou (2020) and the prospect of using it to provide a …
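For reference, the CMI of an algorithm $A$ in Steinke and Zakynthinou's framework is

\[
\mathrm{CMI}_{\mathcal{D}}(A) \;=\; I\big(A(\tilde Z_U);\, U \,\big|\, \tilde Z\big),
\]

where $\tilde Z \in \mathcal{Z}^{n \times 2}$ holds $2n$ i.i.d. samples from $\mathcal{D}$ and $U \in \{1,2\}^n$ is uniform, selecting one sample per row to form the training set $\tilde Z_U$. For losses in $[0,1]$, the expected generalization gap is at most $\sqrt{2\,\mathrm{CMI}_{\mathcal{D}}(A)/n}$.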
The sharpest known high-probability generalization bounds for uniformly stable algorithms (Feldman, Vondrak, NeurIPS 2018, COLT 2019), (Bousquet, Klochkov, Zhivotovskiy, COLT …
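Up to constants, the sharpest bounds referenced here take the following form (stated for a $\gamma$-uniformly stable algorithm with loss in $[0,1]$; constants omitted): with probability at least $1-\delta$,

\[
\big|\, R(A(S)) - \widehat{R}(A(S)) \,\big| \;\lesssim\; \gamma \log n \,\log\tfrac{1}{\delta} \;+\; \sqrt{\tfrac{\log(1/\delta)}{n}},
\]

where $R$ and $\widehat{R}$ denote population and empirical risk.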
Z Wang, Y Mao - Advances in Neural Information …, 2024 - proceedings.neurips.cc
We present new information-theoretic generalization guarantees through a novel construction of the "neighboring-hypothesis" matrix and a new family of stability notions …
M Sefidgaran, A Zaidi… - Advances in Neural …, 2024 - proceedings.neurips.cc
A major challenge in designing efficient statistical supervised learning algorithms is finding representations that perform well not only on available training samples but also on unseen …
MC Campi, S Garatti - Journal of Machine Learning Research, 2023 - jmlr.org
A compression function is a map that slims down an observational set into a subset of reduced size, while preserving its informational content. In multiple applications, the …
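As a toy illustration of this definition (not the paper's construction): in one dimension, the smallest interval covering a set of observations is pinned down by just two of them, so mapping a dataset to its min and max is a compression function that preserves the learned hypothesis exactly.

# Toy compression function: the learned interval is fully recoverable
# from a two-point subset of the data (illustrative, not from the paper).

def compress(points):
    """Map the observations to the subset that determines the interval."""
    return {min(points), max(points)}

def reconstruct(subset):
    """Rebuild the hypothesis (an interval) from the compressed subset."""
    return (min(subset), max(subset))

points = [0.7, 0.2, 0.9, 0.4]
assert reconstruct(compress(points)) == (min(points), max(points))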
O Montasser, S Hanneke… - Conference on Learning …, 2021 - proceedings.mlr.press
We study the problem of learning predictors that are robust to adversarial examples with respect to an unknown perturbation set, relying instead on interaction with an adversarial …
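The interaction the abstract alludes to can be sketched as a counterexample-driven loop (a schematic under assumed interfaces, not the paper's algorithm): fit a predictor, ask an adversarial oracle for perturbed examples the predictor gets wrong, and fold those back into training.

# Schematic interactive robust learning; `fit` and `adversary` below are toy stand-ins.

def robust_learn(train, fit, adversary, rounds=10):
    data = list(train)
    h = fit(data)
    for _ in range(rounds):
        counterexamples = adversary(h, data)  # perturbed points h misclassifies
        if not counterexamples:               # adversary finds no attack: stop
            break
        data.extend(counterexamples)
        h = fit(data)
    return h

# Toy instantiation: 1-D threshold learner; the adversary may shift inputs by +/- eps.
def fit(data):
    pos = [x for x, y in data if y == 1]
    neg = [x for x, y in data if y == 0]
    t = (max(neg) + min(pos)) / 2
    return lambda x: 1 if x >= t else 0

def adversary(h, data, eps=0.3):
    return [(x + dx, y) for x, y in data for dx in (-eps, eps) if h(x + dx) != y]

h = robust_learn([(0.0, 0), (1.0, 0), (3.0, 1), (4.0, 1)], fit, adversary)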
D Paccagnan, M Campi… - Advances in Neural …, 2024 - proceedings.neurips.cc
Generalization bounds are valuable both for theory and applications. On the one hand, they shed light on the mechanisms that underpin the learning processes; on the other, they certify …