User-friendly introduction to PAC-Bayes bounds

P Alquier - Foundations and Trends® in Machine Learning, 2024 - nowpublishers.com
Aggregated predictors are obtained by making a set of basic predictors vote according to
some weights, that is, to some probability distribution. Randomized predictors are obtained …

Generalization bounds: Perspectives from information theory and PAC-Bayes

F Hellström, G Durisi, B Guedj… - … and Trends® in …, 2025 - nowpublishers.com
A fundamental question in theoretical machine learning is generalization. Over the past
decades, the PAC-Bayesian approach has been established as a flexible framework to …

A unified recipe for deriving (time-uniform) PAC-Bayes bounds

B Chugg, H Wang, A Ramdas - Journal of Machine Learning Research, 2023 - jmlr.org
We present a unified framework for deriving PAC-Bayesian generalization bounds. Unlike
most previous literature on this topic, our bounds are anytime-valid (i.e., time-uniform) …

When is memorization of irrelevant training data necessary for high-accuracy learning?

G Brown, M Bun, V Feldman, A Smith… - Proceedings of the 53rd …, 2021 - dl.acm.org
Modern machine learning models are complex and frequently encode surprising amounts of
information about individual inputs. In extreme cases, complex models appear to memorize …

Statistical indistinguishability of learning algorithms

A Kalavasis, A Karbasi, S Moran… - … on Machine Learning, 2023 - proceedings.mlr.press
When two different parties use the same learning rule on their own data, how can we test
whether the distributions of the two outcomes are similar? In this paper, we study the …

Towards a unified information-theoretic framework for generalization

M Haghifam, GK Dziugaite… - Advances in Neural …, 2021 - proceedings.neurips.cc
In this work, we investigate the expressiveness of the "conditional mutual information" (CMI)
framework of Steinke and Zakynthinou (2020) and the prospect of using it to provide a …

Sample-conditioned hypothesis stability sharpens information-theoretic generalization bounds

Z Wang, Y Mao - Advances in Neural Information …, 2024 - proceedings.neurips.cc
We present new information-theoretic generalization guarantees through a novel
construction of the "neighboring-hypothesis" matrix and a new family of stability notions …

The Bayesian stability zoo

S Moran, H Schefler, J Shafer - Advances in Neural …, 2023 - proceedings.neurips.cc
We show that many definitions of stability found in the learning theory literature are
equivalent to one another. We distinguish between two families of definitions of stability …

Limitations of information-theoretic generalization bounds for gradient descent methods in stochastic convex optimization

M Haghifam, B Rodríguez-Gálvez… - International …, 2023 - proceedings.mlr.press
To date, no “information-theoretic” frameworks for reasoning about generalization error have
been shown to establish minimax rates for gradient descent in the setting of stochastic …

Integral probability metrics PAC-Bayes bounds

R Amit, B Epstein, S Moran… - Advances in Neural …, 2022 - proceedings.neurips.cc
We present a PAC-Bayes-style generalization bound which enables the replacement of the
KL-divergence with a variety of Integral Probability Metrics (IPM). We provide instances of …