A fundamental question in theoretical machine learning is generalization. Over the past decades, the PAC-Bayesian approach has been established as a flexible framework to …
We present a unified framework for deriving PAC-Bayesian generalization bounds. Unlike most previous literature on this topic, our bounds are anytime-valid (i.e., time-uniform) …
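For orientation, a standard fixed-sample PAC-Bayes-kl bound (in the style of Seeger and Maurer) for losses in $[0,1]$ is sketched below; this is background only, not the time-uniform bounds of the entry above. With a prior $\pi$, a confidence level $\delta \in (0,1)$, and an i.i.d. sample $S$ of size $n$, with probability at least $1-\delta$, simultaneously for all posteriors $\rho$,
\[
\mathrm{kl}\!\left(\hat L_S(\rho)\,\middle\|\,L(\rho)\right) \;\le\; \frac{\mathrm{KL}(\rho\,\|\,\pi) + \ln\frac{2\sqrt{n}}{\delta}}{n},
\]
where $\hat L_S(\rho)$ and $L(\rho)$ denote the $\rho$-averaged empirical and population risks and $\mathrm{kl}(\cdot\,\|\,\cdot)$ is the binary relative entropy.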
Modern machine learning models are complex and frequently encode surprising amounts of information about individual inputs. In extreme cases, complex models appear to memorize …
When two different parties use the same learning rule on their own data, how can we test whether the distributions of the two outcomes are similar? In this paper, we study the …
In this work, we investigate the expressiveness of the "conditional mutual information" (CMI) framework of Steinke and Zakynthinou (2020) and the prospect of using it to provide a …
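As background, the CMI quantity of Steinke and Zakynthinou (2020) can be sketched as follows (a standard statement of the framework, not the specific results of the entry above). Draw a supersample $\tilde Z \in \mathcal Z^{n\times 2}$ of $2n$ i.i.d. points and selectors $U \sim \mathrm{Unif}(\{0,1\}^n)$ that pick one point from each pair to form the training set $\tilde Z_U$; the CMI of an algorithm $A$ is
\[
\mathrm{CMI}_{\mathcal D}(A) \;=\; I\!\left(A(\tilde Z_U);\,U\,\middle|\,\tilde Z\right),
\]
and for losses bounded in $[0,1]$ the expected generalization gap of $A$ is at most $\sqrt{2\,\mathrm{CMI}_{\mathcal D}(A)/n}$.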
Z Wang, Y Mao - Advances in Neural Information …, 2024 - proceedings.neurips.cc
We present new information-theoretic generalization guarantees through a novel construction of the "neighboring-hypothesis" matrix and a new family of stability notions …
We show that many definitions of stability found in the learning theory literature are equivalent to one another. We distinguish between two families of definitions of stability …
To date, no “information-theoretic” frameworks for reasoning about generalization error have been shown to establish minimax rates for gradient descent in the setting of stochastic …
R Amit, B Epstein, S Moran… - Advances in Neural …, 2022 - proceedings.neurips.cc
We present a PAC-Bayes-style generalization bound which enables the replacement of the KL-divergence with a variety of Integral Probability Metrics (IPMs). We provide instances of …
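For reference, the generic definition of an IPM that such bounds substitute for the KL term (a standard definition, not the specific bound of the entry above): for a class $\mathcal F$ of witness functions,
\[
\gamma_{\mathcal F}(\rho,\pi) \;=\; \sup_{f \in \mathcal F}\,\bigl|\,\mathbb E_{h\sim\rho} f(h) - \mathbb E_{h\sim\pi} f(h)\,\bigr|,
\]
which recovers, e.g., the total variation distance for $\mathcal F = \{f : 0 \le f \le 1\}$ and the 1-Wasserstein distance for $\mathcal F$ the class of 1-Lipschitz functions.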