A fundamental question in theoretical machine learning is generalization. Over the past decades, the PAC-Bayesian approach has been established as a flexible framework to …
To date, no “information-theoretic” frameworks for reasoning about generalization error have been shown to establish minimax rates for gradient descent in the setting of stochastic …
Understanding generalization in modern machine learning settings has been one of the major challenges in statistical learning theory. In this context, recent years have witnessed …
The emergence of powerful deep learning architectures has resulted in breakthrough innovations in several fields such as healthcare, precision farming, banking, education, and …
We provide an information-theoretic analysis of the generalization ability of Gibbs-based transfer learning algorithms by focusing on two popular empirical risk minimization (ERM) …
Generalization error bounds are essential to understanding machine learning algorithms. This paper presents novel expected generalization error upper bounds based on the …
Various approaches have been developed to upper bound the generalization error of a supervised learning algorithm. However, existing bounds are often loose and even vacuous …
M Sefidgaran, A Zaidi - IEEE Transactions on Information …, 2024 - ieeexplore.ieee.org
In this paper, we establish novel data-dependent upper bounds on the generalization error through the lens of a “variable-size compressibility” framework that we newly introduce here …
The ability of machine learning (ML) algorithms to generalize well to unseen data has been studied through the lens of information theory, by bounding the generalization error with the …
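The information-theoretic bounds this snippet refers to typically take the form of the Xu–Raginsky bound: for a learning algorithm producing hypothesis $W$ from a training sample $S$ of $n$ i.i.d. points, with a loss that is $\sigma$-sub-Gaussian under the data distribution, the expected generalization gap is controlled by the mutual information $I(W;S)$. This is stated as a representative member of the family; the specific bounds in the truncated abstract may differ.

```latex
\left| \,\mathbb{E}\big[\operatorname{gen}(W,S)\big] \right|
  \;\le\; \sqrt{\frac{2\sigma^{2}}{n}\, I(W;S)},
\qquad
\operatorname{gen}(W,S) \;=\; L_{\mu}(W) - L_{S}(W),
```

where $L_{\mu}$ denotes the population risk and $L_{S}$ the empirical risk on the sample $S$.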