User-friendly introduction to PAC-Bayes bounds

P Alquier - Foundations and Trends® in Machine Learning, 2024 - nowpublishers.com
Aggregated predictors are obtained by making a set of basic predictors vote according to
some weights, that is, to some probability distribution. Randomized predictors are obtained …
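
To make the distinction concrete, here is a minimal sketch in Python, assuming a hypothetical set of three basic predictors and a fixed weight vector (both ours, not the monograph's): the aggregated predictor takes a weighted average of the votes, while the randomized predictor samples a single basic predictor from the same distribution.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical basic predictors and a probability distribution over them.
    predictors = [lambda x: x, lambda x: x**2, lambda x: -x]
    weights = np.array([0.5, 0.3, 0.2])

    def aggregated_predict(x):
        """Aggregated predictor: a weighted vote (here, a weighted average)."""
        return sum(w * h(x) for w, h in zip(weights, predictors))

    def randomized_predict(x):
        """Randomized predictor: draw one basic predictor from the weights."""
        h = predictors[rng.choice(len(predictors), p=weights)]
        return h(x)

    print(aggregated_predict(2.0))  # deterministic: 0.5*2 + 0.3*4 + 0.2*(-2) = 1.8
    print(randomized_predict(2.0))  # stochastic: one of 2.0, 4.0 or -2.0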

Generalization bounds: Perspectives from information theory and PAC-Bayes

F Hellström, G Durisi, B Guedj, M Raginsky - Foundations and Trends® in Machine Learning, 2025 - nowpublishers.com
A fundamental question in theoretical machine learning is generalization. Over the past
decades, the PAC-Bayesian approach has been established as a flexible framework to …
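
A representative member of this family, included here for orientation (stated for a loss bounded in [0, 1]; the notation is ours, not the survey's), is McAllester's bound: with probability at least $1-\delta$ over an i.i.d. sample $S$ of size $n$, simultaneously for all posteriors $\rho$ and a prior $\pi$ fixed before seeing the data,

$$\mathbb{E}_{h \sim \rho}[L(h)] \le \mathbb{E}_{h \sim \rho}[\hat{L}_S(h)] + \sqrt{\frac{\mathrm{KL}(\rho \,\|\, \pi) + \ln(2\sqrt{n}/\delta)}{2n}}.$$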

Learning with limited samples: Meta-learning and applications to communication systems

L Chen, ST Jose, I Nikoloska, S Park, T Chen, O Simeone - Foundations and Trends® in Signal Processing, 2023 - nowpublishers.com
Deep learning has achieved remarkable success in many machine learning tasks such as
image classification, speech recognition, and game playing. However, these breakthroughs …

Evaluated CMI bounds for meta learning: Tightness and expressiveness

F Hellström, G Durisi - Advances in Neural Information Processing Systems, 2022 - proceedings.neurips.cc
Recent work has established that the conditional mutual information (CMI) framework of
Steinke and Zakynthinou (2020) is expressive enough to capture generalization guarantees …
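
In this framework, assuming the standard setup (notation ours): a supersample $\tilde{Z} \in \mathcal{Z}^{n \times 2}$ of $2n$ i.i.d. points is drawn, and selector bits $U \sim \mathrm{Uniform}(\{0,1\}^n)$ pick one point per row as the training sample; for a loss bounded in $[0, 1]$, the expected generalization gap of the learned hypothesis $W$ satisfies

$$\mathbb{E}[\mathrm{gen}(W, S)] \le \sqrt{\frac{2\, I(W; U \mid \tilde{Z})}{n}},$$

where the conditional mutual information $I(W; U \mid \tilde{Z})$ is always finite (at most $n \ln 2$), even when the unconditional mutual information $I(W; S)$ is not.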

Is Bayesian model-agnostic meta learning better than model-agnostic meta learning, provably?

L Chen, T Chen - International Conference on Artificial Intelligence and Statistics, 2022 - proceedings.mlr.press
Meta learning aims at learning a model that can quickly adapt to unseen tasks. Widely used
meta-learning methods include model-agnostic meta-learning (MAML), implicit MAML …
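
As a reference point for the comparison, here is a first-order MAML sketch in Python (the toy task family and all names are ours; full MAML would also backpropagate through the inner update):

    import numpy as np

    rng = np.random.default_rng(0)

    def sample_task():
        """Hypothetical task family: 1-D linear regression y = a*x, random slope a."""
        a = rng.uniform(-2.0, 2.0)
        def batch(n=10):
            x = rng.normal(size=n)
            return x, a * x
        return batch

    def mse_grad(w, x, y):
        # Gradient of 0.5 * mean((w*x - y)**2) with respect to the scalar weight w.
        return np.mean((w * x - y) * x)

    w, inner_lr, outer_lr = 0.0, 0.1, 0.01  # meta-initialization and step sizes

    for _ in range(2000):
        batch = sample_task()
        x_s, y_s = batch()  # support set: drives the inner adaptation step
        x_q, y_q = batch()  # query set: evaluates the adapted parameters
        w_adapt = w - inner_lr * mse_grad(w, x_s, y_s)  # inner loop: one SGD step
        # Outer loop (first-order approximation): move the initialization along
        # the query-set gradient evaluated at the adapted parameters.
        w -= outer_lr * mse_grad(w_adapt, x_q, y_q)

    print(w)  # learned initialization (near 0 for this symmetric task family)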

Information-theoretic analysis of unsupervised domain adaptation

Z Wang, Y Mao - arXiv preprint arXiv:2210.00706, 2022 - arxiv.org
This paper uses information-theoretic tools to analyze the generalization error in
unsupervised domain adaptation (UDA). We present novel upper bounds for two notions of …
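
For context on the kind of result meant here (this is classical background, not the paper's UDA-specific bounds): if the loss is $\sigma$-sub-Gaussian under the data distribution, the input-output mutual information bound of Xu and Raginsky (2017) gives

$$\big|\mathbb{E}[\mathrm{gen}(W, S)]\big| \le \sqrt{\frac{2\sigma^2\, I(W; S)}{n}},$$

and analyses of domain adaptation typically add terms measuring the mismatch between source and target distributions.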

Bayes meets Bernstein at the meta level: an analysis of fast rates in meta-learning with PAC-Bayes

C Riou, P Alquier, BE Chérief-Abdellatif - arXiv preprint arXiv:2302.11709, 2023 - arxiv.org
Bernstein's condition is a key assumption that guarantees fast rates in machine learning. For
example, the Gibbs algorithm with prior $\pi$ has an excess risk in $O(d_\pi/n)$, as …
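
Spelled out in standard notation (ours, not necessarily the paper's): Bernstein's condition with constant $c$ requires $\mathbb{E}[(\ell(h, Z) - \ell(h^*, Z))^2] \le c\,(R(h) - R(h^*))$ for all $h$, where $h^*$ is a risk minimizer; and the Gibbs algorithm with prior $\pi$ and inverse temperature $\lambda$ samples from the posterior $\hat{\rho}_\lambda(dh) \propto \exp(-\lambda n \hat{L}_S(h))\, \pi(dh)$.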

Learning an explicit hyper-parameter prediction function conditioned on tasks

J Shu, D Meng, Z Xu - The Journal of Machine Learning Research, 2023 - dl.acm.org
Meta learning has recently attracted much attention in the machine learning community.
Contrary to conventional machine learning, which aims to learn inherent prediction rules to …
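
A minimal sketch of the general idea in Python (the architecture and every name here are hypothetical illustrations, not the paper's method): instead of fixing one hyper-parameter shared across all tasks, a small prediction function maps task features to task-specific hyper-parameters.

    import numpy as np

    rng = np.random.default_rng(0)

    # Parameters of a hypothetical linear function g(task) -> hyper-parameter.
    W = rng.normal(scale=0.1, size=(1, 4))

    def predict_hyperparams(task_features):
        """Map a task-feature vector to a learning rate; softplus keeps it positive."""
        z = W @ task_features
        return np.log1p(np.exp(z))  # softplus

    task = rng.normal(size=4)  # e.g., summary statistics of one task's data
    print(predict_hyperparams(task))  # a task-conditioned learning rate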

PAC-Bayes bounds for bandit problems: A survey and experimental comparison

H Flynn, D Reeb, M Kandemir, J Peters - IEEE Transactions on Pattern Analysis and Machine Intelligence, 2023 - ieeexplore.ieee.org
PAC-Bayes has recently re-emerged as an effective theory with which one can derive
principled learning algorithms with tight performance guarantees. However, applications of …

An information-theoretic analysis of the impact of task similarity on meta-learning

ST Jose, O Simeone - 2021 IEEE International Symposium on Information Theory, 2021 - ieeexplore.ieee.org
Meta-learning aims at optimizing the hyperparameters of a model class or training algorithm
from the observation of data from a number of related tasks. Following the setting of Baxter …
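
In Baxter's setting (sketched here in our notation), tasks $\tau$ are drawn from an environment $\mathcal{T}$, each task is a data distribution $P_\tau$, and the meta-learner chooses hyperparameters $u$ to minimize the transfer risk

$$\min_u\; \mathbb{E}_{\tau \sim \mathcal{T}}\, \mathbb{E}_{S_\tau \sim P_\tau^{\otimes m}}\big[ L_{P_\tau}\big(A_u(S_\tau)\big) \big],$$

where $A_u$ is the base training algorithm run with hyperparameters $u$ on the per-task sample $S_\tau$.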