Generalization bounds: Perspectives from information theory and PAC-Bayes

F Hellström, G Durisi, B Guedj… - … and Trends® in …, 2025 - nowpublishers.com
A fundamental question in theoretical machine learning is generalization. Over the past
decades, the PAC-Bayesian approach has been established as a flexible framework to …

Meta-learning approaches for learning-to-learn in deep learning: A survey

Y Tian, X Zhao, W Huang - Neurocomputing, 2022 - Elsevier
Compared to traditional machine learning, deep learning can learn deeper abstract data
representation and understand scattered data properties. It has gained considerable …

Out-of-domain robustness via targeted augmentations

I Gao, S Sagawa, PW Koh… - International …, 2023 - proceedings.mlr.press
Models trained on one set of domains often suffer performance drops on unseen
domains, e.g., when wildlife monitoring models are deployed in new camera locations. In this …

Generalization bounds for meta-learning: An information-theoretic analysis

Q Chen, C Shui, M Marchand - Advances in Neural …, 2021 - proceedings.neurips.cc
We derive a novel information-theoretic analysis of the generalization property of meta-
learning algorithms. Concretely, our analysis proposes a generic understanding in both the …

Learning with limited samples: Meta-learning and applications to communication systems

L Chen, ST Jose, I Nikoloska, S Park… - … and Trends® in …, 2023 - nowpublishers.com
Deep learning has achieved remarkable success in many machine learning tasks such as
image classification, speech recognition, and game playing. However, these breakthroughs …

Scalable PAC-Bayesian meta-learning via the PAC-optimal hyper-posterior: from theory to practice

J Rothfuss, M Josifoski, V Fortuin… - The Journal of Machine …, 2023 - dl.acm.org
Meta-Learning aims to speed up the learning process on new tasks by acquiring useful
inductive biases from datasets of related learning tasks. While, in practice, the number of …

Understanding benign overfitting in gradient-based meta learning

L Chen, S Lu, T Chen - Advances in neural information …, 2022 - proceedings.neurips.cc
Meta learning has demonstrated tremendous success in few-shot learning with limited
supervised data. In those settings, the meta model is usually overparameterized. While the …

Individually conditional individual mutual information bound on generalization error

R Zhou, C Tian, T Liu - IEEE Transactions on Information …, 2022 - ieeexplore.ieee.org
We propose an information-theoretic bound on the generalization error based on a
combination of the error decomposition technique of Bu et al. and the conditional mutual …

Online meta-learning for hybrid model-based deep receivers

T Raviv, S Park, O Simeone, YC Eldar… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
Recent years have witnessed growing interest in the application of deep neural networks
(DNNs) for receiver design, which can potentially be applied in complex environments …

Evaluated CMI bounds for meta learning: Tightness and expressiveness

F Hellström, G Durisi - Advances in Neural Information …, 2022 - proceedings.neurips.cc
Recent work has established that the conditional mutual information (CMI) framework of
Steinke and Zakynthinou (2020) is expressive enough to capture generalization guarantees …