A fundamental question in theoretical machine learning is generalization. Over the past decades, the PAC-Bayesian approach has been established as a flexible framework to …
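As a reference point for the PAC-Bayesian framework mentioned above, one canonical bound in the McAllester/Seeger form (notation is assumed here, not taken from the snippet):

```latex
% With probability at least 1 - \delta over an i.i.d. sample S of size n,
% simultaneously for every posterior \rho over hypotheses, given a prior \pi:
\mathrm{kl}\!\big(\widehat{R}_S(\rho)\,\big\|\,R_D(\rho)\big)
  \;\le\; \frac{\mathrm{KL}(\rho\,\|\,\pi) + \ln\frac{2\sqrt{n}}{\delta}}{n}
% where kl(.||.) is the binary KL divergence, \widehat{R}_S(\rho) the
% empirical risk of the posterior, and R_D(\rho) its population risk.
```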
Deep learning has achieved remarkable success in many machine learning tasks such as image classification, speech recognition, and game playing. However, these breakthroughs …
F Hellström, G Durisi - Advances in Neural Information …, 2022 - proceedings.neurips.cc
Recent work has established that the conditional mutual information (CMI) framework of Steinke and Zakynthinou (2020) is expressive enough to capture generalization guarantees …
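For context, the Steinke–Zakynthinou (2020) CMI bound referenced here is usually stated as follows (a standard form; symbols assumed, not from the snippet):

```latex
% Draw a "supersample" \tilde{Z} of 2n i.i.d. points and U \in \{0,1\}^n
% selecting which half the algorithm A trains on; W = A(\tilde{Z}_U) is
% the learned hypothesis. Then the expected generalization gap satisfies
\big|\,\mathbb{E}[\mathrm{gen}(A)]\,\big|
  \;\le\; \sqrt{\frac{2\, I(W; U \mid \tilde{Z})}{n}}
% The conditioning on \tilde{Z} keeps the mutual information finite even
% for deterministic algorithms, unlike unconditional input-output MI bounds.
```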
L Chen, T Chen - International Conference on Artificial …, 2022 - proceedings.mlr.press
Meta-learning aims to learn a model that can quickly adapt to unseen tasks. Widely used meta-learning methods include model-agnostic meta-learning (MAML), implicit MAML …
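For readers unfamiliar with MAML's inner/outer-loop structure, here is a minimal first-order sketch (FOMAML) on toy linear-regression tasks. The task distribution, learning rates, and all function names are illustrative assumptions, not taken from the cited paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def loss_and_grad(w, X, y):
    """Squared-error loss and its gradient for a linear model y ~ X @ w."""
    resid = X @ w - y
    return float(resid @ resid) / len(y), 2 * X.T @ resid / len(y)

def make_task(n=20):
    """A toy task: noiseless regression with slope drawn near a shared 2.0."""
    slope = 2.0 + 0.1 * rng.normal()
    X = rng.normal(size=(2 * n, 1))
    y = slope * X[:, 0]
    return X[:n], y[:n], X[n:], y[n:]   # support / query split

def maml_step(w, tasks, inner_lr=0.05, outer_lr=0.1):
    """One meta-update: adapt per task with a single inner gradient step,
    then step the meta-parameters against the post-adaptation (query-set)
    gradient -- the first-order MAML approximation."""
    meta_grad = np.zeros_like(w)
    for X_tr, y_tr, X_val, y_val in tasks:
        _, g = loss_and_grad(w, X_tr, y_tr)
        w_adapted = w - inner_lr * g            # inner-loop adaptation
        _, g_val = loss_and_grad(w_adapted, X_val, y_val)
        meta_grad += g_val                      # first-order outer gradient
    return w - outer_lr * meta_grad / len(tasks)

# Meta-train: the meta-parameters drift toward the shared slope, so a
# single inner step adapts well on any task from the distribution.
w = np.zeros(1)
for _ in range(200):
    w = maml_step(w, [make_task() for _ in range(8)])
```

Full MAML would differentiate through the inner update (a second-order term); the first-order variant above drops that Hessian-vector product, which is the common cheap approximation.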
Z Wang, Y Mao - arXiv preprint arXiv:2210.00706, 2022 - arxiv.org
This paper uses information-theoretic tools to analyze the generalization error in unsupervised domain adaptation (UDA). We present novel upper bounds for two notions of …
Bernstein's condition is a key assumption that guarantees fast rates in machine learning. For example, the Gibbs algorithm with prior $\pi$ has an excess risk in $O(d_\pi/n)$, as …
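For context, one standard statement of Bernstein's condition (constants and notation assumed here, not taken from the snippet):

```latex
% There exists a constant B >= 1 such that, for every hypothesis f in
% the class, with \ell_f the loss of f and f^* the risk minimizer,
\mathbb{E}\big[(\ell_f - \ell_{f^*})^2\big]
  \;\le\; B\,\mathbb{E}\big[\ell_f - \ell_{f^*}\big]
% Controlling the loss variance by the excess risk in this way is what
% upgrades slow O(\sqrt{d/n}) rates to fast O(d/n) rates.
```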
J Shu, D Meng, Z Xu - The Journal of Machine Learning Research, 2023 - dl.acm.org
Meta-learning has recently attracted much attention in the machine learning community. In contrast to conventional machine learning, which aims to learn inherent prediction rules to …
PAC-Bayes has recently re-emerged as an effective framework for deriving principled learning algorithms with tight performance guarantees. However, applications of …
ST Jose, O Simeone - 2021 IEEE International Symposium on …, 2021 - ieeexplore.ieee.org
Meta-learning aims to optimize the hyperparameters of a model class or training algorithm from the observation of data from a number of related tasks. Following the setting of Baxter …