An information-theoretic analysis of the impact of task similarity on meta-learning

ST Jose, O Simeone - 2021 IEEE International Symposium on …, 2021 - ieeexplore.ieee.org
Meta-learning aims at optimizing the hyperparameters of a model class or training algorithm
from the observation of data from a number of related tasks. Following the setting of Baxter …
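
For orientation, the setting referred to here can be sketched as follows (a standard formalization of Baxter's two-level setup; the notation is ours, not the paper's). A task environment is a distribution $P_\mathcal{T}$ over tasks; the meta-learner observes $N$ datasets $S_1, \dots, S_N$, where $\tau_i \sim P_\mathcal{T}$ and $S_i \sim P_{\tau_i}^m$, and seeks hyperparameters $u$ for which the base learner $A(\cdot\,; u)$ attains a small meta-risk:

$$ R(u) = \mathbb{E}_{\tau \sim P_\mathcal{T}} \, \mathbb{E}_{S \sim P_\tau^m} \, \mathbb{E}_{Z \sim P_\tau} \big[ \ell(A(S; u), Z) \big]. $$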

Conditional mutual information-based generalization bound for meta learning

A Rezazadeh, ST Jose, G Durisi… - 2021 IEEE International …, 2021 - ieeexplore.ieee.org
Meta-learning optimizes an inductive bias—typically in the form of the hyperparameters of a
base-learning algorithm—by observing data from a finite number of related tasks. This paper …
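
As background, the single-task form of the CMI framework that such meta-learning bounds extend can be stated as follows (Steinke and Zakynthinou's bound; the paper's two-level version is more involved). Draw a supersample $\tilde{Z} \in \mathcal{Z}^{n \times 2}$ i.i.d. from the data distribution and independent selectors $U \sim \mathrm{Unif}(\{0,1\}^n)$, and train on the entries $\tilde{Z}_{i, U_i}$. For a loss bounded in $[0,1]$, the expected generalization gap of the output hypothesis $W$ satisfies

$$ \big| \mathbb{E}[\mathrm{gen}(W)] \big| \le \sqrt{ \frac{2 \, I(W; U \mid \tilde{Z})}{n} }. $$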

Transfer meta-learning: Information-theoretic bounds and information meta-risk minimization

ST Jose, O Simeone, G Durisi - IEEE Transactions on …, 2021 - ieeexplore.ieee.org
Meta-learning automatically infers an inductive bias by observing data from a number of
related tasks. The inductive bias is encoded by hyperparameters that determine aspects of …
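
One minimal way to formalize this transfer setting (a hedged sketch in our notation, not necessarily the paper's exact definitions): the meta-training tasks are drawn from a source environment $P_\mathcal{T}$, while the meta-test task comes from a different target environment $P'_\mathcal{T}$, so the quantity of interest is the transfer meta-risk

$$ R'(u) = \mathbb{E}_{\tau \sim P'_\mathcal{T}} \, \mathbb{E}_{S \sim P_\tau^m} \, \mathbb{E}_{Z \sim P_\tau} \big[ \ell(A(S; u), Z) \big], $$

and one then expects bounds on $R'(u)$ to pay an additional penalty growing with some divergence between $P_\mathcal{T}$ and $P'_\mathcal{T}$.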

Task relatedness-based generalization bounds for meta learning

J Guan, Z Lu - International Conference on Learning …, 2022 - openreview.net
Supposing the $n$ training tasks and the new task are sampled from the same
environment, traditional meta-learning theory derives an error bound on the expected loss …

Nonlinear meta-learning can guarantee faster rates

D Meunier, Z Li, A Gretton, S Kpotufe - arXiv preprint arXiv:2307.10870, 2023 - arxiv.org
Many recent theoretical works on meta-learning aim to achieve guarantees in
leveraging similar representational structures from related tasks towards simplifying a target …
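
The shared representational structure is commonly modeled as a common feature map composed with task-specific heads (a standard setup sketched here for intuition; the paper's precise assumptions may differ):

$$ f_t = g_t \circ h, \qquad t = 1, \dots, N, $$

where the (possibly nonlinear) representation $h$ is shared across tasks and only the low-complexity head $g_t$ varies. Once $h$ has been estimated from the $N$ source tasks, only $g_{\mathrm{target}}$ remains to be fit on the target task, which is what enables faster rates than learning the target task from scratch.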

Information-theoretic generalization bounds for meta-learning and applications

ST Jose, O Simeone - Entropy, 2021 - mdpi.com
Meta-learning, or “learning to learn”, refers to techniques that infer an inductive bias from
data corresponding to multiple related tasks with the goal of improving the sample efficiency …
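
The prototypical single-task result underlying this line of work is the mutual-information bound of Xu and Raginsky (2017), which the meta-learning bounds lift to the two-level setting: for a loss that is $\sigma$-sub-Gaussian, a training set $S$ of $n$ i.i.d. samples, and output hypothesis $W$,

$$ \big| \mathbb{E}[\mathrm{gen}(W, S)] \big| \le \sqrt{ \frac{2 \sigma^2 \, I(W; S)}{n} }. $$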

Evaluated CMI bounds for meta learning: Tightness and expressiveness

F Hellström, G Durisi - Advances in Neural Information …, 2022 - proceedings.neurips.cc
Recent work has established that the conditional mutual information (CMI) framework of
Steinke and Zakynthinou (2020) is expressive enough to capture generalization guarantees …
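
Tightness comparisons of this kind rest on the data-processing inequality: conditioned on the supersample $\tilde{Z}$, the evaluated losses $\lambda$ are a function of the model's predictions $f_W(\tilde{Z})$, which are in turn a function of the hypothesis $W$, so (stated here for the single-task case)

$$ I(\lambda; U \mid \tilde{Z}) \;\le\; I(f_W(\tilde{Z}); U \mid \tilde{Z}) \;\le\; I(W; U \mid \tilde{Z}), $$

and bounds built on the leftmost, evaluated quantity can therefore only be tighter.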

A general framework for PAC-Bayes bounds for meta-learning

A Rezazadeh - arXiv preprint arXiv:2206.05454, 2022 - arxiv.org
Meta-learning automatically infers an inductive bias, which includes the hyperparameters of the
base-learning algorithm, by observing data from a finite number of related tasks. This paper …
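
For reference, the single-task PAC-Bayes bound that such frameworks generalize is, in Maurer's form of McAllester's result: for a loss in $[0,1]$, a prior $P$ fixed before seeing the sample $S$ of size $n$, and any posterior $Q$, with probability at least $1 - \delta$,

$$ L(Q) \;\le\; \hat{L}_S(Q) + \sqrt{ \frac{ \mathrm{KL}(Q \,\|\, P) + \ln \frac{2\sqrt{n}}{\delta} }{ 2n } }, $$

where $L(Q)$ and $\hat{L}_S(Q)$ are the population and empirical risks averaged over $Q$. The meta-learning versions add a second level in which the prior $P$ itself is learned.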

Weighted meta-learning

D Cai, R Sheth, L Mackey, N Fusi - arXiv preprint arXiv:2003.09465, 2020 - arxiv.org
Meta-learning leverages related source tasks to learn an initialization that can be quickly
fine-tuned to a target task with limited labeled examples. However, many popular meta …
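
As an illustration of the initialization-based setup described here, the sketch below implements a weighted, first-order (Reptile-style) outer loop in NumPy. The per-task weights alpha are a hypothetical uniform choice, purely for illustration; the paper's actual weighting scheme and algorithm are not reproduced here.

import numpy as np

rng = np.random.default_rng(0)

def make_task(theta_true, n=20, noise=0.1):
    """Linear-regression task: y = X @ theta_true + noise."""
    X = rng.normal(size=(n, theta_true.size))
    y = X @ theta_true + noise * rng.normal(size=n)
    return X, y

def grad(theta, X, y):
    """Gradient of the mean-squared error 0.5 * mean((X @ theta - y)**2)."""
    return X.T @ (X @ theta - y) / len(y)

def inner_adapt(theta0, X, y, lr=0.1, steps=1):
    """Fine-tune the shared initialization on one task's data."""
    theta = theta0.copy()
    for _ in range(steps):
        theta -= lr * grad(theta, X, y)
    return theta

# Source tasks drawn around random parameters; alpha are hypothetical
# per-task weights (uniform here, purely for illustration).
d, n_tasks = 5, 8
tasks = [make_task(rng.normal(size=d)) for _ in range(n_tasks)]
alpha = np.full(n_tasks, 1.0 / n_tasks)

theta0 = np.zeros(d)            # shared initialization (the meta-parameter)
meta_lr = 0.05
for _ in range(200):            # first-order outer loop
    update = np.zeros(d)
    for a, (X, y) in zip(alpha, tasks):
        adapted = inner_adapt(theta0, X, y)
        update += a * (adapted - theta0)   # weighted pull toward each task
    theta0 += meta_lr * update

# Fast adaptation to a new target task from the learned initialization.
X_t, y_t = make_task(rng.normal(size=d), n=5)
theta_target = inner_adapt(theta0, X_t, y_t, steps=5)

The weights determine how strongly each source task pulls the shared initialization; the uniform choice above is exactly the naive baseline that a relevance-aware weighting would improve on.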

A unified view on PAC-Bayes bounds for meta-learning

A Rezazadeh - International Conference on Machine …, 2022 - proceedings.mlr.press
Meta-learning automatically infers an inductive bias, which includes the hyperparameters of the
base-learning algorithm, by observing data from a finite number of related tasks. This paper …
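
Schematically, two-level bounds of this kind (going back to Pentina and Lampert, 2014) combine an environment-level and a within-task complexity term. One representative shape, stated loosely with logarithmic and constant factors suppressed:

$$ \mathcal{R}(\mathcal{Q}) \;\lesssim\; \widehat{\mathcal{R}}(\mathcal{Q}) + \sqrt{ \frac{ \mathrm{KL}(\mathcal{Q} \,\|\, \mathcal{P}) }{ n } } + \frac{1}{n} \sum_{i=1}^{n} \sqrt{ \frac{ \mathrm{KL}(\mathcal{Q} \,\|\, \mathcal{P}) + \mathbb{E}_{P \sim \mathcal{Q}} \, \mathrm{KL}(Q_i \,\|\, P) }{ m_i } }, $$

where $\mathcal{Q}$ is a hyper-posterior over base-learner priors $P$, $\mathcal{P}$ a hyper-prior, and $m_i$ the sample size of task $i$.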