EvoGrad: Efficient gradient-based meta-learning and hyperparameter optimization

O Bohdal, Y Yang… - Advances in neural …, 2021 - proceedings.neurips.cc
Gradient-based meta-learning and hyperparameter optimization have seen significant
progress recently, enabling practical end-to-end training of neural networks together with …

Meta-learning in neural networks: A survey

T Hospedales, A Antoniou, P Micaelli… - IEEE transactions on …, 2021 - ieeexplore.ieee.org
The field of meta-learning, or learning-to-learn, has seen a dramatic rise in interest in recent
years. Contrary to conventional approaches to AI where tasks are solved from scratch using …

Gradient-based meta-learning with learned layerwise metric and subspace

Y Lee, S Choi - International Conference on Machine …, 2018 - proceedings.mlr.press
Gradient-based meta-learning methods leverage gradient descent to learn the
commonalities among various tasks. While previous such methods have been successful in …

Convergence of meta-learning with task-specific adaptation over partial parameters

K Ji, JD Lee, Y Liang, HV Poor - Advances in Neural …, 2020 - proceedings.neurips.cc
Although model-agnostic meta-learning (MAML) is a very successful algorithm in meta-
learning practice, it can have high computational cost because it updates all model …

Meta learning and its applications to natural language processing

H Lee, NT Vu, SW Li - Proceedings of the 59th Annual Meeting of …, 2021 - aclanthology.org
Deep learning based natural language processing (NLP) has become the mainstream of
research in recent years and significantly outperforms conventional methods. However …

Metafun: Meta-learning with iterative functional updates

J Xu, JF Ton, H Kim, A Kosiorek… - … on Machine Learning, 2020 - proceedings.mlr.press
We develop a functional encoder-decoder approach to supervised meta-learning, where
labeled data is encoded into an infinite-dimensional functional representation rather than a …

Regularizing meta-learning via gradient dropout

HY Tseng, YW Chen, YH Tsai, S Liu… - Proceedings of the …, 2020 - openaccess.thecvf.com
With the growing attention on learning-to-learn new tasks using only a few examples, meta-
learning has been widely used in numerous problems such as few-shot classification …

SHOT: suppressing the Hessian along the optimization trajectory for gradient-based meta-learning

JH Lee, J Yoo, N Kwak - Advances in Neural Information …, 2023 - proceedings.neurips.cc
In this paper, we hypothesize that gradient-based meta-learning (GBML) implicitly
suppresses the Hessian along the optimization trajectory in the inner loop. Based on this …

Towards well-generalizing meta-learning via adversarial task augmentation

H Wang, H Mai, Y Gong, ZH Deng - Artificial Intelligence, 2023 - Elsevier
Meta-learning aims to use the knowledge from previous tasks to facilitate the learning of
novel tasks. Many meta-learning models elaborately design various task-shared inductive …

Revisiting meta-learning as supervised learning

WL Chao, HJ Ye, DC Zhan, M Campbell… - arXiv preprint arXiv …, 2020 - arxiv.org
Recent years have witnessed an abundance of new publications and approaches on meta-
learning. This community-wide enthusiasm has sparked great insights but has also created …