When MAML can adapt fast and how to assist when it cannot

S Arnold, S Iqbal, F Sha - International conference on …, 2021 - proceedings.mlr.press
Model-Agnostic Meta-Learning (MAML) and its variants have achieved success in
meta-learning tasks on many datasets and settings. Nonetheless, we have just started to …

Convergence of meta-learning with task-specific adaptation over partial parameters

K Ji, JD Lee, Y Liang, HV Poor - Advances in Neural …, 2020 - proceedings.neurips.cc
Although model-agnostic meta-learning (MAML) is a very successful algorithm in
meta-learning practice, it can have high computational cost because it updates all model …
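The computational cost noted above comes from MAML's two nested loops: an inner loop that adapts all parameters to each task, and an outer loop that updates the shared initialization. A minimal first-order sketch on scalar linear-regression tasks illustrates the structure; all names (`alpha`, `beta`, the task family `y = a * x`) are illustrative assumptions, not any cited paper's setup.

```python
import random

# First-order MAML sketch on scalar tasks y = a * x.
# Inner loop: one gradient step adapts w to a sampled task's support data.
# Outer loop: the initialization w0 is updated on the adapted model's
# query loss (first-order: we ignore the derivative of w_adapted w.r.t. w0).

random.seed(0)
alpha, beta = 0.1, 0.01   # inner / outer learning rates (illustrative)
w0 = 0.0                  # meta-learned initialization

def grad(w, x, y):
    # d/dw of the squared error (w*x - y)^2
    return 2 * (w * x - y) * x

for step in range(2000):
    a = random.uniform(-2, 2)            # sample a task: y = a * x
    x_s, y_s = 1.0, a * 1.0              # support example
    x_q, y_q = 0.5, a * 0.5              # query example
    # inner-loop adaptation (the step full MAML takes over ALL parameters)
    w_adapted = w0 - alpha * grad(w0, x_s, y_s)
    # first-order outer update on the query loss of the adapted model
    w0 -= beta * grad(w_adapted, x_q, y_q)
```

After meta-training, one inner step from `w0` moves the model toward any task's optimum faster than it would from an arbitrary initialization; partial-parameter variants such as the one studied in the entry above adapt only a subset of parameters in the inner loop to cut this cost.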

Understanding transfer learning and gradient-based meta-learning techniques

M Huisman, A Plaat, JN van Rijn - Machine Learning, 2024 - Springer
Deep neural networks can yield good performance on various tasks but often require large
amounts of data to train them. Meta-learning received considerable attention as one …

On modulating the gradient for meta-learning

C Simon, P Koniusz, R Nock, M Harandi - Computer Vision–ECCV 2020 …, 2020 - Springer
Inspired by optimization techniques, we propose a novel meta-learning algorithm with
gradient modulation to encourage fast-adaptation of neural networks in the absence of …

Adaptation: Blessing or Curse for Higher-way Meta-learning

A Aimen, S Sidheekh, B Ladrecha… - IEEE Transactions …, 2023 - ieeexplore.ieee.org
The prevailing literature typically assesses the effectiveness of meta-learning (ML)
approaches on tasks that involve no more than 20 classes. However, we challenge this …

MAML is a noisy contrastive learner in classification

CH Kao, WC Chiu, PY Chen - arXiv preprint arXiv:2106.15367, 2021 - arxiv.org
Model-agnostic meta-learning (MAML) is one of the most popular and widely adopted
meta-learning algorithms, achieving remarkable success in various learning problems. Yet, with …

Provable generalization of overparameterized meta-learning trained with SGD

Y Huang, Y Liang, L Huang - Advances in Neural …, 2022 - proceedings.neurips.cc
Despite the empirical success of deep meta-learning, theoretical understanding of
overparameterized meta-learning is still limited. This paper studies the generalization of a …

Meta-learning without memorization

M Yin, G Tucker, M Zhou, S Levine, C Finn - arXiv preprint arXiv …, 2019 - arxiv.org
The ability to learn new concepts with small amounts of data is a critical aspect of
intelligence that has proven challenging for deep learning methods. Meta-learning has …

Task-robust model-agnostic meta-learning

L Collins, A Mokhtari… - Advances in Neural …, 2020 - proceedings.neurips.cc
Meta-learning methods have shown an impressive ability to train models that rapidly learn
new tasks. However, these methods only aim to perform well in expectation over tasks …

Meta-learning with neural tangent kernels

Y Zhou, Z Wang, J Xian, C Chen, J Xu - arXiv preprint arXiv:2102.03909, 2021 - arxiv.org
Model-Agnostic Meta-Learning (MAML) has emerged as a standard framework for
meta-learning, where a meta-model is learned with the ability to adapt quickly to new tasks …