A comprehensive survey on regularization strategies in machine learning

Y Tian, Y Zhang - Information Fusion, 2022 - Elsevier
In machine learning, a more complicated model is not necessarily better. Good generalization
ability means that the model not only performs well on the training data set, but also can …

Artificial neural networks for neuroscientists: a primer

GR Yang, XJ Wang - Neuron, 2020 - cell.com
Artificial neural networks (ANNs) are essential tools in machine learning that have drawn
increasing attention in neuroscience. Besides offering powerful techniques for data analysis …

Meta-learning in neural networks: A survey

T Hospedales, A Antoniou, P Micaelli… - IEEE transactions on …, 2021 - ieeexplore.ieee.org
The field of meta-learning, or learning-to-learn, has seen a dramatic rise in interest in recent
years. Contrary to conventional approaches to AI where tasks are solved from scratch using …

Deepemd: Few-shot image classification with differentiable earth mover's distance and structured classifiers

C Zhang, Y Cai, G Lin, C Shen - Proceedings of the IEEE …, 2020 - openaccess.thecvf.com
In this paper, we address the few-shot classification task from a new perspective of optimal
matching between image regions. We adopt the Earth Mover's Distance (EMD) as a metric to …

Few-shot learning via embedding adaptation with set-to-set functions

HJ Ye, H Hu, DC Zhan, F Sha - Proceedings of the IEEE …, 2020 - openaccess.thecvf.com
Learning with limited data is a key challenge for visual recognition. Many few-shot learning
methods address this challenge by learning an instance embedding function from seen …

Automl-zero: Evolving machine learning algorithms from scratch

E Real, C Liang, D So, Q Le - International conference on …, 2020 - proceedings.mlr.press
Machine learning research has advanced in multiple aspects, including model
structures and learning methods. The effort to automate such research, known as AutoML …

Meta-learning representations for continual learning

K Javed, M White - Advances in neural information …, 2019 - proceedings.neurips.cc

Adaptive risk minimization: Learning to adapt to domain shift

M Zhang, H Marklund, N Dhawan… - Advances in …, 2021 - proceedings.neurips.cc
A fundamental assumption of most machine learning algorithms is that the training and test
data are drawn from the same underlying distribution. However, this assumption is violated …

Meta-learning with warped gradient descent

S Flennerhag, AA Rusu, R Pascanu, F Visin… - arXiv preprint arXiv …, 2019 - arxiv.org
Learning an efficient update rule from data that promotes rapid learning of new tasks from
the same distribution remains an open problem in meta-learning. Typically, previous works …

On the convergence theory of gradient-based model-agnostic meta-learning algorithms

A Fallah, A Mokhtari… - … Conference on Artificial …, 2020 - proceedings.mlr.press
We study the convergence of a class of gradient-based Model-Agnostic Meta-Learning
(MAML) methods and characterize their overall complexity as well as their best achievable …