S Yang, L Liu, M Xu - arXiv preprint arXiv:2101.06395, 2021 - arxiv.org
Learning from a limited number of samples is challenging since the learned model can easily overfit to the biased distribution formed by only a few training …
Existing few-shot learning (FSL) methods rely on training with a large labeled dataset, which prevents them from leveraging abundant unlabeled data. From an information-theoretic …
Few-shot or one-shot learning of classifiers requires a significant inductive bias towards the type of task to be learned. One way to acquire this is by meta-learning on tasks similar to the …
J Rajendran, A Irpan, E Jang - Advances in Neural …, 2020 - proceedings.neurips.cc
Meta-learning algorithms aim to learn two components: a model that predicts targets for a task, and a base learner that updates that model when given examples from a new task. This …
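The snippet above describes meta-learning as two learned components: a model that predicts targets for a task, and a base learner that updates that model from a few examples of a new task. As a hedged, minimal sketch of that structure (not this paper's specific method), the NumPy code below assumes a first-order, MAML-style setup on hypothetical toy regression tasks: the "model" is a linear predictor, the "base learner" is one inner gradient step on a task's support set, and the shared meta-parameters are then updated with the query-set gradient at the adapted weights.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_task(n_support=5, n_query=10):
    """Hypothetical toy task: 1-D linear regression y = a*x + b with task-specific a, b."""
    a, b = rng.uniform(-2, 2), rng.uniform(-1, 1)
    def sample(n):
        x = rng.uniform(-1, 1, (n, 1))
        return x, a * x + b
    return sample(n_support), sample(n_query)

def grad_mse(w, x, y):
    """Gradient of mean-squared error for the linear 'model' component."""
    X = np.hstack([x, np.ones_like(x)])        # bias-augmented inputs
    return 2 * X.T @ (X @ w - y) / len(x)

def adapt(w, support, inner_lr=0.1):
    """'Base learner' component: one gradient step on the new task's support set."""
    x_s, y_s = support
    return w - inner_lr * grad_mse(w, x_s, y_s)

w_meta = np.zeros((2, 1))                      # meta-parameters shared across tasks
meta_lr = 0.01
for step in range(2000):
    support, query = make_task()
    w_task = adapt(w_meta, support)            # base learner updates the model
    x_q, y_q = query
    # First-order meta-update: query-set gradient evaluated at the adapted weights.
    w_meta -= meta_lr * grad_mse(w_task, x_q, y_q)
```

The first-order update here is an assumption made to keep the sketch short; full MAML would instead differentiate the query loss through the inner adaptation step.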
Conventional image classifiers are trained by randomly sampling mini-batches of images. To achieve state-of-the-art performance, practitioners use sophisticated data augmentation …
Supervised machine learning has several drawbacks that make it difficult to use in many situations. These include heavy reliance on massive training data, limited …
Pre-trained vision-language models have inspired much research on few-shot learning. However, with only a few training images, two crucial problems arise: (1) the visual …
Meta-learning algorithms use past experience to learn to quickly solve new tasks. In the context of reinforcement learning, meta-learning algorithms acquire reinforcement learning …
G Bukchin, E Schwartz, K Saenko… - Proceedings of the …, 2021 - openaccess.thecvf.com
Few-shot learning methods offer pre-training techniques optimized for easier later adaptation of the model to new classes (unseen during training) using one or a few …
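One common way to instantiate this "pre-train, then adapt with one or a few labeled examples" recipe, offered only as an illustrative sketch and not necessarily the method of the paper above, is nearest-centroid (prototype) classification on top of frozen pre-trained features. The function name and the random stand-in features below are hypothetical.

```python
import numpy as np

def prototype_classify(support_feats, support_labels, query_feats):
    """Nearest-centroid adaptation on top of frozen, pre-trained features.

    support_feats : (n_support, d) embeddings of the one/few labeled examples per new class
    support_labels: (n_support,)   integer ids of the classes unseen during pre-training
    query_feats   : (n_query, d)   embeddings of the images to classify
    """
    classes = np.unique(support_labels)
    # One prototype per new class: the mean embedding of its support examples.
    prototypes = np.stack([support_feats[support_labels == c].mean(axis=0) for c in classes])
    # Assign each query to the class whose prototype is closest in feature space.
    dists = ((query_feats[:, None, :] - prototypes[None, :, :]) ** 2).sum(axis=-1)
    return classes[dists.argmin(axis=1)]

# Usage with random stand-in features (a real pipeline would use a pre-trained encoder):
rng = np.random.default_rng(0)
feats = rng.normal(size=(6, 16))               # 2 new classes x 3 support examples each
labels = np.array([0, 0, 0, 1, 1, 1])
queries = rng.normal(size=(4, 16))
print(prototype_classify(feats, labels, queries))
```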