CoSCL: Cooperation of small continual learners is stronger than a big one

L Wang, X Zhang, Q Li, J Zhu, Y Zhong - European Conference on …, 2022 - Springer
Continual learning requires incremental compatibility with a sequence of tasks. However,
the design of model architecture remains an open question: In general, learning all tasks …

Continual vision-language representation learning with off-diagonal information

Z Ni, L Wei, S Tang, Y Zhuang… - … Conference on Machine …, 2023 - proceedings.mlr.press
Large-scale multi-modal contrastive learning frameworks like CLIP typically require a large
amount of image-text samples for training. However, these samples are always collected …

How Efficient Are Today's Continual Learning Algorithms?

MY Harun, J Gallardo, TL Hayes… - Proceedings of the …, 2023 - openaccess.thecvf.com
Supervised continual learning involves updating a deep neural network (DNN) from an ever-
growing stream of labeled data. While most work has focused on overcoming catastrophic …

Continual learning: Applications and the road forward

E Verwimp, S Ben-David, M Bethge, A Cossu… - arXiv preprint arXiv …, 2023 - arxiv.org
Continual learning is a sub-field of machine learning that aims to allow machine learning
models to continuously learn on new data by accumulating knowledge without forgetting …

Loss decoupling for task-agnostic continual learning

YS Liang, WJ Li - Advances in Neural Information …, 2024 - proceedings.neurips.cc
Continual learning requires the model to learn multiple tasks in a sequential order. To
perform continual learning, the model must possess the abilities to maintain performance on …

MetaZSCIL: A meta-learning approach for generalized zero-shot class incremental learning

Y Wu, T Liang, S Feng, Y Jin, G Lyu, H Fei… - Proceedings of the AAAI …, 2023 - ojs.aaai.org
Generalized zero-shot learning (GZSL) aims to recognize samples whose categories may
not have been seen at training. Standard GZSL cannot handle dynamic addition of new …

ConTinTin: Continual learning from task instructions

W Yin, J Li, C Xiong - arXiv preprint arXiv:2203.08512, 2022 - arxiv.org
The mainstream machine learning paradigms for NLP often work with two underlying
presumptions. First, the target task is predefined and static; a system merely needs to learn …

SATS: Self-attention transfer for continual semantic segmentation

Y Qiu, Y Shen, Z Sun, Y Zheng, X Chang, W Zheng… - Pattern Recognition, 2023 - Elsevier
Continually learning to segment more and more types of image regions is a desired
capability for many intelligent systems. However, such continual semantic segmentation …

PILOT: A pre-trained model-based continual learning toolbox

HL Sun, DW Zhou, HJ Ye, DC Zhan - arXiv preprint arXiv:2309.07117, 2023 - arxiv.org
While traditional machine learning can effectively tackle a wide range of problems, it
primarily operates within a closed-world setting, which presents limitations when dealing …

Towards realistic evaluation of industrial continual learning scenarios with an emphasis on energy consumption and computational footprint

V Chavan, P Koch, M Schlüter… - Proceedings of the …, 2023 - openaccess.thecvf.com
Incremental Learning (IL) aims to develop Machine Learning (ML) models that can learn
from continuous streams of data and mitigate catastrophic forgetting. We analyse the current …