This work investigates the entanglement between Continual Learning (CL) and Transfer Learning (TL). In particular, we shed light on the widespread application of network …
L Wang, K Yang, C Li, L Hong… - Proceedings of the …, 2021 - openaccess.thecvf.com
Continual learning usually assumes the incoming data are fully labeled, which might not be applicable in real applications. In this work, we consider semi-supervised continual learning …
The staple of human intelligence is the capability of acquiring knowledge in a continuous fashion. In stark contrast, Deep Networks forget catastrophically and, for this reason, the sub …
Q Yan, D Gong, Y Liu… - Proceedings of the …, 2022 - openaccess.thecvf.com
Continual Learning (CL) methods aim to enable machine learning models to learn new tasks without catastrophic forgetting of those that have been previously mastered. Existing …
In continual learning (CL), a learner is faced with a sequence of tasks, arriving one after the other, and the goal is to remember all the tasks once the continual learning experience is …
While recent continual learning methods largely alleviate the catastrophic forgetting problem on toy-sized datasets, some issues remain to be tackled to apply them to real-world problem …
S Farquhar, Y Gal - arXiv preprint arXiv:1805.09733, 2018 - arxiv.org
Experiments used in current continual learning research do not faithfully assess fundamental challenges of learning continually. Instead of assessing performance on …
We introduce Domain-Adaptive Prompt (DAP), a novel method for continual learning using Vision Transformers (ViT). Prompt-based continual learning has recently …
Z Wang, L Shen, L Fang, Q Suo… - … on machine learning, 2022 - proceedings.mlr.press
Task-free continual learning (CL) aims to learn a non-stationary data stream without explicit task definitions and not forget previous knowledge. The widely adopted memory replay …
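The memory replay mentioned above is typically implemented as a small fixed-size buffer of past examples that is mixed into each new training batch for rehearsal. A minimal sketch using reservoir sampling, a common buffer-update rule in the replay literature (the `ReplayBuffer` class and its method names are illustrative, not taken from any of the works above):

```python
import random

class ReplayBuffer:
    """Fixed-size buffer of past examples, maintained with reservoir
    sampling so every example seen so far has an equal probability of
    being stored, without needing task boundaries."""

    def __init__(self, capacity, seed=0):
        self.capacity = capacity
        self.buffer = []
        self.num_seen = 0
        self.rng = random.Random(seed)

    def add(self, example):
        """Offer one incoming example to the buffer."""
        self.num_seen += 1
        if len(self.buffer) < self.capacity:
            self.buffer.append(example)
        else:
            # Keep the new example with probability capacity / num_seen,
            # replacing a uniformly chosen stored one.
            idx = self.rng.randrange(self.num_seen)
            if idx < self.capacity:
                self.buffer[idx] = example

    def sample(self, batch_size):
        """Draw a rehearsal batch to interleave with the current batch."""
        k = min(batch_size, len(self.buffer))
        return self.rng.sample(self.buffer, k)

# Stream 100 examples through a 10-slot buffer, then draw a rehearsal batch.
buf = ReplayBuffer(capacity=10)
for x in range(100):
    buf.add(x)
rehearsal = buf.sample(4)
```

Because the update rule depends only on a running count of examples seen, it needs no explicit task definitions, which is what makes this style of buffer usable in the task-free setting described above.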