DyTox: Transformers for continual learning with dynamic token expansion

A Douillard, A Ramé, G Couairon… - Proceedings of the …, 2022 - openaccess.thecvf.com
Deep network architectures struggle to continually learn new tasks without forgetting the
previous ones. A recent trend indicates that dynamic architectures based on an expansion …
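The expansion in DyTox takes the form of learnable task tokens queried against a shared encoder. A minimal sketch of that idea, assuming one learnable token per task attends over shared patch tokens; the dimensions, the shared attention block, and the per-task heads are illustrative, not the paper's exact architecture:

```python
import torch
import torch.nn as nn

class TaskTokenHead(nn.Module):
    """Sketch: one learnable token per task attends over shared patch
    tokens to produce a task-specific embedding (illustrative only)."""

    def __init__(self, dim: int, num_heads: int = 4):
        super().__init__()
        self.dim = dim
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.task_tokens = nn.ParameterList()   # grows by one token per task
        self.heads = nn.ModuleList()            # one classifier per task

    def add_task(self, num_classes: int):
        self.task_tokens.append(nn.Parameter(torch.zeros(1, 1, self.dim)))
        self.heads.append(nn.Linear(self.dim, num_classes))

    def forward(self, patch_tokens: torch.Tensor) -> list[torch.Tensor]:
        # patch_tokens: (batch, seq_len, dim) from a shared transformer encoder
        logits = []
        for token, head in zip(self.task_tokens, self.heads):
            query = token.expand(patch_tokens.size(0), -1, -1)
            emb, _ = self.attn(query, patch_tokens, patch_tokens)
            logits.append(head(emb.squeeze(1)))
        return logits

head = TaskTokenHead(dim=192)
head.add_task(num_classes=10)              # task 1
head.add_task(num_classes=10)              # task 2
patches = torch.randn(8, 196, 192)         # stand-in for encoder output
print([l.shape for l in head(patches)])    # two (8, 10) logit tensors
```

Only one token and one small head are added per task, so parameter growth stays negligible compared with duplicating the encoder.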

Layerwise optimization by gradient decomposition for continual learning

S Tang, D Chen, J Zhu, S Yu… - Proceedings of the …, 2021 - openaccess.thecvf.com
Deep neural networks achieve state-of-the-art and sometimes super-human performance
across a variety of domains. However, when learning tasks sequentially, the networks easily …
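The decomposition named in the title can be illustrated generically: given reference gradients from earlier tasks, a new task's gradient splits into a component inside their span (shared) and an orthogonal remainder (task-specific). A hedged sketch of that split, not the paper's full layerwise procedure; the QR-based projection and dimensions are illustrative:

```python
import torch

def decompose_gradient(g: torch.Tensor, old_grads: torch.Tensor):
    """Split a flattened gradient g into the component lying in the span
    of reference gradients from earlier tasks and the orthogonal remainder.
    Illustrative only; the paper's layerwise scheme is more involved."""
    # Orthonormal basis of the old-gradient subspace via QR decomposition.
    q, _ = torch.linalg.qr(old_grads.T)    # (dim, k)
    shared = q @ (q.T @ g)                 # projection onto the span
    specific = g - shared                  # orthogonal (task-specific) part
    return shared, specific

dim = 1000
old = torch.randn(3, dim)                  # 3 stored old-task gradients
g = torch.randn(dim)
shared, specific = decompose_gradient(g, old)
assert torch.allclose(shared + specific, g, atol=1e-4)
print(f"|shared|={shared.norm():.3f}, |specific|={specific.norm():.3f}")
```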

A closer look at rehearsal-free continual learning

JS Smith, J Tian, S Halbe, YC Hsu… - Proceedings of the …, 2023 - openaccess.thecvf.com
Continual learning is a setting where machine learning models learn novel concepts from
continuously shifting training data, while simultaneously avoiding degradation of knowledge …

CoSCL: Cooperation of small continual learners is stronger than a big one

L Wang, X Zhang, Q Li, J Zhu, Y Zhong - European Conference on …, 2022 - Springer
Continual learning requires incremental compatibility with a sequence of tasks. However,
the design of the model architecture remains an open question: in general, learning all tasks …
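The architectural idea is to spend a fixed parameter budget on several narrow learners rather than one wide one, and let them cooperate at prediction time. A minimal sketch, assuming simple MLP sub-learners and plain logit averaging; the widths, the learner count, and the averaging rule are illustrative, and the paper's cooperation loss is not shown:

```python
import torch
import torch.nn as nn

class SmallLearnerEnsemble(nn.Module):
    """Sketch of 'several narrow learners instead of one wide one':
    parallel small sub-networks under a shared parameter budget,
    with their predictions averaged (illustrative only)."""

    def __init__(self, in_dim: int, num_classes: int,
                 num_learners: int = 5, width: int = 64):
        super().__init__()
        self.learners = nn.ModuleList(
            nn.Sequential(
                nn.Linear(in_dim, width), nn.ReLU(),
                nn.Linear(width, num_classes),
            )
            for _ in range(num_learners)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Average the logits of all sub-learners (the cooperation step).
        return torch.stack([m(x) for m in self.learners]).mean(dim=0)

model = SmallLearnerEnsemble(in_dim=784, num_classes=10)
print(model(torch.randn(32, 784)).shape)   # torch.Size([32, 10])
```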

Achieving a better stability-plasticity trade-off via auxiliary networks in continual learning

S Kim, L Noci, A Orvieto… - Proceedings of the IEEE …, 2023 - openaccess.thecvf.com
In contrast to the natural capabilities of humans to learn new tasks in a sequential fashion,
neural networks are known to suffer from catastrophic forgetting, where the model's …

Continual learning with filter atom swapping

Z Miao, Z Wang, W Chen, Q Qiu - International Conference on …, 2021 - openreview.net
Continual learning has been widely studied in recent years to address the catastrophic
forgetting of deep neural networks. In this paper, we first enforce a low-rank filter subspace …
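In the low-rank filter subspace, each convolutional filter is a linear combination of a small bank of spatial "atoms", so a new task can be served by swapping in its own atom bank while the mixing coefficients stay shared. A sketch under those assumptions; the atom count, kernel size, and the ParameterDict bookkeeping are illustrative:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AtomConv2d(nn.Module):
    """Sketch of a convolution whose filters are linear combinations of a
    small bank of 3x3 'filter atoms'. Per-task atom banks are swapped in
    while the combination coefficients stay shared (illustrative only)."""

    def __init__(self, in_ch: int, out_ch: int, num_atoms: int = 9):
        super().__init__()
        self.num_atoms = num_atoms
        # Shared coefficients: one mixing weight per (out, in, atom) triple.
        self.coeffs = nn.Parameter(torch.randn(out_ch, in_ch, num_atoms) * 0.1)
        # One atom bank per task, swapped at task boundaries.
        self.atom_banks = nn.ParameterDict()
        self.active_task = None

    def add_task(self, task_id: str):
        self.atom_banks[task_id] = nn.Parameter(
            torch.randn(self.num_atoms, 3, 3) * 0.1)
        self.active_task = task_id

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        atoms = self.atom_banks[self.active_task]    # (num_atoms, 3, 3)
        # Build full filters: (out, in, atom) x (atom, h, w) -> (out, in, h, w)
        weight = torch.einsum("oia,ahw->oihw", self.coeffs, atoms)
        return F.conv2d(x, weight, padding=1)

conv = AtomConv2d(in_ch=3, out_ch=16)
conv.add_task("task1")
print(conv(torch.randn(2, 3, 32, 32)).shape)   # torch.Size([2, 16, 32, 32])
```

Swapping a bank of 9 small atoms per task is far cheaper than storing a full set of filters, which is what keeps the per-task memory footprint low.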

Continual learning with lifelong vision transformer

Z Wang, L Liu, Y Duan, Y Kong… - Proceedings of the IEEE …, 2022 - openaccess.thecvf.com
Continual learning methods aim to train a neural network from sequential data with
streaming labels, alleviating catastrophic forgetting. However, existing methods are based on …

Learning to prompt for continual learning

Z Wang, Z Zhang, CY Lee, H Zhang… - Proceedings of the …, 2022 - openaccess.thecvf.com
The mainstream paradigm behind continual learning has been to adapt the model
parameters to non-stationary data distributions, where catastrophic forgetting is the central …
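Instead of adapting the model parameters, Learning to Prompt keeps the backbone frozen and learns a pool of prompts, selected per input by matching learnable keys against a frozen-feature query. A minimal sketch, assuming cosine-similarity top-k selection; the pool size, prompt length, and the exact matching-loss form are illustrative:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class PromptPool(nn.Module):
    """Sketch of a prompt pool with key-based selection: a frozen backbone
    feature queries learnable keys, and the top-k matching prompts are
    prepended to the token sequence (illustrative only)."""

    def __init__(self, pool_size: int = 10, prompt_len: int = 5,
                 dim: int = 768, top_k: int = 3):
        super().__init__()
        self.keys = nn.Parameter(torch.randn(pool_size, dim))
        self.prompts = nn.Parameter(torch.randn(pool_size, prompt_len, dim))
        self.top_k = top_k

    def forward(self, tokens: torch.Tensor, query: torch.Tensor):
        # query: (batch, dim) feature from a frozen pre-trained encoder.
        sim = F.cosine_similarity(query[:, None, :], self.keys[None], dim=-1)
        idx = sim.topk(self.top_k, dim=-1).indices    # (batch, k)
        chosen = self.prompts[idx]                    # (batch, k, len, dim)
        chosen = chosen.flatten(1, 2)                 # (batch, k*len, dim)
        # Pull selected keys toward their queries (the matching loss term).
        match_loss = (1 - sim.gather(1, idx)).mean()
        return torch.cat([chosen, tokens], dim=1), match_loss

pool = PromptPool()
tokens = torch.randn(4, 196, 768)   # patch embeddings
query = torch.randn(4, 768)         # frozen [CLS] feature as query
out, loss = pool(tokens, query)
print(out.shape)                    # torch.Size([4, 211, 768]); 3*5 + 196
```

Because only the prompts and keys are trained, the frozen backbone cannot drift, which is what sidesteps parameter-level forgetting.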

Training networks in null space of feature covariance for continual learning

S Wang, X Li, J Sun, Z Xu - … of the IEEE/CVF conference on …, 2021 - openaccess.thecvf.com
In the setting of continual learning, a network is trained on a sequence of tasks and suffers
from catastrophic forgetting. To balance the plasticity and stability of the network in continual …
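The core mechanism: updates applied in the null space of the uncentered covariance of features seen on previous tasks leave those tasks' activations unchanged, giving stability, while directions inside that null space remain free for plasticity. A simplified single-layer sketch; the paper applies this layerwise during optimization, and the eigenvalue threshold eps here is illustrative:

```python
import torch

def null_space_projector(features: torch.Tensor,
                         eps: float = 1e-3) -> torch.Tensor:
    """Projector onto the (approximate) null space of the uncentered
    feature covariance from previous tasks. Updates multiplied by it
    leave old-task activations approximately unchanged."""
    cov = features.T @ features / features.size(0)   # (dim, dim), uncentered
    eigvals, eigvecs = torch.linalg.eigh(cov)        # ascending eigenvalues
    null_basis = eigvecs[:, eigvals < eps * eigvals.max()]
    return null_basis @ null_basis.T                 # (dim, dim) projector

# Old-task features span only part of the input space (rank <= 20 in R^64).
feats = torch.randn(500, 20) @ torch.randn(20, 64)
P = null_space_projector(feats)
g = torch.randn(64)                 # candidate update direction
g_safe = P @ g                      # projected update
print((feats @ g_safe).abs().max()) # ~0: old-task features are unaffected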

SLCA: Slow learner with classifier alignment for continual learning on a pre-trained model

G Zhang, L Wang, G Kang… - Proceedings of the …, 2023 - openaccess.thecvf.com
The goal of continual learning is to improve the performance of recognition models on
sequentially arriving data. Although most existing works are established on the …
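The "slow learner" half reduces the backbone's learning rate far below the classifier's, so sequential fine-tuning does not wash out the pre-trained representation. A minimal sketch via optimizer parameter groups; the stand-in modules and learning rates are illustrative, and the classifier-alignment stage, which rectifies the head using class-wise feature statistics, is not shown:

```python
import torch
import torch.nn as nn

# Stand-ins for a pre-trained backbone and a task classifier head.
backbone = nn.Sequential(nn.Linear(768, 768), nn.ReLU())
classifier = nn.Linear(768, 100)

optimizer = torch.optim.SGD(
    [
        {"params": backbone.parameters(), "lr": 1e-4},    # slow: keep pre-training
        {"params": classifier.parameters(), "lr": 1e-2},  # fast: fit new classes
    ],
    momentum=0.9,
)
```

The two-group optimizer is the entire trick for the representation side: the head adapts quickly to each new task while the backbone barely moves.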