Efficient continual learning with modular networks and task-driven priors

T Veniat, L Denoyer, MA Ranzato - arXiv preprint arXiv:2012.12631, 2020 - arxiv.org
Existing literature in Continual Learning (CL) has focused on overcoming catastrophic
forgetting, the inability of the learner to recall how to perform tasks observed in the past …

Achieving forgetting prevention and knowledge transfer in continual learning

Z Ke, B Liu, N Ma, H Xu, L Shu - Advances in Neural …, 2021 - proceedings.neurips.cc
Continual learning (CL) learns a sequence of tasks incrementally with the goal of achieving
two main objectives: overcoming catastrophic forgetting (CF) and encouraging knowledge …

Architecture matters in continual learning

SI Mirzadeh, A Chaudhry, D Yin, T Nguyen… - arXiv preprint arXiv …, 2022 - arxiv.org
A large body of research in continual learning is devoted to overcoming the catastrophic
forgetting of neural networks by designing new algorithms that are robust to the distribution …

Distilled replay: Overcoming forgetting through synthetic samples

A Rosasco, A Carta, A Cossu, V Lomonaco… - … Workshop on Continual …, 2021 - Springer
Replay strategies are Continual Learning techniques which mitigate catastrophic
forgetting by keeping a buffer of patterns from previous experiences, which are interleaved …

Online fast adaptation and knowledge accumulation: a new approach to continual learning

M Caccia, P Rodriguez, O Ostapenko… - arXiv preprint arXiv …, 2020 - arxiv.org
Continual learning studies agents that learn from streams of tasks without forgetting previous
ones while adapting to new ones. Two recent continual-learning scenarios have opened …

Probing representation forgetting in supervised and unsupervised continual learning

MR Davari, N Asadi, S Mudur… - Proceedings of the …, 2022 - openaccess.thecvf.com
Continual Learning (CL) research typically focuses on tackling the phenomenon of
catastrophic forgetting in neural networks. Catastrophic forgetting is associated with an …

BNS: Building network structures dynamically for continual learning

Q Qin, W Hu, H Peng, D Zhao… - Advances in Neural …, 2021 - proceedings.neurips.cc
Continual learning (CL) of a sequence of tasks is often accompanied with the catastrophic
forgetting (CF) problem. Existing research has achieved remarkable results in overcoming …

Helpful or harmful: Inter-task association in continual learning

H Jin, E Kim - European Conference on Computer Vision, 2022 - Springer
When optimizing sequentially incoming tasks, deep neural networks generally suffer from
catastrophic forgetting due to their lack of ability to maintain knowledge from old tasks. This …

Adaptive orthogonal projection for batch and online continual learning

Y Guo, W Hu, D Zhao, B Liu - Proceedings of the AAAI Conference on …, 2022 - ojs.aaai.org
Catastrophic forgetting is a key obstacle to continual learning. One of the state-of-the-art
approaches is orthogonal projection. The idea of this approach is to learn each task by …

Is forgetting less a good inductive bias for forward transfer?

J Chen, T Nguyen, D Gorur, A Chaudhry - arXiv preprint arXiv:2303.08207, 2023 - arxiv.org
One of the main motivations of studying continual learning is that the problem setting allows
a model to accrue knowledge from past tasks to learn new tasks more efficiently. However …