Theory on forgetting and generalization of continual learning

S Lin, P Ju, Y Liang, N Shroff - International Conference on …, 2023 - proceedings.mlr.press
Continual learning (CL), which aims to learn a sequence of tasks, has attracted significant
recent attention. However, most work has focused on the experimental performance of CL …

Toward understanding catastrophic forgetting in continual learning

CV Nguyen, A Achille, M Lam, T Hassner… - arXiv preprint arXiv …, 2019 - arxiv.org
We study the relationship between catastrophic forgetting and properties of task sequences.
In particular, given a sequence of tasks, we would like to understand which properties of this …

Regularization shortcomings for continual learning

T Lesort, A Stoian, D Filliat - arXiv preprint arXiv:1912.03049, 2019 - arxiv.org
In most machine learning algorithms, training data is assumed to be independent and
identically distributed (iid). When this is not the case, the algorithm's performance is …

Probing representation forgetting in supervised and unsupervised continual learning

MR Davari, N Asadi, S Mudur… - Proceedings of the …, 2022 - openaccess.thecvf.com
Continual Learning (CL) research typically focuses on tackling the phenomenon of
catastrophic forgetting in neural networks. Catastrophic forgetting is associated with an …

Does continual learning equally forget all parameters?

H Zhao, T Zhou, G Long, J Jiang… - … on Machine Learning, 2023 - proceedings.mlr.press
Distribution shift (e.g., task or domain shift) in continual learning (CL) usually results in
catastrophic forgetting of previously learned knowledge. Although it can be alleviated by …

Architecture matters in continual learning

SI Mirzadeh, A Chaudhry, D Yin, T Nguyen… - arXiv preprint arXiv …, 2022 - arxiv.org
A large body of research in continual learning is devoted to overcoming the catastrophic
forgetting of neural networks by designing new algorithms that are robust to the distribution …

A neural Dirichlet process mixture model for task-free continual learning

S Lee, J Ha, D Zhang, G Kim - arXiv preprint arXiv:2001.00689, 2020 - arxiv.org
Despite the growing interest in continual learning, most contemporary works have studied
it in a rather restricted setting where tasks are clearly distinguishable, and task …

An investigation of replay-based approaches for continual learning

B Bagus, A Gepperth - 2021 International Joint Conference on …, 2021 - ieeexplore.ieee.org
Continual learning (CL) is a major challenge of machine learning (ML) and describes the
ability to learn several tasks sequentially without catastrophic forgetting (CF). Recent works …

Three scenarios for continual learning

GM Van de Ven, AS Tolias - arXiv preprint arXiv:1904.07734, 2019 - arxiv.org
Standard artificial neural networks suffer from the well-known issue of catastrophic
forgetting, making continual or lifelong learning difficult for machine learning. In recent years …

Helpful or harmful: Inter-task association in continual learning

H Jin, E Kim - European Conference on Computer Vision, 2022 - Springer
When optimizing sequentially incoming tasks, deep neural networks generally suffer from
catastrophic forgetting due to their inability to retain knowledge from old tasks. This …