In most machine learning algorithms, training data is assumed to be independent and identically distributed (iid). When this is not the case, the algorithm's performance is …

T Doan, SI Mirzadeh… - Conference on Lifelong …, 2023 - proceedings.mlr.press
A growing body of research in continual learning focuses on the catastrophic forgetting problem. While many attempts have been made to alleviate this problem, the majority of the …

Continual learning (CL), which aims to learn a sequence of tasks, has attracted significant recent attention. However, most work has focused on the experimental performance of CL …

We formulate the continual learning (CL) problem via dynamic programming and model the trade-off between catastrophic forgetting and generalization as a two-player sequential …

The ability of a model to learn continually can be empirically assessed in different continual learning scenarios. Each scenario defines the constraints and the opportunities of the …

Z Chen, B Liu - Lifelong Machine Learning, 2018 - Springer
In recent years, lifelong learning (LL) has attracted a great deal of attention in the deep learning community, where it is often called continual learning. Though it is well known that …

Standard artificial neural networks suffer from the well-known issue of catastrophic forgetting, making continual or lifelong learning difficult for machine learning. In recent years …

J Knoblauch, H Husain… - … Conference on Machine …, 2020 - proceedings.mlr.press
Continual Learning (CL) algorithms incrementally learn a predictor or representation across multiple sequentially observed tasks. Designing CL algorithms that perform reliably and …

We study the relationship between catastrophic forgetting and properties of task sequences. In particular, given a sequence of tasks, we would like to understand which properties of this …