Deep neural networks (DNNs) often suffer from "catastrophic forgetting" during incremental learning (IL)---an abrupt degradation of performance on the original set of classes when the …
Y Wu, Y Chen, L Wang, Y Ye, Z Liu… - Proceedings of the …, 2019 - openaccess.thecvf.com
Modern machine learning suffers from catastrophic forgetting when learning new classes incrementally. The performance dramatically degrades due to the missing data of old …
Contemporary neural networks are limited in their ability to learn from evolving streams of training data. When trained sequentially on new or evolving tasks, their accuracy drops …
Conventionally, deep neural networks are trained offline, relying on a large dataset prepared in advance. This paradigm is often challenged in real-world applications, e.g., online …
Class Incremental Learning (CIL) aims at learning a classifier in a phase-by-phase manner, in which only data of a subset of the classes are provided at each phase. Previous …
M Kang, J Park, B Han - … of the IEEE/CVF conference on …, 2022 - openaccess.thecvf.com
We present a novel class incremental learning approach based on deep neural networks, which continually learns new tasks with limited memory for storing examples in the previous …
A well-known issue for class-incremental learning is the catastrophic forgetting phenomenon, where the network's recognition performance on old classes degrades …
J He, R Mao, Z Shao, F Zhu - Proceedings of the IEEE/CVF …, 2020 - openaccess.thecvf.com
Modern deep learning approaches have achieved great success in many vision applications by training a model using all available task-specific data. However, there are two major …
X Chen, X Chang - Proceedings of the IEEE/CVF …, 2023 - openaccess.thecvf.com
The rehearsal strategy is widely used to alleviate the catastrophic forgetting problem in class incremental learning (CIL) by preserving limited exemplars from previous tasks. With …
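The rehearsal strategy described above can be illustrated with a minimal sketch: a fixed-size exemplar buffer that keeps a few samples per old class and replays them alongside new-task data. This is a simplified, hypothetical illustration (the class name `ExemplarBuffer` and its uniform per-class subsampling are assumptions, not any cited paper's exact selection method such as herding):

```python
import random

class ExemplarBuffer:
    """Illustrative fixed-capacity rehearsal buffer for class-incremental learning.

    After each task, old exemplars and new samples are pooled and subsampled
    uniformly per class so the total stays within `capacity`. Real methods
    (e.g., herding-based selection) choose exemplars more carefully.
    """

    def __init__(self, capacity):
        self.capacity = capacity
        self.exemplars = []  # list of (x, y) pairs retained from earlier tasks

    def add_task(self, samples):
        # Merge previously stored exemplars with the new task's samples,
        # then keep an equal share of the budget for every class seen so far.
        pool = self.exemplars + list(samples)
        by_class = {}
        for x, y in pool:
            by_class.setdefault(y, []).append((x, y))
        per_class = max(1, self.capacity // len(by_class))
        self.exemplars = []
        for items in by_class.values():
            random.shuffle(items)
            self.exemplars.extend(items[:per_class])

    def replay(self):
        # Exemplars to mix into the next task's training batches.
        return list(self.exemplars)
```

With a budget of 4 and three classes arriving over two phases, the buffer shrinks its per-class share as new classes appear, which is exactly the pressure the limited-exemplar setting creates.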