Deep neural networks (DNNs) often suffer from "catastrophic forgetting" during incremental learning (IL): an abrupt degradation of performance on the original set of classes when the …
In class-incremental learning, a learning agent faces a stream of data with the goal of learning new classes while not forgetting previous ones. Neural networks are known to …
Deep learning models tend to forget their earlier knowledge while incrementally learning new tasks. This behavior emerges because the parameter updates optimized for the new …
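The overwriting effect these abstracts describe can be reproduced with a toy model. The sketch below is purely illustrative (the tasks, learning rate, and one-parameter model are assumptions, not taken from any of the cited papers): a linear model is fit to task A, then to a conflicting task B, and its task-A error is measured before and after.

```python
def grad_step(w, x, y, lr=0.1):
    # One gradient-descent step on the squared error of y_hat = w * x.
    return w - lr * 2 * x * (w * x - y)

w = 0.0

# Task A: learn the mapping x=1 -> y=1 (optimum w = 1).
for _ in range(100):
    w = grad_step(w, 1.0, 1.0)
loss_A_before = (w * 1.0 - 1.0) ** 2   # near zero after training on A

# Task B: learn the conflicting mapping x=1 -> y=-1 (optimum w = -1).
# The updates for B overwrite the parameter learned for A.
for _ in range(100):
    w = grad_step(w, 1.0, -1.0)
loss_A_after = (w * 1.0 - 1.0) ** 2    # task-A error has collapsed back up
```

Because both tasks share the single parameter, optimizing for B necessarily destroys the solution for A, which is the forgetting mechanism in miniature.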
In class-incremental learning, the model is expected to learn new classes continually while maintaining knowledge on previous classes. The challenge here lies in preserving the …
K Zhu, W Zhai, Y Cao, J Luo… - Proceedings of the IEEE …, 2022 - openaccess.thecvf.com
Non-exemplar class-incremental learning aims to recognize both the old and new classes when old-class samples cannot be stored. It is a challenging task since representation …
Despite the impressive performance in many individual tasks, deep neural networks suffer from catastrophic forgetting when learning new tasks incrementally. Recently, various …
G Wu, S Gong, P Li - Proceedings of the IEEE/CVF …, 2021 - openaccess.thecvf.com
Class-incremental learning (CIL) aims at continuously updating a trained model with new classes (plasticity) without forgetting previously learned old ones (stability). Contemporary …
Y Liu, B Schiele, Q Sun - … of the IEEE/CVF conference on …, 2021 - openaccess.thecvf.com
Class-Incremental Learning (CIL) aims to learn a classification model with the number of classes increasing phase by phase. An inherent problem in CIL is the stability …
M Kang, J Park, B Han - … of the IEEE/CVF conference on …, 2022 - openaccess.thecvf.com
We present a novel class incremental learning approach based on deep neural networks, which continually learns new tasks with limited memory for storing examples in the previous …
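The "limited memory for storing examples" mentioned above is typically realized as a fixed-budget exemplar buffer that is rehearsed alongside new-class data. The following is a hypothetical minimal sketch, not the method of any cited paper: the `ExemplarMemory` class and its random subsampling (standing in for herding-style selection as used in iCaRL) are assumptions for illustration.

```python
import random

class ExemplarMemory:
    """Fixed-budget exemplar buffer for class-incremental learning.

    Keeps an equal share of the total budget for every class seen so
    far; random subsampling is used here in place of more careful
    exemplar selection schemes.
    """

    def __init__(self, budget):
        self.budget = budget   # maximum total number of stored exemplars
        self.store = {}        # class label -> list of retained samples

    def add_class(self, label, samples):
        # Register a newly learned class, then shrink shares to fit.
        self.store[label] = list(samples)
        self._rebalance()

    def _rebalance(self):
        # Re-split the budget evenly across all classes seen so far.
        per_class = self.budget // len(self.store)
        for label, samples in self.store.items():
            if len(samples) > per_class:
                self.store[label] = random.sample(samples, per_class)

    def replay_batch(self, k):
        # Draw a rehearsal batch mixing exemplars of all stored classes.
        pool = [(x, y) for y, xs in self.store.items() for x in xs]
        return random.sample(pool, min(k, len(pool)))

# Usage: a budget of 10 exemplars shared across 5 incrementally added classes.
mem = ExemplarMemory(budget=10)
for c in range(5):
    mem.add_class(c, [f"x{c}_{i}" for i in range(20)])
```

The key design point is that the per-class share shrinks as classes accumulate, so total memory stays constant while every old class keeps at least a few representatives for rehearsal.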