The ability to learn new concepts continually is necessary in this ever-changing world. However, deep neural networks suffer from catastrophic forgetting when learning new …
YM Tang, YX Peng, WS Zheng - Proceedings of the IEEE …, 2022 - openaccess.thecvf.com
Deep neural networks (DNNs) suffer from catastrophic forgetting when learning incrementally, which greatly limits their applications. Although maintaining a handful of …
G Petit, A Popescu, H Schindler… - Proceedings of the …, 2023 - openaccess.thecvf.com
Exemplar-free class-incremental learning is very challenging due to the negative effect of catastrophic forgetting. A balance between stability and plasticity of the incremental process …
JT Zhai, X Liu, AD Bagdanov, K Li… - Proceedings of the …, 2023 - openaccess.thecvf.com
Class-Incremental Learning (CIL) aims to sequentially learn new classes while avoiding catastrophic forgetting of previous knowledge. We propose to use Masked …
In class-incremental learning, a learning agent faces a stream of data with the goal of learning new classes while not forgetting previous ones. Neural networks are known to …
We propose a causal framework to explain the catastrophic forgetting in Class-Incremental Learning (CIL) and then derive a novel distillation method that is orthogonal to the existing …
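The snippet above mentions a distillation method for CIL but is truncated before describing it. As a generic illustration only (this is standard knowledge distillation, not the paper's specific causal derivation; all names below are assumptions), a distillation term keeps the new model's temperature-softened predictions close to those of the frozen previous-task model:

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """Cross-entropy between the frozen old model's (teacher) softened
    outputs and the new model's (student) softened outputs.
    Minimised alongside the usual classification loss on new classes."""
    p_teacher = softmax(teacher_logits, T)
    log_p_student = np.log(softmax(student_logits, T) + 1e-12)
    # T^2 rescaling keeps gradient magnitudes comparable across temperatures
    return float(-(p_teacher * log_p_student).sum(axis=-1).mean() * T * T)
```

Matching the teacher exactly minimises the term for a fixed teacher, so the loss penalises the student only for drifting away from the old model's behaviour on previously learned classes.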
G Wu, S Gong, P Li - Proceedings of the IEEE/CVF …, 2021 - openaccess.thecvf.com
Class-incremental learning (CIL) aims at continuously updating a trained model with new classes (plasticity) without forgetting previously learned old ones (stability). Contemporary …
We introduce an approach for incremental learning that preserves feature descriptors of training images from previously learned classes, instead of the images themselves, unlike …
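The idea of storing feature descriptors of old classes rather than raw images can be sketched as follows. This is a minimal illustrative memory keeping one mean-feature prototype per class with nearest-prototype classification; the class and method names are assumptions for illustration, not the paper's API:

```python
import numpy as np

class PrototypeMemory:
    """Keeps one mean feature vector (prototype) per learned class,
    instead of retaining the training images themselves."""

    def __init__(self):
        self.prototypes = {}  # class label -> mean feature vector

    def add_class(self, label, features):
        """features: (n_samples, d) array of descriptors for this class,
        e.g. extracted by a frozen backbone."""
        self.prototypes[label] = np.asarray(features).mean(axis=0)

    def predict(self, feature):
        """Nearest-prototype classification over all classes seen so far."""
        labels = list(self.prototypes)
        dists = [np.linalg.norm(feature - self.prototypes[l]) for l in labels]
        return labels[int(np.argmin(dists))]
```

Storing a d-dimensional vector per class is far cheaper than storing exemplar images, which is why descriptor- or prototype-based memories are a common exemplar-free alternative in incremental learning.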
K Zhu, K Zheng, R Feng, D Zhao… - Proceedings of the …, 2023 - openaccess.thecvf.com
Non-exemplar class-incremental learning aims to recognize both the old and new classes without access to old class samples. The conflict between old and new class optimization is …