Deep learning models tend to forget their earlier knowledge while incrementally learning new tasks. This behavior emerges because the parameter updates optimized for the new …
J He, R Mao, Z Shao, F Zhu - Proceedings of the IEEE/CVF …, 2020 - openaccess.thecvf.com
Modern deep learning approaches have achieved great success in many vision applications by training a model using all available task-specific data. However, there are two major …
G Wu, S Gong, P Li - Proceedings of the IEEE/CVF …, 2021 - openaccess.thecvf.com
Class-incremental learning (CIL) aims at continuously updating a trained model with new classes (plasticity) without forgetting previously learned old ones (stability). Contemporary …
Class-incremental learning (CIL) has been widely studied under the setting of starting from a small number of classes (base classes). Instead, we explore an understudied real-world …
DW Zhou, HL Sun, HJ Ye… - Proceedings of the IEEE …, 2024 - openaccess.thecvf.com
Class-Incremental Learning (CIL) requires a learning system to continually learn new classes without forgetting. Despite the strong performance of Pre-Trained Models …
Y Wu, Y Chen, L Wang, Y Ye, Z Liu… - Proceedings of the …, 2019 - openaccess.thecvf.com
Modern machine learning suffers from catastrophic forgetting when learning new classes incrementally. The performance dramatically degrades due to the missing data of old …
Conventionally, deep neural networks are trained offline, relying on a large dataset prepared in advance. This paradigm is often challenged in real-world applications, e.g., online …
Deep learning systems typically suffer from catastrophic forgetting of past knowledge when acquiring new skills continually. In this paper, we emphasize two dilemmas, representation …
Despite the impressive performance in many individual tasks, deep neural networks suffer from catastrophic forgetting when learning new tasks incrementally. Recently, various …