A comprehensive survey of forgetting in deep learning beyond continual learning

Z Wang, E Yang, L Shen… - IEEE Transactions on …, 2024 - ieeexplore.ieee.org
Forgetting refers to the loss or deterioration of previously acquired knowledge. While
existing surveys on forgetting have primarily focused on continual learning, forgetting is a …

A survey on few-shot class-incremental learning

S Tian, L Li, W Li, H Ran, X Ning, P Tiwari - Neural Networks, 2024 - Elsevier
Large deep learning models are impressive, but they struggle when real-time data is not
available. Few-shot class-incremental learning (FSCIL) poses a significant challenge for …

Revisiting class-incremental learning with pre-trained models: Generalizability and adaptivity are all you need

DW Zhou, ZW Cai, HJ Ye, DC Zhan, Z Liu - International Journal of …, 2024 - Springer
Class-incremental learning (CIL) aims to adapt to emerging new classes without forgetting
old ones. Traditional CIL models are trained from scratch to continually acquire knowledge …

Class-incremental learning: A survey

DW Zhou, QW Wang, ZH Qi, HJ Ye… - IEEE Transactions on …, 2024 - ieeexplore.ieee.org
Deep models, e.g., CNNs and Vision Transformers, have achieved impressive results
in many vision tasks in the closed world. However, novel classes emerge from time to time in …

RanPAC: Random projections and pre-trained models for continual learning

MD McDonnell, D Gong, A Parvaneh… - Advances in …, 2024 - proceedings.neurips.cc
Continual learning (CL) aims to incrementally learn different tasks (such as classification) in
a non-stationary data stream without forgetting old ones. Most CL works focus on tackling …

A model or 603 exemplars: Towards memory-efficient class-incremental learning

DW Zhou, QW Wang, HJ Ye, DC Zhan - arXiv preprint arXiv:2205.13218, 2022 - arxiv.org
Real-world applications require the classification model to adapt to new classes without
forgetting old ones. Correspondingly, Class-Incremental Learning (CIL) aims to train a …

CTP: Towards vision-language continual pretraining via compatible momentum contrast and topology preservation

H Zhu, Y Wei, X Liang, C Zhang… - Proceedings of the …, 2023 - openaccess.thecvf.com
Vision-Language Pretraining (VLP) has shown impressive results on diverse
downstream tasks by offline training on large-scale datasets. Regarding the growing nature …

Generative Multi-modal Models are Good Class Incremental Learners

X Cao, H Lu, L Huang, X Liu… - Proceedings of the …, 2024 - openaccess.thecvf.com
In class incremental learning (CIL) scenarios, the phenomenon of catastrophic forgetting
caused by the classifier's bias towards the current task has long posed a significant …

Expandable subspace ensemble for pre-trained model-based class-incremental learning

DW Zhou, HL Sun, HJ Ye… - Proceedings of the IEEE …, 2024 - openaccess.thecvf.com
Class-Incremental Learning (CIL) requires a learning system to continually learn
new classes without forgetting. Despite the strong performance of Pre-Trained Models …