A comprehensive survey of forgetting in deep learning beyond continual learning

Z Wang, E Yang, L Shen… - IEEE Transactions on …, 2024 - ieeexplore.ieee.org
Forgetting refers to the loss or deterioration of previously acquired knowledge. While
existing surveys on forgetting have primarily focused on continual learning, forgetting is a …

Revisiting class-incremental learning with pre-trained models: Generalizability and adaptivity are all you need

DW Zhou, ZW Cai, HJ Ye, DC Zhan, Z Liu - International Journal of …, 2024 - Springer
Class-incremental learning (CIL) aims to adapt to emerging new classes without forgetting
old ones. Traditional CIL models are trained from scratch to continually acquire knowledge …

Class-incremental learning: A survey

DW Zhou, QW Wang, ZH Qi, HJ Ye… - IEEE Transactions on …, 2024 - ieeexplore.ieee.org
Deep models, e.g., CNNs and Vision Transformers, have achieved impressive results
in many vision tasks in the closed world. However, novel classes emerge from time to time in …

Few-shot class-incremental learning by sampling multi-phase tasks

DW Zhou, HJ Ye, L Ma, D Xie, S Pu… - IEEE Transactions on …, 2022 - ieeexplore.ieee.org
New classes arise frequently in our ever-changing world, e.g., emerging topics in social
media and new types of products in e-commerce. A model should recognize new classes …

Expandable subspace ensemble for pre-trained model-based class-incremental learning

DW Zhou, HL Sun, HJ Ye… - Proceedings of the IEEE …, 2024 - openaccess.thecvf.com
Class-Incremental Learning (CIL) requires a learning system to continually learn
new classes without forgetting. Despite the strong performance of Pre-Trained Models …

FeCAM: Exploiting the heterogeneity of class distributions in exemplar-free continual learning

D Goswami, Y Liu, B Twardowski… - Advances in Neural …, 2024 - proceedings.neurips.cc
Exemplar-free class-incremental learning (CIL) poses several challenges since it prohibits
the rehearsal of data from previous tasks and thus suffers from catastrophic forgetting …

Continuous transfer of neural network representational similarity for incremental learning

S Tian, W Li, X Ning, H Ran, H Qin, P Tiwari - Neurocomputing, 2023 - Elsevier
The incremental learning paradigm in machine learning has consistently been a focus of
academic research. It is similar to the way in which biological systems learn, and reduces …

BEEF: Bi-compatible class-incremental learning via energy-based expansion and fusion

FY Wang, DW Zhou, L Liu, HJ Ye, Y Bian… - The eleventh …, 2022 - drive.google.com
Neural networks suffer from catastrophic forgetting when sequentially learning tasks phase-
by-phase, making them inapplicable in dynamically updated systems. Class-incremental …

Catastrophic forgetting in deep learning: A comprehensive taxonomy

EL Aleixo, JG Colonna, M Cristo… - arXiv preprint arXiv …, 2023 - arxiv.org
Deep Learning models have achieved remarkable performance in tasks such as image
classification or generation, often surpassing human accuracy. However, they can struggle …