SCREAM: Knowledge sharing and compact representation for class incremental learning

Z Feng, M Zhou, Z Gao, A Stefanidis, Z Sui - Information Processing & …, 2024 - Elsevier
Methods based on dynamic structures are effective in addressing catastrophic forgetting in
class-incremental learning (CIL). However, they often isolate sub-networks and overlook the …

A class-incremental learning method for SAR images based on self-sustainment guidance representation

Q Pan, K Liao, X He, Z Bu, J Huang - Remote Sensing, 2023 - mdpi.com
Existing deep learning algorithms for synthetic aperture radar (SAR) image recognition are
trained with offline data. These methods must use all data to retrain the entire model …

Class-Incremental Learning: A Survey

DW Zhou, QW Wang, ZH Qi, HJ Ye… - IEEE Transactions on …, 2024 - ieeexplore.ieee.org
Deep models, e.g., CNNs and Vision Transformers, have achieved impressive results
in many vision tasks in the closed world. However, novel classes emerge from time to time in …

Dual Balanced Class-Incremental Learning With im-Softmax and Angular Rectification

R Zhi, Y Meng, J Hou, J Wan - IEEE Transactions on Neural …, 2024 - ieeexplore.ieee.org
Owing to their superior performance, exemplar-based methods with knowledge distillation
(KD) are widely applied in class-incremental learning (CIL). However, they suffer from two …

DyCR: A Dynamic Clustering and Recovering Network for Few-Shot Class-Incremental Learning

Z Pan, X Yu, M Zhang, W Zhang… - IEEE Transactions on …, 2024 - ieeexplore.ieee.org
Few-shot class-incremental learning (FSCIL) aims to continually learn novel data with
limited samples. One of the major challenges is the catastrophic forgetting problem of old …

Continual Learning With Unknown Task Boundary

X Zhu, J Yi, L Zhang - IEEE Transactions on Neural Networks …, 2024 - ieeexplore.ieee.org
Most existing studies on continual learning (CL) consider the task-based setting, where task
boundaries are known to learners during training. However, this setting may be impractical for real …

Class Incremental Learning With Deep Contrastive Learning and Attention Distillation

J Zhu, G Luo, B Duan, Y Zhu - IEEE Signal Processing Letters, 2024 - ieeexplore.ieee.org
Class-incremental learning addresses catastrophic forgetting, which arises when a trained
model learns a new task and may forget part of its previous knowledge …

NLOCL: Noise-Labeled Online Continual Learning

K Cheng, Y Ma, G Wang, L Zong, X Liu - Electronics, 2024 - mdpi.com
Continual learning (CL) from infinite data streams has become a challenge for neural
network models in real-world scenarios. Catastrophic forgetting of previous knowledge …

Incremental Recognition of Multi-Style Tibetan Character Based on Transfer Learning

G Zhao, W Wang, X Wang, X Bao, H Li, M Liu - IEEE Access, 2024 - ieeexplore.ieee.org
Tibetan script possesses a distinctive artistic form of writing, intricate glyph structures, and
diverse stylistic variations. In the task of text recognition, effectively handling the recognition …

Enhancing Consistency and Mitigating Bias: A Data Replay Approach for Incremental Learning

C Wang, J Jiang, X Hu, X Liu, X Ji - arXiv preprint arXiv:2401.06548, 2024 - arxiv.org
Deep learning systems are prone to catastrophic forgetting when learning from a sequence
of tasks, where old data from previous tasks is unavailable when learning from a new …