A comprehensive survey of continual learning: theory, method and application

L Wang, X Zhang, H Su, J Zhu - IEEE Transactions on Pattern …, 2024 - ieeexplore.ieee.org
To cope with real-world dynamics, an intelligent system needs to incrementally acquire,
update, accumulate, and exploit knowledge throughout its lifetime. This ability, known as …

A comprehensive survey of forgetting in deep learning beyond continual learning

Z Wang, E Yang, L Shen… - IEEE Transactions on …, 2024 - ieeexplore.ieee.org
Forgetting refers to the loss or deterioration of previously acquired knowledge. While
existing surveys on forgetting have primarily focused on continual learning, forgetting is a …

Incorporating neuro-inspired adaptability for continual learning in artificial intelligence

L Wang, X Zhang, Q Li, M Zhang, H Su, J Zhu… - Nature Machine …, 2023 - nature.com
Continual learning aims to empower artificial intelligence with strong adaptability to the real
world. For this purpose, a desirable solution should properly balance memory stability with …

Forget-free continual learning with winning subnetworks

H Kang, RJL Mina, SRH Madjid… - International …, 2022 - proceedings.mlr.press
Inspired by the Lottery Ticket Hypothesis, which posits that competitive subnetworks exist within a
dense network, we propose a continual learning method referred to as Winning …

Data augmented flatness-aware gradient projection for continual learning

E Yang, L Shen, Z Wang, S Liu… - Proceedings of the …, 2023 - openaccess.thecvf.com
The goal of continual learning (CL) is to continuously learn new tasks without forgetting
previously learned old tasks. To alleviate catastrophic forgetting, gradient projection based …

Understanding collapse in non-contrastive siamese representation learning

AC Li, AA Efros, D Pathak - European Conference on Computer Vision, 2022 - Springer
Contrastive methods have led a recent surge in the performance of self-supervised
representation learning (SSL). Recent methods like BYOL or SimSiam purportedly distill …

Self-evolved dynamic expansion model for task-free continual learning

F Ye, AG Bors - Proceedings of the IEEE/CVF International …, 2023 - openaccess.thecvf.com
Task-Free Continual Learning (TFCL) aims to learn new concepts from a stream of
data without any task information. The Dynamic Expansion Model (DEM) has shown …

Class-conditional sharpness-aware minimization for deep long-tailed recognition

Z Zhou, L Li, P Zhao, PA Heng… - Proceedings of the …, 2023 - openaccess.thecvf.com
It is widely acknowledged that deep learning models with flatter minima in their loss landscape
tend to generalize better. However, this property is under-explored in deep long-tailed …

Learning without forgetting for vision-language models

DW Zhou, Y Zhang, J Ning, HJ Ye, DC Zhan… - arXiv preprint arXiv …, 2023 - arxiv.org
Class-Incremental Learning (CIL), or continual learning, is a desired capability in the real
world; it requires a learning system to adapt to new tasks without forgetting former ones …

A unified approach to domain incremental learning with memory: Theory and algorithm

H Shi, H Wang - Advances in Neural Information Processing …, 2024 - proceedings.neurips.cc
Domain incremental learning aims to adapt to a sequence of domains with access
to only a small subset of data (i.e., memory) from previous domains. Various methods have …