Z Wang, E Yang, L Shen… - IEEE Transactions on …, 2024 - ieeexplore.ieee.org
Forgetting refers to the loss or deterioration of previously acquired knowledge. While existing surveys on forgetting have primarily focused on continual learning, forgetting is a …
Continual learning aims to empower artificial intelligence with strong adaptability to the real world. For this purpose, a desirable solution should properly balance memory stability with …
Inspired by the Lottery Ticket Hypothesis, which posits that competitive subnetworks exist within a dense network, we propose a continual learning method referred to as Winning …
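To make the subnetwork idea concrete, here is a minimal sketch of lottery-ticket-style mask selection: keep only the largest-magnitude weights of a layer and let gradients flow only inside that subnetwork. The sparsity level, helper name, and gradient-hook scheme are illustrative assumptions, not the cited paper's implementation.

```python
# Illustrative sketch only: a per-layer binary weight mask in the spirit of
# lottery-ticket-style "winning subnetwork" methods. The function name and
# sparsity value are assumptions, not the paper's code.
import torch

def top_k_mask(weight: torch.Tensor, sparsity: float = 0.5) -> torch.Tensor:
    """Keep the largest-magnitude fraction (1 - sparsity) of weights."""
    k = max(1, int(weight.numel() * (1.0 - sparsity)))
    threshold = weight.abs().flatten().kthvalue(weight.numel() - k + 1).values
    return (weight.abs() >= threshold).float()

# Usage: restrict learning to the selected subnetwork by masking gradients.
layer = torch.nn.Linear(128, 64)
mask = top_k_mask(layer.weight.data, sparsity=0.8)
layer.weight.register_hook(lambda grad: grad * mask)  # only masked-in weights keep receiving gradient
```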
E Yang, L Shen, Z Wang, S Liu… - Proceedings of the …, 2023 - openaccess.thecvf.com
The goal of continual learning (CL) is to continuously learn new tasks without forgetting previously learned tasks. To alleviate catastrophic forgetting, gradient projection-based …
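A gradient projection step of the kind referenced here can be sketched as follows: flatten the new task's gradient and remove its component lying inside a stored basis of directions important to old tasks. The random QR basis below is only a stand-in; the cited work constructs the basis from old-task data.

```python
# Minimal sketch (not the cited paper's code): project a new task's gradient
# onto the orthogonal complement of a basis M spanning directions important
# to old tasks, so updates interfere less with what was learned before.
import torch

def project_orthogonal(grad: torch.Tensor, basis: torch.Tensor) -> torch.Tensor:
    """grad: (d,) flattened gradient; basis: (d, k) with orthonormal columns."""
    return grad - basis @ (basis.T @ grad)

d, k = 1000, 20
basis, _ = torch.linalg.qr(torch.randn(d, k))   # stand-in for a stored old-task basis
grad = torch.randn(d)
projected = project_orthogonal(grad, basis)
assert torch.allclose(basis.T @ projected, torch.zeros(k), atol=1e-4)
```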
Contrastive methods have led a recent surge in the performance of self-supervised representation learning (SSL). Recent methods like BYOL or SimSiam purportedly distill …
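As a reference point for the stop-gradient mechanism used by SimSiam-like methods, a minimal sketch of the negative-cosine objective is given below; the toy linear encoder and predictor are placeholders, not the papers' architectures.

```python
# Rough sketch of a SimSiam-style stop-gradient objective; all module sizes
# and the toy encoder/predictor are assumptions for illustration.
import torch
import torch.nn.functional as F

def simsiam_loss(p1, z2, p2, z1):
    """Symmetrized negative cosine similarity with stop-gradient on the target branch."""
    return -(F.cosine_similarity(p1, z2.detach(), dim=-1).mean()
             + F.cosine_similarity(p2, z1.detach(), dim=-1).mean()) / 2

encoder = torch.nn.Linear(32, 16)      # toy stand-in for a backbone + projector
predictor = torch.nn.Linear(16, 16)    # toy stand-in for the prediction head
view1, view2 = torch.randn(8, 32), torch.randn(8, 32)   # two augmented views of a batch
z1, z2 = encoder(view1), encoder(view2)
p1, p2 = predictor(z1), predictor(z2)
loss = simsiam_loss(p1, z2, p2, z1)
```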
F Ye, AG Bors - Proceedings of the IEEE/CVF International …, 2023 - openaccess.thecvf.com
Task-Free Continual Learning (TFCL) aims to learn new concepts from a stream of data without any task information. The Dynamic Expansion Model (DEM) has shown …
It is widely acknowledged that deep learning models with flatter minima in their loss landscapes tend to generalize better. However, this property is under-explored in deep long-tailed …
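One common way to bias training toward such flatter minima is a sharpness-aware (SAM-style) two-pass update; the sketch below follows the generic SAM recipe and is not the specific long-tailed method summarized above. The perturbation radius rho is an assumed default.

```python
# Hedged sketch of a sharpness-aware update: perturb the weights toward the
# locally worst-case direction, then take the real step using the gradient
# computed at that perturbed point.
import torch

def sam_step(model, loss_fn, inputs, targets, optimizer, rho=0.05):
    # First pass: gradients at the current weights.
    loss_fn(model(inputs), targets).backward()
    params = [p for p in model.parameters() if p.grad is not None]
    grad_norm = torch.norm(torch.stack([p.grad.norm() for p in params]))
    eps = []
    with torch.no_grad():
        for p in params:
            e = rho * p.grad / (grad_norm + 1e-12)
            p.add_(e)            # climb to the nearby "sharpest" point
            eps.append(e)
    optimizer.zero_grad()
    # Second pass: gradient at the perturbed point drives the actual update.
    loss_fn(model(inputs), targets).backward()
    with torch.no_grad():
        for p, e in zip(params, eps):
            p.sub_(e)            # restore the original weights
    optimizer.step()
    optimizer.zero_grad()
```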
Class-Incremental Learning (CIL), or continual learning, is a capability desired in the real world, requiring a learning system to adapt to new tasks without forgetting former ones …
H Shi, H Wang - Advances in Neural Information Processing …, 2024 - proceedings.neurips.cc
Domain incremental learning aims to adapt to a sequence of domains with access to only a small subset of data (i.e., memory) from previous domains. Various methods have …
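The "small subset of data (i.e., memory)" typically denotes a replay buffer; below is a minimal reservoir-sampling buffer as one plausible realization, with the capacity and sampling rule chosen purely for illustration rather than taken from the cited work.

```python
# Tiny sketch of a replay memory for incremental learning, using reservoir
# sampling so every seen example has an equal chance of being retained.
import random

class ReplayMemory:
    def __init__(self, capacity: int = 200):
        self.capacity = capacity
        self.buffer = []
        self.seen = 0

    def add(self, example):
        self.seen += 1
        if len(self.buffer) < self.capacity:
            self.buffer.append(example)
        else:
            j = random.randrange(self.seen)
            if j < self.capacity:
                self.buffer[j] = example   # replace a stored example uniformly at random

    def sample(self, k: int):
        return random.sample(self.buffer, min(k, len(self.buffer)))
```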