Deep models, e.g., CNNs and Vision Transformers, have achieved impressive results in many vision tasks in the closed world. However, novel classes emerge from time to time in …
MD McDonnell, D Gong, A Parvaneh… - Advances in …, 2024 - proceedings.neurips.cc
Continual learning (CL) aims to incrementally learn different tasks (such as classification) in a non-stationary data stream without forgetting old ones. Most CL works focus on tackling …
Foundation models, including Vision Language Models (VLMs) and Large Language Models (LLMs), possess the generality to handle diverse distributions and tasks, which …
Y Ge, Y Li, S Ni, J Zhao… - Proceedings of the …, 2023 - openaccess.thecvf.com
Continual learning aims to emulate the human ability to continually accumulate knowledge over sequential tasks. The main challenge is to maintain performance on previously learned …
H Zhang, Y Wu, D Li, S Yang, R Zhao, Y Jiang… - arXiv preprint arXiv …, 2024 - arxiv.org
Aligned Large Language Models (LLMs) showcase remarkable versatility and are capable of handling diverse real-world tasks. Meanwhile, aligned LLMs are also expected to exhibit …
Representation forgetting refers to the drift of contextualized representations during continual training. Intuitively, representation forgetting can influence the general …
Pre-trained language models (PLMs) have become a prevalent technique in deep learning for code, utilizing a two-stage pre-training and fine-tuning procedure to acquire general …
Continual learning requires a model to adapt to ongoing changes in the data distribution, and often to the set of tasks to be performed. It is rare, however, that the data and task …
S Chen, M Zhang, J Zhang, K Huang - European Conference on Computer …, 2025 - Springer
Continual Learning (CL) requires a model to retain previously learned knowledge while learning new tasks. Recently, experience replay-based methods have made significant …