Z Ke, B Liu - arXiv preprint arXiv:2211.12701, 2022 - arxiv.org
Continual learning (CL) is a learning paradigm that emulates the human capability of learning and accumulating knowledge continually without forgetting the previously learned …
Deep models, e.g., CNNs and Vision Transformers, have achieved impressive results in many vision tasks in the closed world. However, novel classes emerge from time to time in …
Large language models (LLMs) have been transformative. They are foundation models pretrained with self-supervision that can be adapted through fine-tuning to a wide range of …
R Gao, W Liu - International Conference on Machine …, 2023 - proceedings.mlr.press
Popular deep-learning models in the field of image classification suffer from catastrophic forgetting—models will forget previously acquired skills when learning new ones …
The field of Continual Learning investigates the ability to learn consecutive tasks without losing performance on those previously learned. The efforts of researchers have …
Real-time on-device continual learning is needed for new applications such as home robots, user personalization on smartphones, and augmented/virtual reality headsets. However, this …
Lifelong learning—an agent's ability to learn throughout its lifetime—is a hallmark of biological learning systems and a central challenge for artificial intelligence (AI). The …
Artificial neural networks are known to suffer from catastrophic forgetting: when learning multiple tasks sequentially, they perform well on the most recent task at the expense of …
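The forgetting effect described above can be reproduced in a few lines. The following is a minimal sketch, not any paper's method: a logistic-regression model (the simplest "network") is trained sequentially on two hypothetical toy tasks over the same 2-D inputs, where task A's label depends only on the first coordinate and task B's only on the second. After fitting task B with no access to task A's data, accuracy on task A collapses toward chance.

```python
import math

# 16-point grid of 2-D inputs shared by both toy tasks (hypothetical data).
grid = [(x0, x1) for x0 in (-1.0, -0.5, 0.5, 1.0) for x1 in (-1.0, -0.5, 0.5, 1.0)]
task_a = [(x, 1 if x[0] > 0 else 0) for x in grid]  # label = sign of x0
task_b = [(x, 1 if x[1] > 0 else 0) for x in grid]  # label = sign of x1

w = [0.0, 0.0]  # shared weights, reused (and overwritten) across tasks
b = 0.0

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(data, epochs=1000, lr=1.0):
    """Full-batch gradient descent on the logistic loss."""
    global b
    for _ in range(epochs):
        g0 = g1 = gb = 0.0
        for (x0, x1), y in data:
            err = sigmoid(w[0] * x0 + w[1] * x1 + b) - y
            g0 += err * x0
            g1 += err * x1
            gb += err
        n = len(data)
        w[0] -= lr * g0 / n
        w[1] -= lr * g1 / n
        b -= lr * gb / n

def accuracy(data):
    return sum(((w[0] * x0 + w[1] * x1 + b) > 0) == (y == 1)
               for (x0, x1), y in data) / len(data)

train(task_a)
acc_a_before = accuracy(task_a)  # high: task A has just been learned
train(task_b)                    # sequential training, no rehearsal of task A
acc_a_after = accuracy(task_a)   # much lower: task A is largely forgotten
print(acc_a_before, acc_a_after)
```

Because fitting task B reuses the same weights, the gradient steps that make the second coordinate dominant simultaneously erase the first coordinate's influence; replay, regularization (e.g., weight-importance penalties), or parameter isolation are the standard families of remedies surveyed in the snippets above.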
In continual learning, a system must incrementally learn from a non-stationary data stream without catastrophic forgetting. Recently, multiple methods have been devised for …