CNLL: A semi-supervised approach for continual noisy label learning

N Karim, U Khalid, A Esmaeili… - Proceedings of the …, 2022 - openaccess.thecvf.com
The task of continual learning requires careful design of algorithms that can tackle
catastrophic forgetting. However, noisy labels, which are inevitable in real-world scenarios …

How Efficient Are Today's Continual Learning Algorithms?

MY Harun, J Gallardo, TL Hayes… - Proceedings of the …, 2023 - openaccess.thecvf.com
Supervised continual learning involves updating a deep neural network (DNN) from an ever-
growing stream of labeled data. While most work has focused on overcoming catastrophic …

Continual learning on noisy data streams via self-purified replay

CD Kim, J Jeong, S Moon… - Proceedings of the IEEE …, 2021 - openaccess.thecvf.com
Continually learning in the real world must overcome many challenges, among which noisy
labels are a common and inevitable issue. In this work, we present a replay-based continual …

ORDisCo: Effective and efficient usage of incremental unlabeled data for semi-supervised continual learning

L Wang, K Yang, C Li, L Hong… - Proceedings of the …, 2021 - openaccess.thecvf.com
Continual learning usually assumes the incoming data are fully labeled, which might not be
applicable in real applications. In this work, we consider semi-supervised continual learning …

Kaizen: Practical self-supervised continual learning with continual fine-tuning

CI Tang, L Qendro, D Spathis… - Proceedings of the …, 2024 - openaccess.thecvf.com
Self-supervised learning (SSL) has shown remarkable performance in computer vision tasks
when trained offline. However, in a Continual Learning (CL) scenario where new data is …

Online continual learning on a contaminated data stream with blurry task boundaries

J Bang, H Koh, S Park, H Song… - Proceedings of the …, 2022 - openaccess.thecvf.com
Learning under a continuously changing data distribution with incorrect labels is a desirable
yet challenging real-world problem. A large body of continual learning (CL) methods …

Continual learning by asymmetric loss approximation with single-side overestimation

D Park, S Hong, B Han, KM Lee - Proceedings of the IEEE …, 2019 - openaccess.thecvf.com
Catastrophic forgetting is a critical challenge in training deep neural networks. Although
continual learning has been investigated as a countermeasure to the problem, it often …

On generalizing beyond domains in cross-domain continual learning

C Simon, M Faraki, YH Tsai, X Yu… - Proceedings of the …, 2022 - openaccess.thecvf.com
In the real world, humans have the ability to accumulate new knowledge in any conditions.
However, deep learning suffers from the phenomenon of so-called catastrophic forgetting of the …

Learning to prompt for continual learning

Z Wang, Z Zhang, CY Lee, H Zhang… - Proceedings of the …, 2022 - openaccess.thecvf.com
The mainstream paradigm behind continual learning has been to adapt the model
parameters to non-stationary data distributions, where catastrophic forgetting is the central …

A closer look at rehearsal-free continual learning

JS Smith, J Tian, S Halbe, YC Hsu… - Proceedings of the …, 2023 - openaccess.thecvf.com
Continual learning is a setting where machine learning models learn novel concepts from
continuously shifting training data, while simultaneously avoiding degradation of knowledge …