A comprehensive survey of continual learning: theory, method and application

L Wang, X Zhang, H Su, J Zhu - IEEE Transactions on Pattern Analysis and Machine Intelligence, 2024 - ieeexplore.ieee.org
To cope with real-world dynamics, an intelligent system needs to incrementally acquire,
update, accumulate, and exploit knowledge throughout its lifetime. This ability, known as …

A comprehensive study of class incremental learning algorithms for visual tasks

E Belouadah, A Popescu, I Kanellos - Neural Networks, 2021 - Elsevier
The ability of artificial agents to increment their capabilities when confronted with new data is
an open challenge in artificial intelligence. The main challenge faced in such cases is …

Dataset distillation via factorization

S Liu, K Wang, X Yang, J Ye… - Advances in Neural Information Processing Systems, 2022 - proceedings.neurips.cc
In this paper, we study dataset distillation (DD) from a novel perspective and introduce
a dataset factorization approach, termed HaBa, which is a plug-and-play …
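To make the factorization idea concrete, here is a minimal PyTorch sketch of a bases-plus-hallucinators decomposition of the kind the abstract describes; the sizes, architecture, and names (`bases`, `hallucinators`, `synthesize`) are illustrative assumptions, not the paper's implementation:

```python
import torch
import torch.nn as nn

# Illustrative factorization: a few learnable "bases" are decoded by several
# small "hallucinator" networks, so the number of recoverable synthetic images
# is (num bases) x (num hallucinators) while far fewer parameters are stored.
num_bases, num_hall = 10, 5
bases = torch.randn(num_bases, 3, 32, 32, requires_grad=True)
hallucinators = nn.ModuleList(
    nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
                  nn.Conv2d(16, 3, 3, padding=1))
    for _ in range(num_hall))

def synthesize():
    # Every (basis, hallucinator) pair yields one synthetic training image.
    return torch.cat([h(bases) for h in hallucinators], dim=0)  # 50 images

syn_images = synthesize()
# The bases and hallucinator weights would then be optimized with any
# distillation objective (e.g., the gradient-matching loss sketched below).
```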

CAFE: Learning to condense dataset by aligning features

K Wang, B Zhao, X Peng, Z Zhu… - Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2022 - openaccess.thecvf.com
Dataset condensation aims at reducing the network training effort through condensing a
cumbersome training set into a compact synthetic one. State-of-the-art approaches largely …
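As a rough illustration of condensation by feature alignment, the PyTorch sketch below matches the mean feature maps of real and synthetic batches layer by layer; all sizes and names are assumed, and the actual CAFE method adds further components (such as a discrimination loss) that are omitted here:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Align mean features of real and synthetic batches at every network depth.
layers = nn.ModuleList([
    nn.Sequential(nn.Conv2d(1, 32, 3, padding=1), nn.ReLU()),
    nn.Sequential(nn.Conv2d(32, 64, 3, padding=1), nn.ReLU()),
])
real_x = torch.randn(256, 1, 28, 28)                    # stand-in real batch
syn_x = torch.randn(10, 1, 28, 28, requires_grad=True)  # learnable synthetic set
opt = torch.optim.SGD([syn_x], lr=0.1)

for step in range(100):
    loss, fr, fs = 0.0, real_x, syn_x
    for layer in layers:
        fr, fs = layer(fr), layer(fs)
        loss = loss + F.mse_loss(fs.mean(0), fr.mean(0).detach())
    opt.zero_grad(); loss.backward(); opt.step()
```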

Dataset condensation with distribution matching

B Zhao, H Bilen - Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision, 2023 - openaccess.thecvf.com
The computational cost of training state-of-the-art deep models in many learning problems is
rapidly increasing due to more sophisticated models and larger datasets. A recent promising …
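The distribution-matching idea can be sketched in a few lines: rather than matching training gradients, the synthetic set is optimized so its embedding distribution matches that of real data under randomly initialized networks. A minimal, assumption-laden PyTorch version (the actual method matches per class and uses a deeper ConvNet embedder):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

real_x = torch.randn(256, 1, 28, 28)                    # stand-in real data
syn_x = torch.randn(10, 1, 28, 28, requires_grad=True)  # learnable synthetic set
opt = torch.optim.SGD([syn_x], lr=1.0)

for step in range(100):
    # A freshly sampled random embedding network each iteration; no embedder
    # training is needed, which is what makes the approach cheap.
    embed = nn.Sequential(nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
                          nn.AdaptiveAvgPool2d(1), nn.Flatten())
    loss = F.mse_loss(embed(syn_x).mean(0), embed(real_x).mean(0).detach())
    opt.zero_grad(); loss.backward(); opt.step()
```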

Online continual learning in image classification: An empirical survey

Z Mai, R Li, J Jeong, D Quispe, H Kim, S Sanner - Neurocomputing, 2022 - Elsevier
Online continual learning for image classification studies the problem of learning to classify
images from an online stream of data and tasks, where tasks may include new classes …
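A common baseline in the online continual learning setting this survey studies is experience replay with a reservoir-sampled memory buffer; a minimal sketch of the buffer logic (plain Python, all names assumed):

```python
import random

class ReservoirBuffer:
    """Keeps a uniform random sample of the stream examples seen so far."""

    def __init__(self, capacity):
        self.capacity, self.seen, self.data = capacity, 0, []

    def add(self, example):
        self.seen += 1
        if len(self.data) < self.capacity:
            self.data.append(example)
        else:
            j = random.randrange(self.seen)   # classic reservoir sampling
            if j < self.capacity:
                self.data[j] = example

    def sample(self, k):
        return random.sample(self.data, min(k, len(self.data)))

# Per stream step: train on the incoming batch plus a batch replayed from
# the buffer, then add the incoming examples to the buffer.
```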

Dataset condensation with differentiable siamese augmentation

B Zhao, H Bilen - International Conference on Machine Learning, 2021 - proceedings.mlr.press
In many machine learning problems, large-scale datasets have become the de facto
standard for training state-of-the-art deep networks, at the price of a heavy computation load. In this …
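The key mechanism here is that one differentiable augmentation, sharing the same random parameters, is applied to both the real and the synthetic batch, so gradients still reach the synthetic pixels. A toy PyTorch sketch with a single random-rescale augmentation (the paper uses a richer augmentation family):

```python
import torch
import torch.nn.functional as F

def siamese_scale(x, seed):
    # The same seed yields the same scale factor, so real and synthetic
    # batches receive an identical, differentiable transform.
    g = torch.Generator().manual_seed(seed)
    s = 0.8 + 0.4 * torch.rand(1, generator=g).item()
    return F.interpolate(x, scale_factor=s, mode='bilinear', align_corners=False)

real_x = torch.randn(64, 3, 32, 32)
syn_x = torch.randn(10, 3, 32, 32, requires_grad=True)
seed = int(torch.randint(0, 2**31, (1,)))   # drawn fresh at every step
real_aug = siamese_scale(real_x, seed)
syn_aug = siamese_scale(syn_x, seed)        # gradients still flow into syn_x
# (real_aug, syn_aug) then enter a matching objective such as the
# gradient-matching loss of the next entry.
```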

Dataset condensation with gradient matching

B Zhao, KR Mopuri, H Bilen - arXiv preprint arXiv:2006.05929, 2020 - arxiv.org
As state-of-the-art machine learning methods in many fields rely on ever larger datasets,
storing these datasets and training models on them becomes significantly more expensive. This …
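Since gradient matching underlies several of the entries above, a compact PyTorch sketch may help: learnable synthetic images are optimized so that the network gradient they induce matches the gradient on real data. The model, sizes, and MSE distance are simplifying assumptions (the paper uses per-class batches, inner-loop model updates, and a cosine-style distance):

```python
import torch
import torch.nn.functional as F

net = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(28 * 28, 10))
real_x, real_y = torch.randn(64, 1, 28, 28), torch.randint(0, 10, (64,))
syn_x = torch.randn(10, 1, 28, 28, requires_grad=True)  # learnable pixels
syn_y = torch.arange(10)                                 # one image per class
opt = torch.optim.SGD([syn_x], lr=0.1)

for step in range(100):
    params = list(net.parameters())
    g_real = torch.autograd.grad(F.cross_entropy(net(real_x), real_y), params)
    g_syn = torch.autograd.grad(F.cross_entropy(net(syn_x), syn_y), params,
                                create_graph=True)       # keep graph to reach syn_x
    # Match the two gradient sets layer by layer and update the synthetic set.
    loss = sum(F.mse_loss(gs, gr.detach()) for gs, gr in zip(g_syn, g_real))
    opt.zero_grad(); loss.backward(); opt.step()
```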

Efficient dataset distillation using random feature approximation

N Loo, R Hasani, A Amini… - Advances in Neural Information Processing Systems, 2022 - proceedings.neurips.cc
Dataset distillation compresses large datasets into smaller synthetic coresets which retain
performance with the aim of reducing the storage and computational burden of processing …
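As a toy stand-in for the paper's neural-network-Gaussian-process machinery, the sketch below uses plain random ReLU features: kernel ridge regression is fit on the synthetic set in closed form, and the synthetic images are optimized so the fitted predictor does well on real data. All names and sizes are assumptions:

```python
import torch
import torch.nn.functional as F

def random_features(x, W):
    # phi(x) = relu(W x) / sqrt(m): a cheap random-feature kernel approximation.
    return torch.relu(x.flatten(1) @ W.T) / W.shape[0] ** 0.5

d, m = 28 * 28, 2048
W = torch.randn(m, d)                                  # fixed random projection
syn_x = torch.randn(10, 1, 28, 28, requires_grad=True)
syn_y = torch.eye(10)                                  # one-hot synthetic targets
real_x = torch.randn(256, 1, 28, 28)
real_y = torch.eye(10)[torch.randint(0, 10, (256,))]
opt = torch.optim.Adam([syn_x], lr=0.01)

for step in range(100):
    Zs = random_features(syn_x, W)                     # (10, m)
    K = Zs @ Zs.T + 1e-3 * torch.eye(10)               # regularized kernel matrix
    alpha = torch.linalg.solve(K, syn_y)               # closed-form ridge fit
    pred = random_features(real_x, W) @ Zs.T @ alpha   # predict real labels
    loss = F.mse_loss(pred, real_y)                    # distillation objective
    opt.zero_grad(); loss.backward(); opt.step()
```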

PLOP: Learning without forgetting for continual semantic segmentation

A Douillard, Y Chen, A Dapogny… - Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2021 - openaccess.thecvf.com
Deep learning approaches are nowadays ubiquitously used to tackle computer vision tasks
such as semantic segmentation, requiring large datasets and substantial computational …
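The "learning without forgetting" ingredient can be sketched generically: a frozen copy of the previous-step model constrains the current one through a distillation term. The PyTorch sketch below is a plain output-distillation loss, not PLOP's actual multi-scale Local POD loss or its pseudo-labeling; all names are assumed:

```python
import torch
import torch.nn.functional as F

def continual_seg_loss(new_model, old_model, x, y, seg_loss, lam=1.0):
    logits = new_model(x)                     # (B, C_new, H, W)
    with torch.no_grad():
        old_logits = old_model(x)             # (B, C_old, H, W), frozen model
    # Supervised loss on the current task plus a distillation term that keeps
    # the old-class predictions close to the previous model's outputs.
    distill = F.mse_loss(logits[:, : old_logits.shape[1]], old_logits)
    return seg_loss(logits, y) + lam * distill
```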