The ability of artificial agents to incrementally extend their capabilities when confronted with new data is an open challenge in artificial intelligence. The main challenge faced in such cases is …
In this paper, we study dataset distillation (DD) from a novel perspective and introduce a dataset factorization approach, termed HaBa, which is a plug-and-play …
Dataset condensation aims at reducing the network training effort through condensing a cumbersome training set into a compact synthetic one. State-of-the-art approaches largely …
B Zhao, H Bilen - Proceedings of the IEEE/CVF Winter …, 2023 - openaccess.thecvf.com
Computational cost of training state-of-the-art deep models in many learning problems is rapidly increasing due to more sophisticated models and larger datasets. A recent promising …
Online continual learning for image classification studies the problem of learning to classify images from an online stream of data and tasks, where tasks may include new classes …
B Zhao, H Bilen - International Conference on Machine …, 2021 - proceedings.mlr.press
In many machine learning problems, large-scale datasets have become the de-facto standard to train state-of-the-art deep networks at the price of heavy computation load. In this …
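The condensation idea summarized above can be sketched in miniature: learn a small synthetic set whose training gradient matches the gradient produced by the full dataset. The sketch below uses a plain linear regression model and NumPy with analytic gradients; the dataset sizes, learning rate, and single fixed model state are illustrative assumptions, not the paper's setup (which uses deep networks and averages over many model initializations).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "real" dataset: 200 points, 5 features, noisy linear targets.
d, n_real, n_syn = 5, 200, 10
w_true = rng.normal(size=d)
X_real = rng.normal(size=(n_real, d))
y_real = X_real @ w_true + 0.1 * rng.normal(size=n_real)

# Small synthetic set to be learned (sizes are hypothetical).
X_syn = rng.normal(size=(n_syn, d))
y_syn = X_syn @ w_true  # targets for the synthetic points, fixed after init

def grad_w(X, y, w):
    """Gradient of the mean squared-error loss w.r.t. linear weights w."""
    return X.T @ (X @ w - y) / len(y)

def matching_loss(X_s, w):
    """Squared distance between synthetic and real training gradients."""
    diff = grad_w(X_s, y_syn, w) - grad_w(X_real, y_real, w)
    return float(diff @ diff)

# One random model state; the full method would sample many of these.
w = rng.normal(size=d)
initial = matching_loss(X_syn, w)

lr = 0.02
for _ in range(500):
    r = X_syn @ w - y_syn                    # residuals on synthetic data
    g = X_syn.T @ r / n_syn                  # synthetic-data gradient
    e = g - grad_w(X_real, y_real, w)        # gradient mismatch
    # Analytic derivative of ||e||^2 w.r.t. X_syn for the linear model.
    dX = (2.0 / n_syn) * (np.outer(r, e) + np.outer(X_syn @ e, w))
    X_syn -= lr * dX                         # gradient step on the images

final = matching_loss(X_syn, w)
print(initial, final)  # the mismatch shrinks as X_syn is optimized
```

After optimization, training the model on the ten synthetic points pushes the weights in nearly the same direction as training on all two hundred real points, which is the core intuition behind condensing a dataset by gradient matching.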
As state-of-the-art machine learning methods in many fields rely on ever-larger datasets, storing datasets and training models on them become significantly more expensive. This …
Dataset distillation compresses large datasets into smaller synthetic coresets that retain performance, with the aim of reducing the storage and computational burden of processing …
A Douillard, Y Chen, A Dapogny… - Proceedings of the …, 2021 - openaccess.thecvf.com
Deep learning approaches are nowadays ubiquitously used to tackle computer vision tasks such as semantic segmentation, requiring large datasets and substantial computational …