S Lei, D Tao - IEEE Transactions on Pattern Analysis and …, 2023 - ieeexplore.ieee.org
Deep learning technology has advanced at an unprecedented pace over the last decade and has become the primary choice in many application domains. This progress is mainly attributed …
In this paper, we study dataset distillation (DD) from a novel perspective and introduce a dataset factorization approach, termed HaBa, which is a plug-and-play …
Dataset Distillation aims to distill an entire dataset's knowledge into a few synthetic images. The idea is to produce a small number of synthetic data points that, when given to a …
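The idea in this snippet can be illustrated with a deliberately simple toy sketch (not the method of any paper listed here, which optimize synthetic image pixels via gradient or trajectory matching): for a linear least-squares model, a handful of synthetic points labeled by the full-data model suffices for training to recover the same weights.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Large" original dataset: 10,000 points from a noisy linear model.
X = rng.normal(size=(10_000, 5))
w_true = rng.normal(size=5)
y = X @ w_true + 0.1 * rng.normal(size=10_000)

# Model trained on the full data (ordinary least squares).
w_full, *_ = np.linalg.lstsq(X, y, rcond=None)

# "Distilled" dataset: only 5 synthetic points. We pick synthetic inputs
# at random and label them with the full-data model, so a model trained
# on the tiny set recovers (here, exactly) the full-data weights.
X_syn = rng.normal(size=(5, 5))
y_syn = X_syn @ w_full

w_syn, *_ = np.linalg.lstsq(X_syn, y_syn, rcond=None)

print(np.allclose(w_full, w_syn))  # 5 points stand in for 10,000
```

Real dataset distillation methods face a much harder version of this problem, since neural-network training has no closed form; they instead optimize the synthetic data through a bilevel or matching objective.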
S Liu, J Ye, R Yu, X Wang - … of the IEEE/CVF Conference on …, 2023 - openaccess.thecvf.com
Dataset distillation, also known as dataset condensation, aims to compress a large dataset into a compact synthetic one. Existing methods perform dataset condensation by assuming a …
Dataset Distillation is a newly emerging area that aims to distill large datasets into much smaller and highly informative synthetic ones to accelerate training and reduce storage …
G Zhao, G Li, Y Qin, Y Yu - … of the IEEE/CVF Conference on …, 2023 - openaccess.thecvf.com
Dataset Condensation aims to condense a large dataset into a smaller one while maintaining its ability to train a well-performing model, thus reducing the storage cost and …
Y Liu, J Gu, K Wang, Z Zhu… - Proceedings of the …, 2023 - openaccess.thecvf.com
Dataset distillation aims to synthesize small datasets, with little information loss from the original large-scale ones, to reduce storage and training costs. Recent state-of-the-art methods …
Researchers have long tried to minimize training costs in deep learning while maintaining strong generalization across diverse datasets. Emerging research on dataset distillation …
The popularity of deep learning has led to the curation of a vast number of massive and multifarious datasets. Despite having close-to-human performance on individual tasks …