On the diversity and realism of distilled dataset: An efficient dataset distillation paradigm

P Sun, B Shi, D Yu, T Lin - … of the IEEE/CVF Conference on …, 2024 - openaccess.thecvf.com
Contemporary machine learning, which involves training large neural networks on massive
datasets, faces significant computational challenges. Dataset distillation, as a recent …

Spanning training progress: Temporal Dual-Depth Scoring (TDDS) for enhanced dataset pruning

X Zhang, J Du, Y Li, W Xie… - Proceedings of the IEEE …, 2024 - openaccess.thecvf.com
Dataset pruning aims to construct a coreset capable of achieving performance comparable
to the original full dataset. Most existing dataset pruning methods rely on snapshot-based …

ATOM: Attention Mixer for Efficient Dataset Distillation

S Khaki, A Sajedi, K Wang, LZ Liu… - Proceedings of the …, 2024 - openaccess.thecvf.com
Recent works in dataset distillation seek to minimize training expenses by generating a
condensed synthetic dataset that encapsulates the information present in a larger real …

MetaDD: Boosting Dataset Distillation with Neural Network Architecture-Invariant Generalization

Y Zhao, X Deng, X Su, H Xu, X Li, Y Liu… - arXiv preprint arXiv …, 2024 - arxiv.org
Dataset distillation (DD) entails creating a refined, compact distilled dataset from a large-
scale dataset to facilitate efficient training. A significant challenge in DD is the dependency …

Are Large-scale Soft Labels Necessary for Large-scale Dataset Distillation?

L Xiao, Y He - arXiv preprint arXiv:2410.15919, 2024 - arxiv.org
In ImageNet-condensation, the storage for auxiliary soft labels exceeds that of the
condensed dataset by over 30 times. However, are large-scale soft labels necessary for …

Condensed Sample-Guided Model Inversion for Knowledge Distillation

K Binici, S Aggarwal, C Acar, NT Pham… - arXiv preprint arXiv …, 2024 - arxiv.org
Knowledge distillation (KD) is a key element in neural network compression that allows
knowledge transfer from a pre-trained teacher model to a more compact student model. KD …

Data Pruning via Separability, Integrity, and Model Uncertainty-Aware Importance Sampling

S Grosz, R Zhao, R Ranjan, H Wang… - arXiv preprint arXiv …, 2024 - arxiv.org
This paper improves upon existing data pruning methods for image classification by
introducing a novel pruning metric and pruning procedure based on importance sampling …