X Zhang, J Du, Y Li, W Xie… - Proceedings of the IEEE …, 2024 - openaccess.thecvf.com
Dataset pruning aims to construct a coreset capable of achieving performance comparable to that of the original full dataset. Most existing dataset pruning methods rely on snapshot-based …
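A minimal sketch of the snapshot-based scoring this snippet alludes to, assuming per-example loss under a single trained checkpoint as the score and top-k selection as the pruning rule (both illustrative choices, not this paper's method):

```python
import torch
import torch.nn.functional as F

@torch.no_grad()
def snapshot_scores(model, loader, device="cpu"):
    """Score each training example by its loss under one trained
    model snapshot (loss-as-score is an illustrative assumption)."""
    model.eval().to(device)
    scores = []
    for x, y in loader:
        logits = model(x.to(device))
        # Per-example cross-entropy: higher loss = "harder" example.
        scores.append(F.cross_entropy(logits, y.to(device),
                                      reduction="none").cpu())
    return torch.cat(scores)

def prune_to_coreset(scores, keep_frac=0.3):
    # Keep the highest-scoring (hardest) examples as the coreset.
    k = int(keep_frac * len(scores))
    return torch.topk(scores, k).indices
```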
S Khaki, A Sajedi, K Wang, LZ Liu… - Proceedings of the …, 2024 - openaccess.thecvf.com
Recent works in dataset distillation seek to minimize training expenses by generating a condensed synthetic dataset that encapsulates the information present in a larger real …
Y Zhao, X Deng, X Su, H Xu, X Li, Y Liu… - arXiv preprint arXiv …, 2024 - arxiv.org
Dataset distillation (DD) entails creating a refined, compact distilled dataset from a large-scale dataset to facilitate efficient training. A significant challenge in DD is the dependency …
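As a concrete instance of DD, the sketch below uses a distribution-matching objective, one common DD recipe and not necessarily this paper's: learnable synthetic images are updated so their mean embedding under a frozen encoder matches that of real batches. The encoder, shapes, and hyperparameters are placeholders.

```python
import torch
import torch.nn as nn

# Frozen random encoder standing in for a real feature extractor
# (an assumption for this sketch; DD methods vary here).
embed = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 128))
for p in embed.parameters():
    p.requires_grad_(False)

# Learnable synthetic set, e.g. 10 images per class x 10 classes.
syn = nn.Parameter(torch.randn(100, 3, 32, 32))
opt = torch.optim.SGD([syn], lr=1.0)

def distill_step(real_batch):
    """One distribution-matching update: pull the synthetic images'
    mean embedding toward that of a real batch."""
    opt.zero_grad()
    loss = (embed(syn).mean(0) - embed(real_batch).mean(0)).pow(2).sum()
    loss.backward()
    opt.step()
    return loss.item()

loss = distill_step(torch.randn(256, 3, 32, 32))  # stand-in real batch
```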
L Xiao, Y He - arXiv preprint arXiv:2410.15919, 2024 - arxiv.org
In ImageNet-condensation, the storage for auxiliary soft labels exceeds that of the condensed dataset by over 30 times. However, are large-scale soft labels necessary for …
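The 30x figure is a storage ratio: per-epoch, per-crop soft-label logits accumulate far faster than the condensed images themselves. A back-of-the-envelope version of the arithmetic, with illustrative numbers rather than the paper's exact configuration:

```python
# Soft-label storage arithmetic (illustrative numbers only).
ipc, classes = 50, 1000          # condensed images per class
img_bytes = 3 * 224 * 224        # one uint8 RGB image
epochs, crops = 300, 4           # soft labels saved per epoch and crop
label_bytes = classes * 2        # one fp16 logit vector

image_storage = ipc * classes * img_bytes
label_storage = ipc * classes * epochs * crops * label_bytes
print(f"condensed images: {image_storage / 1e9:.1f} GB")
print(f"soft labels:      {label_storage / 1e9:.1f} GB "
      f"({label_storage / image_storage:.0f}x larger)")
# Denser label schedules (more epochs/crops, fp32) push the
# ratio higher still, toward the 30x the snippet reports.
```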
Knowledge distillation (KD) is a key element in neural network compression that allows knowledge transfer from a pre-trained teacher model to a more compact student model. KD …
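The teacher-to-student transfer described here is usually implemented as a temperature-scaled soft-target loss combined with hard-label cross-entropy (the Hinton-style formulation; the T and alpha values below are conventional defaults, not from this entry):

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, targets, T=4.0, alpha=0.9):
    """Standard KD objective: weighted sum of the soft-target KL term
    and cross-entropy on the hard labels."""
    # Soften both distributions with temperature T.
    soft_teacher = F.softmax(teacher_logits / T, dim=1)
    log_soft_student = F.log_softmax(student_logits / T, dim=1)
    # The T^2 factor rescales soft-target gradients back to the
    # hard-label scale.
    soft_loss = F.kl_div(log_soft_student, soft_teacher,
                         reduction="batchmean") * (T * T)
    hard_loss = F.cross_entropy(student_logits, targets)
    return alpha * soft_loss + (1 - alpha) * hard_loss
```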
This paper improves upon existing data pruning methods for image classification by introducing a novel pruning metric and pruning procedure based on importance sampling …
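Without the paper's specific metric, a generic importance-sampling pruning procedure can still illustrate the idea: sample a coreset with probability proportional to a score, then reweight survivors by inverse sampling probability so the weighted training loss remains an unbiased estimate of the full-data loss.

```python
import numpy as np

def importance_sample_coreset(scores, keep_frac=0.1, seed=0):
    """Generic importance-sampling pruning sketch (the paper's exact
    metric and procedure differ): keep examples with probability
    proportional to their score, reweighted for unbiasedness."""
    rng = np.random.default_rng(seed)
    n = len(scores)
    k = max(1, int(keep_frac * n))
    p = scores / scores.sum()
    # Sampling with replacement keeps the 1/(k * p_i) weights exactly
    # unbiased; duplicates are simply trained on more than once.
    idx = rng.choice(n, size=k, replace=True, p=p)
    weights = 1.0 / (k * p[idx])
    return idx, weights

# Example: per-example training losses as (assumed) importance scores.
scores = np.abs(np.random.randn(50_000)) + 1e-8
idx, w = importance_sample_coreset(scores, keep_frac=0.05)
```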