Self-supervised Dataset Distillation: A Good Compression Is All You Need

M Zhou, Z Yin, S Shao, Z Shen - arXiv preprint arXiv:2404.07976, 2024 - arxiv.org
Dataset distillation aims to compress information from a large-scale original dataset into a new
compact dataset while striving to preserve the utmost degree of the original data …

Breaking Class Barriers: Efficient Dataset Distillation via Inter-Class Feature Compensator

X Zhang, J Du, P Liu, JT Zhou - arXiv preprint arXiv:2408.06927, 2024 - arxiv.org
Dataset distillation has emerged as a technique aiming to condense informative features
from large, natural datasets into a compact and synthetic form. While recent advancements …

BACON: Bayesian Optimal Condensation Framework for Dataset Distillation

Z Zhou, H Zhao, G Cheng, X Li, S Lyu, W Feng… - arXiv preprint arXiv …, 2024 - arxiv.org
Dataset Distillation (DD) aims to distill knowledge from extensive datasets into more
compact ones while preserving performance on the test set, thereby reducing storage costs …
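All three abstracts describe the same core setup: learn a small synthetic dataset such that a model trained on it behaves like one trained on the full data. As a minimal sketch of one classic instantiation of this idea, gradient matching (not the specific method of any paper listed above), the toy example below optimizes two synthetic points so that the gradient of a logistic-regression loss on them matches the gradient on a larger "real" dataset; the learner, data, and finite-difference optimizer are all illustrative assumptions chosen for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logreg_grad(X, y, w):
    # Gradient of the mean logistic loss with respect to the weights w.
    return X.T @ (sigmoid(X @ w) - y) / len(y)

# "Real" dataset: two 2-D Gaussian blobs, one per class.
X_real = np.vstack([rng.normal(-2, 1, (100, 2)), rng.normal(2, 1, (100, 2))])
y_real = np.array([0] * 100 + [1] * 100, dtype=float)

# Distilled set: one learnable point per class (labels kept fixed).
X_syn = rng.normal(0, 1, (2, 2))
y_syn = np.array([0.0, 1.0])

# Probe weights at which the real and synthetic gradients should agree.
W = rng.normal(0, 1, (8, 2))

def match_loss(X_s):
    # Sum of squared gradient differences over the probe weights.
    return sum(
        np.sum((logreg_grad(X_s, y_syn, w) - logreg_grad(X_real, y_real, w)) ** 2)
        for w in W
    )

loss0 = match_loss(X_syn)
eps, lr = 1e-4, 0.01
for step in range(200):
    # Finite-difference gradient of the matching loss w.r.t. the synthetic points.
    g = np.zeros_like(X_syn)
    for i in range(X_syn.shape[0]):
        for j in range(X_syn.shape[1]):
            Xp = X_syn.copy(); Xp[i, j] += eps
            Xm = X_syn.copy(); Xm[i, j] -= eps
            g[i, j] = (match_loss(Xp) - match_loss(Xm)) / (2 * eps)
    X_syn -= lr * g

loss1 = match_loss(X_syn)
print(loss1 < loss0)  # the gradient-matching loss should have shrunk
```

Real methods (including those surveyed above) replace the finite-difference loop with automatic differentiation, sample fresh network weights each iteration, and distill images rather than 2-D points, but the objective, making a tiny synthetic set mimic the training signal of the full dataset, is the same.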