G Zhao, G Li, Y Qin, Y Yu - … of the IEEE/CVF Conference on …, 2023 - openaccess.thecvf.com
Dataset Condensation aims to condense a large dataset into a smaller one while maintaining its ability to train a well-performing model, thus reducing the storage cost and …
As deep learning models and datasets rapidly scale up, model training is extremely time-consuming and resource-costly. Instead of training on the entire dataset, learning with a …
Dataset Condensation is a newly emerging technique aiming at learning a tiny dataset that captures the rich information encoded in the original dataset. As the size of datasets …
S Liu, J Ye, R Yu, X Wang - … of the IEEE/CVF Conference on …, 2023 - openaccess.thecvf.com
Dataset distillation, also known as dataset condensation, aims to compress a large dataset into a compact synthetic one. Existing methods perform dataset condensation by assuming a …
B Zhao, H Bilen - Proceedings of the IEEE/CVF Winter …, 2023 - openaccess.thecvf.com
Computational cost of training state-of-the-art deep models in many learning problems is rapidly increasing due to more sophisticated models and larger datasets. A recent promising …
As the state-of-the-art machine learning methods in many fields rely on larger datasets, storing datasets and training models on them become significantly more expensive. This …
Z Yin, E Xing, Z Shen - Advances in Neural Information …, 2024 - proceedings.neurips.cc
We present a new dataset condensation framework termed Squeeze, Recover and Relabel (SRe$^2$L) that decouples the bilevel optimization of model and synthetic data during …
The great success of machine learning with massive amounts of data comes at the price of huge computation and storage costs for training and tuning. Recent studies on dataset …
Graph condensation, which reduces the size of a large-scale graph by synthesizing a small-scale condensed graph as its substitution, has immediate benefits for various graph learning …
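Several of these snippets refer to the bilevel optimization underlying dataset condensation (e.g., the objective that SRe$^2$L decouples). A common formulation can be sketched as follows, where $\mathcal{T}$ is the original dataset, $\mathcal{S}$ the synthetic set, and $\theta$ the model parameters; the notation is illustrative and not drawn from any one of the works above:

\[
\mathcal{S}^{*} = \arg\min_{\mathcal{S}} \; \mathcal{L}_{\mathcal{T}}\bigl(\theta^{*}(\mathcal{S})\bigr)
\quad \text{s.t.} \quad
\theta^{*}(\mathcal{S}) = \arg\min_{\theta} \; \mathcal{L}_{\mathcal{S}}(\theta)
\]

The inner problem trains a model on the synthetic data; the outer problem updates $\mathcal{S}$ so that the resulting model minimizes the loss on the original data. The cost of nesting full inner training inside each outer step is what motivates the decoupling and matching-based approximations these papers propose.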