The power of DNNs relies heavily on the quantity and quality of training data. However, collecting and annotating data on a large scale is often expensive and time-consuming. To …
The ultimate goal of Dataset Distillation is to synthesize a small synthetic dataset such that a model trained on this synthetic set performs as well as a model trained on the full …
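For reference, this goal is commonly written as a bilevel optimization problem; the notation below is a generic formulation, not necessarily the one used in this particular paper:

\[
\mathcal{S}^{*} = \arg\min_{\mathcal{S}} \; \mathcal{L}_{\mathcal{T}}\bigl(\theta^{*}(\mathcal{S})\bigr)
\quad \text{s.t.} \quad
\theta^{*}(\mathcal{S}) = \arg\min_{\theta} \; \mathcal{L}_{\mathcal{S}}(\theta),
\]

where \(\mathcal{T}\) is the full training set, \(\mathcal{S}\) is the synthetic set with \(|\mathcal{S}| \ll |\mathcal{T}|\), and \(\mathcal{L}_{\mathcal{T}}\), \(\mathcal{L}_{\mathcal{S}}\) denote the training losses evaluated on the respective sets.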
Y Qin, Y Yang, P Guo, G Li, H Shao, Y Shi, Z Xu… - arXiv preprint arXiv …, 2024 - arxiv.org
Instruction tuning plays a critical role in aligning large language models (LLMs) with human preferences. Despite the vast number of open instruction datasets, naively training an LLM on …
M Yin, H Wang, W Guo, Y Liu, S Zhang… - Proceedings of the 30th …, 2024 - dl.acm.org
The sequential recommender (SR) system is a crucial component of modern recommender systems, as it aims to capture the evolving preferences of users. Significant efforts have …
X Zhang, J Du, Y Li, W Xie… - Proceedings of the IEEE …, 2024 - openaccess.thecvf.com
Dataset pruning aims to construct a coreset capable of achieving performance comparable to that of the original full dataset. Most existing dataset pruning methods rely on snapshot-based …
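For intuition only, the simplest form of dataset pruning is score-based selection: rank examples by a precomputed importance score and keep the top fraction. The sketch below assumes such scores (e.g. forgetting counts or EL2N) are already available; it is not the snapshot-based approach this paper contrasts against.

```python
import numpy as np

def prune_dataset(scores: np.ndarray, keep_fraction: float) -> np.ndarray:
    """Return indices of a coreset containing the highest-scoring examples.

    scores        -- one importance score per training example
                     (assumed precomputed, e.g. forgetting counts or EL2N)
    keep_fraction -- fraction of the full dataset to retain, in (0, 1]
    """
    n_keep = max(1, int(len(scores) * keep_fraction))
    # Sort descending by score and keep the top n_keep indices.
    return np.argsort(scores)[::-1][:n_keep]

# Example: keep the 10% of examples judged most important.
rng = np.random.default_rng(0)
dummy_scores = rng.random(50_000)          # stand-in for real importance scores
coreset_idx = prune_dataset(dummy_scores, keep_fraction=0.1)
print(coreset_idx.shape)                   # (5000,)
```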
H Zhang, S Li, P Wang, D Zeng, S Ge - Proceedings of the AAAI …, 2024 - ojs.aaai.org
Training state-of-the-art (SOTA) deep models often requires extensive data, resulting in substantial training and storage costs. To address these challenges, dataset condensation …
In recent years there has been significant progress in the development of text-to-image generative models. Evaluating the quality of these generative models is an essential step in …
Graph condensation aims to reduce the size of a large-scale graph dataset by synthesizing a compact counterpart without sacrificing the performance of Graph Neural Networks …
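One common way to make this objective concrete (used by gradient-matching condensation methods such as GCond; this particular paper may formulate it differently) is to synthesize a condensed graph whose GNN training gradients track those induced by the original graph:

\[
\min_{\mathcal{G}'} \; \mathbb{E}_{\theta_0}\left[\sum_{t=0}^{T-1} D\Bigl(\nabla_{\theta}\mathcal{L}\bigl(\mathrm{GNN}_{\theta_t}(\mathcal{G}')\bigr),\; \nabla_{\theta}\mathcal{L}\bigl(\mathrm{GNN}_{\theta_t}(\mathcal{G})\bigr)\Bigr)\right],
\]

where \(\mathcal{G}\) is the original graph, \(\mathcal{G}'\) the much smaller condensed graph, \(\theta_t\) the GNN parameters along a training trajectory from initialization \(\theta_0\), and \(D\) a distance between gradients.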
Y Lu, X Chen, Y Zhang, J Gu, T Zhang, Y Zhang… - arXiv preprint arXiv …, 2023 - arxiv.org
Dataset Distillation (DD) is a prominent technique that encapsulates knowledge from a large-scale original dataset into a small synthetic dataset for efficient training. Meanwhile, Pre …