Graph condensation aims to reduce the size of a large-scale graph dataset by synthesizing a compact counterpart without sacrificing the performance of Graph Neural Networks …
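To make the idea concrete, below is a minimal gradient-matching sketch in the spirit of GCond-style graph condensation, not the specific method of the entry above. The toy graph, the two-layer dense-adjacency GCN, the identity adjacency for the synthetic graph, and all sizes are illustrative assumptions.

```python
# Hedged sketch: learn synthetic node features whose training gradients on a
# randomly initialized GCN match those computed on the real graph.
import torch
import torch.nn.functional as F

torch.manual_seed(0)

# Toy "real" graph: row-normalized dense adjacency, features, node labels.
n_real, n_syn, d, n_cls = 200, 20, 16, 4
A_real = torch.eye(n_real) + (torch.rand(n_real, n_real) < 0.02).float()
A_real = A_real / A_real.sum(1, keepdim=True)
X_real = torch.randn(n_real, d)
y_real = torch.randint(0, n_cls, (n_real,))

# Synthetic graph: learnable features, fixed balanced labels, and an
# identity adjacency (a common simplification; an assumption here).
X_syn = torch.randn(n_syn, d, requires_grad=True)
y_syn = torch.arange(n_syn) % n_cls
A_syn = torch.eye(n_syn)

def gcn_forward(A, X, W1, W2):
    """Two-layer GCN with dense adjacency: A relu(A X W1) W2."""
    return A @ F.relu(A @ X @ W1) @ W2

opt = torch.optim.Adam([X_syn], lr=0.01)
for step in range(200):
    # Fresh random GCN weights each step, as in gradient-matching methods.
    W1 = torch.randn(d, 32, requires_grad=True)
    W2 = torch.randn(32, n_cls, requires_grad=True)

    g_real = torch.autograd.grad(
        F.cross_entropy(gcn_forward(A_real, X_real, W1, W2), y_real),
        (W1, W2))
    g_syn = torch.autograd.grad(
        F.cross_entropy(gcn_forward(A_syn, X_syn, W1, W2), y_syn),
        (W1, W2), create_graph=True)

    # Minimize the distance between the two gradients w.r.t. shared weights.
    loss = sum(F.mse_loss(gs, gr.detach()) for gs, gr in zip(g_syn, g_real))
    opt.zero_grad(); loss.backward(); opt.step()
```

After optimization, X_syn (with y_syn and A_syn) stands in for the original graph when training downstream GNNs.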
Dataset Distillation (DD) is a promising technique to synthesize a smaller dataset that preserves essential information from the original dataset. This synthetic dataset can serve as …
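One common way to state the goal in this snippet precisely is the bilevel objective below, in our notation rather than necessarily the formulation of this particular paper: the synthetic set $\mathcal{S}$ is chosen so that a model trained to convergence on $\mathcal{S}$ minimizes the loss on the original set $\mathcal{T}$.

```latex
\mathcal{S}^{*} = \arg\min_{\mathcal{S}} \,
  \mathcal{L}_{\mathcal{T}}\bigl(\theta^{*}(\mathcal{S})\bigr)
\quad \text{subject to} \quad
\theta^{*}(\mathcal{S}) = \arg\min_{\theta} \, \mathcal{L}_{\mathcal{S}}(\theta)
```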
Training high-quality deep models necessitates vast amounts of data, resulting in overwhelming computational and memory demands. Recently, data pruning, distillation, and …
M. Li, Y. Qu, and Y. Shi. Procedia Computer Science, 2024 (Elsevier).
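As a concrete instance of the data-pruning family mentioned above, here is a short score-based pruning sketch: keep only the fraction of examples with the highest difficulty scores. The scores (stand-ins for per-example loss or forgetting counts) and the keep-ratio are illustrative assumptions.

```python
# Hedged sketch of score-based data pruning.
import torch

def prune_by_score(scores: torch.Tensor, keep_ratio: float) -> torch.Tensor:
    """Return indices of the `keep_ratio` fraction of examples with the
    highest scores (e.g., per-example loss or forgetting counts)."""
    k = max(1, int(keep_ratio * scores.numel()))
    return torch.topk(scores, k).indices

# Usage: prune a toy dataset to 10% using random stand-in scores.
scores = torch.rand(1000)            # stand-in for per-example difficulty
kept = prune_by_score(scores, 0.10)  # indices of retained examples
print(kept.shape)                    # torch.Size([100])
```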
Dataset distillation refers to the process of constructing a smaller dataset from a larger one, so that a model trained on the smaller dataset obtains results similar to the …
J. Gu, K. Wang, W. Jiang, and Y. You. Proceedings of the AAAI Conference on Artificial Intelligence, 2024 (ojs.aaai.org).
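Complementing the gradient-matching sketch earlier, the following is a minimal distribution-matching variant of dataset distillation: learn a small synthetic set whose class-wise mean embeddings, under fresh random feature maps, match those of the real data. The random-projection embedding and all sizes are assumptions for illustration, not the method of the paper above.

```python
# Hedged sketch: dataset distillation by matching class-wise embedding means.
import torch
import torch.nn.functional as F

torch.manual_seed(0)
n_real, n_syn, d, n_cls = 1000, 10, 32, 2
X_real = torch.randn(n_real, d)
y_real = torch.randint(0, n_cls, (n_real,))

X_syn = torch.randn(n_syn, d, requires_grad=True)   # learnable examples
y_syn = torch.arange(n_syn) % n_cls                 # fixed balanced labels

opt = torch.optim.Adam([X_syn], lr=0.05)
for step in range(300):
    W = torch.randn(d, 64)                  # fresh random embedding each step
    emb_real = F.relu(X_real @ W)
    emb_syn = F.relu(X_syn @ W)
    # Match per-class mean embeddings of synthetic and real data.
    loss = sum(
        F.mse_loss(emb_syn[y_syn == c].mean(0), emb_real[y_real == c].mean(0))
        for c in range(n_cls))
    opt.zero_grad(); loss.backward(); opt.step()
```

Distribution matching avoids the inner training loop of bilevel formulations, which is why it scales more easily to larger synthetic sets.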
Replay-based methods have proven effective for online continual learning by rehearsing past samples from an auxiliary memory. With many efforts devoted to improving …
D. Xu, J. Chen, Y. Lu, T. Xia, Q. Xuan, W. Wang, et al. arXiv preprint arXiv:…, 2024 (arxiv.org).
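The replay mechanism the snippet describes can be sketched as a fixed-size memory filled by reservoir sampling, from which past samples are rehearsed alongside the incoming stream. The buffer capacity and batch size below are assumptions, not values from the paper.

```python
# Hedged sketch of a rehearsal memory for online continual learning.
import random

class ReplayBuffer:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.memory = []        # stored (example, label) pairs
        self.n_seen = 0         # total examples observed so far

    def add(self, example, label):
        """Reservoir sampling: each observed example ends up in memory with
        probability capacity / n_seen, keeping the buffer unbiased."""
        self.n_seen += 1
        if len(self.memory) < self.capacity:
            self.memory.append((example, label))
        else:
            j = random.randrange(self.n_seen)
            if j < self.capacity:
                self.memory[j] = (example, label)

    def sample(self, batch_size: int):
        """Draw a rehearsal batch of past samples from memory."""
        return random.sample(self.memory, min(batch_size, len(self.memory)))

# Usage: rehearse stored samples while streaming new data.
buf = ReplayBuffer(capacity=100)
for i in range(1000):
    buf.add(example=[float(i)], label=i % 10)
replay_batch = buf.sample(32)
```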
Recently, deep learning has been successfully introduced into Automatic Modulation Recognition (AMR) tasks. However, the success of deep learning is largely attributed …