Dataset distillation: A comprehensive review

R Yu, S Liu, X Wang - IEEE Transactions on Pattern Analysis …, 2023 - ieeexplore.ieee.org
The recent success of deep learning is largely attributed to the sheer amount of data used for
training deep neural networks. Despite this unprecedented success, the massive data …

A comprehensive survey of dataset distillation

S Lei, D Tao - IEEE Transactions on Pattern Analysis and …, 2023 - ieeexplore.ieee.org
Deep learning has advanced at an unprecedented pace over the last decade and has
become the primary choice in many application domains. This progress is mainly attributed …

Data augmentation for deep graph learning: A survey

K Ding, Z Xu, H Tong, H Liu - ACM SIGKDD Explorations Newsletter, 2022 - dl.acm.org
Graph neural networks, a powerful deep learning tool for modeling graph-structured data, have
demonstrated remarkable performance on numerous graph learning tasks. To address the …

Does graph distillation see like vision dataset counterpart?

B Yang, K Wang, Q Sun, C Ji, X Fu… - Advances in …, 2024 - proceedings.neurips.cc
Training on large-scale graphs has achieved remarkable results in graph representation
learning, but its computational and storage costs have raised increasing concerns. Existing graph …

Graph data augmentation for graph machine learning: A survey

T Zhao, W Jin, Y Liu, Y Wang, G Liu… - arXiv preprint arXiv …, 2022 - arxiv.org
Data augmentation has recently seen increased interest in graph machine learning, given its
demonstrated ability to improve model performance and generalization by adding training …
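
To make the idea concrete, the sketch below shows one of the simplest structural augmentations such surveys cover, random edge dropping on an edge list. The function name and array layout are illustrative, not an API from either survey.

```python
# Illustrative sketch: random edge dropping on a plain (2, E) edge list.
import numpy as np

def drop_edges(edge_index: np.ndarray, drop_prob: float = 0.2,
               seed: int | None = None) -> np.ndarray:
    """Randomly remove a fraction of edges from a (2, E) edge-index array.

    Row 0 holds source nodes, row 1 holds target nodes. Returns a (2, E')
    array with roughly (1 - drop_prob) * E edges kept.
    """
    rng = np.random.default_rng(seed)
    keep = rng.random(edge_index.shape[1]) >= drop_prob  # Bernoulli mask per edge
    return edge_index[:, keep]

# Toy usage: a 4-node cycle graph.
edges = np.array([[0, 1, 2, 3],
                  [1, 2, 3, 0]])
augmented = drop_edges(edges, drop_prob=0.25, seed=0)
print(augmented)  # a perturbed "view" of the same graph for training
```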

Structure-free graph condensation: From large-scale graphs to condensed graph-free data

X Zheng, M Zhang, C Chen… - Advances in …, 2024 - proceedings.neurips.cc
Graph condensation, which reduces the size of a large-scale graph by synthesizing a small-
scale condensed graph as its substitution, has immediate benefits for various graph learning …
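
A minimal sketch of what "structure-free" condensed data can look like: learnable node features plus fixed labels, with no synthetic adjacency matrix at all. All names below are ours, and the matching objective that would actually drive the optimization is left abstract.

```python
# Illustrative parameterization only (names are ours, not the paper's):
# a "structure-free" condensed graph is just learnable node features plus
# fixed class labels -- no synthetic adjacency matrix is stored.
import torch

num_classes, per_class, feat_dim = 7, 10, 128

# Learnable synthetic node features, optimized end-to-end.
x_syn = torch.randn(num_classes * per_class, feat_dim, requires_grad=True)
# Balanced synthetic labels, kept fixed during condensation.
y_syn = torch.arange(num_classes).repeat_interleave(per_class)

# With no graph structure synthesized, the condensed data can be consumed
# by a plain MLP; a real method would replace this loss with its chosen
# matching objective against the original graph.
mlp = torch.nn.Sequential(
    torch.nn.Linear(feat_dim, 64), torch.nn.ReLU(),
    torch.nn.Linear(64, num_classes),
)
loss = torch.nn.functional.cross_entropy(mlp(x_syn), y_syn)
loss.backward()          # gradients flow into x_syn as well as the MLP
print(x_syn.grad.shape)  # torch.Size([70, 128])
```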

Data distillation: A survey

N Sachdeva, J McAuley - arXiv preprint arXiv:2301.04272, 2023 - arxiv.org
The popularity of deep learning has led to the curation of a vast number of massive and
multifarious datasets. Despite having close-to-human performance on individual tasks …

Accelerating dataset distillation via model augmentation

L Zhang, J Zhang, B Lei, S Mukherjee… - Proceedings of the …, 2023 - openaccess.thecvf.com
Dataset Distillation (DD), a newly emerging field, aims to generate much smaller yet
effective synthetic training datasets from large ones. Existing DD methods based on gradient …
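
The snippet breaks off at "gradient", i.e. gradient matching: the synthetic set is optimized so that a network's gradients on it track its gradients on real data. Below is a hedged, minimal sketch of that objective, our simplification rather than this paper's accelerated variant.

```python
# Minimal gradient-matching step for dataset distillation (our simplification).
import torch
import torch.nn.functional as F

def grad_match_loss(model, x_real, y_real, x_syn, y_syn):
    """Distance between model gradients on a real batch and the synthetic set."""
    params = [p for p in model.parameters() if p.requires_grad]
    g_real = torch.autograd.grad(
        F.cross_entropy(model(x_real), y_real), params)
    g_syn = torch.autograd.grad(
        F.cross_entropy(model(x_syn), y_syn), params, create_graph=True)
    # Sum of per-layer cosine distances; other distances (e.g. L2) also
    # appear in the literature.
    return sum(1 - F.cosine_similarity(gr.flatten(), gs.flatten(), dim=0)
               for gr, gs in zip(g_real, g_syn))

model = torch.nn.Sequential(torch.nn.Flatten(),
                            torch.nn.Linear(28 * 28, 10))
x_real, y_real = torch.randn(64, 1, 28, 28), torch.randint(0, 10, (64,))
x_syn = torch.randn(10, 1, 28, 28, requires_grad=True)  # 1 image per class
y_syn = torch.arange(10)

opt = torch.optim.SGD([x_syn], lr=0.1)
loss = grad_match_loss(model, x_real, y_real, x_syn, y_syn)
opt.zero_grad(); loss.backward(); opt.step()  # one update to the synthetic set
```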

Dataset pruning: Reducing training data by examining generalization influence

S Yang, Z Xie, H Peng, M Xu, M Sun, P Li - arXiv preprint arXiv …, 2022 - arxiv.org
The great success of deep learning relies heavily on increasingly large training datasets, which
come at the price of huge computational and infrastructure costs. This poses crucial …
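
A toy version of the score-and-prune recipe this line of work refines: rank examples by a cheap influence proxy and keep the top fraction. Per-example loss is our stand-in here, not the paper's generalization-influence criterion.

```python
# Illustrative score-and-prune loop (per-example loss is a crude proxy).
import torch
import torch.nn.functional as F

def prune_by_score(model, x, y, keep_frac=0.5):
    """Keep the keep_frac examples with the highest per-example loss,
    a cheap proxy for how much each example matters for training."""
    with torch.no_grad():
        losses = F.cross_entropy(model(x), y, reduction="none")
    k = max(1, int(keep_frac * len(x)))
    idx = torch.topk(losses, k).indices
    return x[idx], y[idx]

model = torch.nn.Linear(20, 3)
x, y = torch.randn(1000, 20), torch.randint(0, 3, (1000,))
x_small, y_small = prune_by_score(model, x, y, keep_frac=0.3)
print(x_small.shape)  # torch.Size([300, 20])
```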

Kernel ridge regression-based graph dataset distillation

Z Xu, Y Chen, M Pan, H Chen, M Das, H Yang… - Proceedings of the 29th …, 2023 - dl.acm.org
The huge volume of emerging graph datasets has become a double-edged sword for graph
machine learning. On the one hand, it empowers the success of a myriad of graph neural …
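
Kernel ridge regression makes the inner "train a model on the synthetic set" step a closed-form solve, so synthetic examples can be optimized by differentiating through it. Below is a minimal KIP-style sketch with a plain linear kernel; the paper targets graphs, and every name here is illustrative.

```python
# KRR-based distillation step (KIP-style sketch; linear kernel for brevity).
import torch

def krr_loss(x_syn, y_syn, x_real, y_real, lam=1e-3):
    """Fit KRR on the synthetic set in closed form, score it on real data.

    y_syn and y_real are one-hot label matrices.
    """
    k_ss = x_syn @ x_syn.T                      # kernel among synthetic points
    k_rs = x_real @ x_syn.T                     # kernel real-vs-synthetic
    eye = torch.eye(len(x_syn))
    alpha = torch.linalg.solve(k_ss + lam * eye, y_syn)
    preds = k_rs @ alpha                        # closed-form KRR predictions
    return ((preds - y_real) ** 2).mean()

d, n_real, n_syn, c = 64, 512, 20, 4
x_real = torch.randn(n_real, d)
y_real = torch.nn.functional.one_hot(torch.randint(0, c, (n_real,)), c).float()
x_syn = torch.randn(n_syn, d, requires_grad=True)
y_syn = torch.nn.functional.one_hot(torch.arange(n_syn) % c, c).float()

opt = torch.optim.Adam([x_syn], lr=1e-2)
for _ in range(10):                             # a few distillation steps
    opt.zero_grad()
    loss = krr_loss(x_syn, y_syn, x_real, y_real)
    loss.backward()                             # gradients reach x_syn through
    opt.step()                                  # the closed-form solve
```

The appeal of this formulation is that the closed-form solve replaces the inner training loop that gradient-based DD methods must unroll.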