Distributed graph neural network training: A survey

Y Shao, H Li, X Gu, H Yin, Y Li, X Miao… - ACM Computing …, 2024 - dl.acm.org
Graph neural networks (GNNs) are deep learning models that are trained on
graphs and have been successfully applied in various domains. Despite the effectiveness of …

BNS-GCN: Efficient full-graph training of graph convolutional networks with partition-parallelism and random boundary node sampling

C Wan, Y Li, A Li, NS Kim, Y Lin - Proceedings of Machine …, 2022 - proceedings.mlsys.org
Graph Convolutional Networks (GCNs) have emerged as the state-of-the-art
method for graph-based learning tasks. However, training GCNs at scale is still challenging …

EXACT: Scalable graph neural networks training via extreme activation compression

Z Liu, K Zhou, F Yang, L Li, R Chen… - … Conference on Learning …, 2021 - openreview.net
Training Graph Neural Networks (GNNs) on large graphs is a fundamental challenge due to
the high memory usage, which is mainly occupied by activations (e.g., node embeddings) …

Parallel and distributed graph neural networks: An in-depth concurrency analysis

M Besta, T Hoefler - IEEE Transactions on Pattern Analysis and …, 2024 - ieeexplore.ieee.org
Graph neural networks (GNNs) are among the most powerful tools in deep learning. They
routinely solve complex problems on unstructured networks, such as node classification …

LazyGNN: Large-scale graph neural networks via lazy propagation

R Xue, H Han, MA Torkamani… - … on Machine Learning, 2023 - proceedings.mlr.press
Recent works have demonstrated the benefits of capturing long-distance dependency in
graphs by deeper graph neural networks (GNNs). But deeper GNNs suffer from the long …

GraphFM: Improving large-scale GNN training via feature momentum

H Yu, L Wang, B Wang, M Liu… - … on Machine Learning, 2022 - proceedings.mlr.press
Training of graph neural networks (GNNs) for large-scale node classification is challenging.
A key difficulty lies in obtaining accurate hidden node representations while avoiding the …

RSC: Accelerate graph neural networks training via randomized sparse computations

Z Liu, S Chen, K Zhou, D Zha… - International …, 2023 - proceedings.mlr.press
Training graph neural networks (GNNs) is extremely time-consuming because sparse graph-
based operations are hard to accelerate on commodity hardware. Prior art successfully …

A Comprehensive Survey of Dynamic Graph Neural Networks: Models, Frameworks, Benchmarks, Experiments and Challenges

ZZ Feng, R Wang, TX Wang, M Song, S Wu… - arXiv preprint arXiv …, 2024 - arxiv.org
Dynamic Graph Neural Networks (GNNs) combine temporal information with GNNs to
capture structural, temporal, and contextual relationships in dynamic graphs simultaneously …

Scalable and efficient full-graph GNN training for large graphs

X Wan, K Xu, X Liao, Y Jin, K Chen, X Jin - Proceedings of the ACM on …, 2023 - dl.acm.org
Graph Neural Networks (GNNs) have emerged as powerful tools to capture structural
information from graph-structured data, achieving state-of-the-art performance on …

Adaptive message quantization and parallelization for distributed full-graph GNN training

B Wan, J Zhao, C Wu - Proceedings of Machine Learning …, 2023 - proceedings.mlsys.org
Distributed full-graph training of Graph Neural Networks (GNNs) over large graphs is
bandwidth-demanding and time-consuming. Frequent exchanges of node features …