Transformer for graphs: An overview from architecture perspective

E Min, R Chen, Y Bian, T Xu, K Zhao, W Huang… - arXiv preprint arXiv …, 2022 - arxiv.org
Recently, the Transformer model, which has achieved great success in many artificial
intelligence fields, has demonstrated great potential in modeling graph-structured data …
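
To illustrate the core idea only (this sketch is not any specific architecture from the survey), applying standard multi-head self-attention to node features while masking attention with the adjacency matrix might look roughly as follows in PyTorch; all class and variable names here are illustrative.

import torch
import torch.nn as nn

class GraphSelfAttention(nn.Module):
    """Self-attention over node features; pairs without an edge are masked out."""
    def __init__(self, dim, heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, x, adj):
        # x: (num_nodes, dim) node features; adj: (num_nodes, num_nodes) adjacency
        mask = adj == 0                 # True = this pair may not attend
        mask.fill_diagonal_(False)      # every node may attend to itself
        out, _ = self.attn(x.unsqueeze(0), x.unsqueeze(0), x.unsqueeze(0),
                           attn_mask=mask)
        return out.squeeze(0)

x = torch.randn(6, 16)                      # 6 nodes, 16-dim features
adj = (torch.rand(6, 6) > 0.5).float()
h = GraphSelfAttention(dim=16, heads=4)(x, adj)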

Green learning: Introduction, examples and outlook

CCJ Kuo, AM Madni - Journal of Visual Communication and Image …, 2023 - Elsevier
Rapid advances in artificial intelligence (AI) in the last decade have been largely built upon
the wide applications of deep learning (DL). However, the high carbon footprint yielded by …

Data augmentation for deep graph learning: A survey

K Ding, Z Xu, H Tong, H Liu - ACM SIGKDD Explorations Newsletter, 2022 - dl.acm.org
Graph neural networks, a powerful deep learning tool to model graph-structured data, have
demonstrated remarkable performance on numerous graph learning tasks. To address the …
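
As a toy example of one augmentation family that such surveys cover, random edge dropping can be sketched as below (a generic illustration, not the survey's own taxonomy or code); the edge list follows the common 2 x num_edges convention.

import torch

def drop_edges(edge_index, drop_prob=0.2):
    """Create an augmented view of a graph by randomly removing edges."""
    # edge_index: (2, num_edges) tensor of (source, target) node index pairs
    keep = torch.rand(edge_index.size(1)) >= drop_prob
    return edge_index[:, keep]

edge_index = torch.tensor([[0, 1, 2, 3], [1, 2, 3, 0]])  # a 4-node cycle
augmented_view = drop_edges(edge_index, drop_prob=0.5)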

Nested graph neural networks

M Zhang, P Li - Advances in Neural Information Processing …, 2021 - proceedings.neurips.cc
The success of graph neural networks (GNNs) in graph classification is closely related to the
Weisfeiler-Lehman (1-WL) algorithm. By iteratively aggregating neighboring node features …
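
For context, the neighborhood-aggregation step the snippet refers to can be sketched as a single dense message-passing layer (a generic illustration, not the paper's nested-subgraph construction).

import torch
import torch.nn as nn

class MeanAggregationLayer(nn.Module):
    """One message-passing step: average neighbor features, then transform."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(2 * in_dim, out_dim)

    def forward(self, x, adj):
        # x: (num_nodes, in_dim) features; adj: (num_nodes, num_nodes) 0/1 adjacency
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
        neighbor_mean = (adj @ x) / deg
        return torch.relu(self.linear(torch.cat([x, neighbor_mean], dim=1)))

x = torch.randn(5, 8)
adj = (torch.rand(5, 5) > 0.5).float()
h = MeanAggregationLayer(8, 16)(x, adj)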

Weisfeiler and Leman go machine learning: The story so far

C Morris, Y Lipman, H Maron, B Rieck… - The Journal of Machine …, 2023 - dl.acm.org
In recent years, algorithms and neural architectures based on the Weisfeiler-Leman
algorithm, a well-known heuristic for the graph isomorphism problem, have emerged as a …
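
For readers unfamiliar with the heuristic, one round of 1-WL color refinement can be sketched in plain Python as follows (illustrative only; the survey itself covers far more general k-WL variants).

def wl_refine(colors, neighbors):
    """One 1-WL round: a node's new color is a hash of its own color
    together with the sorted multiset of its neighbors' colors."""
    # colors: {node: int}; neighbors: {node: list of adjacent nodes}
    new_colors = {}
    for v in colors:
        signature = (colors[v], tuple(sorted(colors[u] for u in neighbors[v])))
        new_colors[v] = hash(signature)
    return new_colors

# Two refinement rounds on the path graph 0 - 1 - 2
neighbors = {0: [1], 1: [0, 2], 2: [1]}
colors = {v: 0 for v in neighbors}
for _ in range(2):
    colors = wl_refine(colors, neighbors)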

Local augmentation for graph neural networks

S Liu, R Ying, H Dong, L Li, T Xu… - International …, 2022 - proceedings.mlr.press
Graph Neural Networks (GNNs) have achieved remarkable performance on graph-
based tasks. The key idea for GNNs is to obtain informative representation through …

GNNAutoScale: Scalable and expressive graph neural networks via historical embeddings

M Fey, JE Lenssen, F Weichert… - … on machine learning, 2021 - proceedings.mlr.press
We present GNNAutoScale (GAS), a framework for scaling arbitrary message-passing GNNs
to large graphs. GAS prunes entire sub-trees of the computation graph by utilizing historical …
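
The idea of historical embeddings can be sketched as a simple cache that serves stale, previously computed embeddings for out-of-batch neighbors instead of recomputing them (a simplified illustration, not the GAS implementation).

import torch

class HistoricalEmbeddingCache:
    """Stores the most recent embedding computed for every node."""
    def __init__(self, num_nodes, dim):
        self.store = torch.zeros(num_nodes, dim)

    def pull(self, node_ids):
        # Serve stale embeddings for neighbors outside the current mini-batch,
        # so the computation graph never grows beyond the batch itself.
        return self.store[node_ids]

    def push(self, node_ids, embeddings):
        # Refresh the cache with freshly computed in-batch embeddings.
        self.store[node_ids] = embeddings.detach()

cache = HistoricalEmbeddingCache(num_nodes=1000, dim=64)
batch_nodes = torch.tensor([0, 1, 2])
out_of_batch_neighbors = torch.tensor([10, 42])
stale_neighbor_h = cache.pull(out_of_batch_neighbors)  # treated as fixed inputs
fresh_h = torch.randn(3, 64)                           # stand-in for a GNN layer's output
cache.push(batch_nodes, fresh_h)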

GraphMAE2: A decoding-enhanced masked self-supervised graph learner

Z Hou, Y He, Y Cen, X Liu, Y Dong… - Proceedings of the …, 2023 - dl.acm.org
Graph self-supervised learning (SSL), including contrastive and generative approaches,
offers great potential to address the fundamental challenge of label scarcity in real-world …
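
A toy sketch of the generative (masked-reconstruction) flavor of graph SSL: hide the features of some nodes, encode, and train a decoder to reconstruct the hidden inputs. The plain linear encoder/decoder and names below are illustrative stand-ins, not the paper's GNN modules.

import torch
import torch.nn as nn

encoder = nn.Linear(32, 64)
decoder = nn.Linear(64, 32)
mask_token = nn.Parameter(torch.zeros(32))           # learnable placeholder feature
params = [*encoder.parameters(), *decoder.parameters(), mask_token]
optimizer = torch.optim.Adam(params, lr=1e-3)

x = torch.randn(100, 32)                             # node features
mask = torch.rand(100) < 0.5                         # nodes whose features are hidden
x_masked = x.clone()
x_masked[mask] = mask_token                          # replace hidden features with the token

reconstruction = decoder(encoder(x_masked))
loss = ((reconstruction[mask] - x[mask]) ** 2).mean()  # score only the masked nodes
loss.backward()
optimizer.step()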

GNNLab: A factored system for sample-based GNN training over GPUs

J Yang, D Tang, X Song, L Wang, Q Yin… - Proceedings of the …, 2022 - dl.acm.org
We propose GNNLab, a sample-based GNN training system in a single machine multi-GPU
setup. GNNLab adopts a factored design for multiple GPUs, where each GPU is dedicated to …

Graph pooling for graph neural networks: Progress, challenges, and opportunities

C Liu, Y Zhan, J Wu, C Li, B Du, W Hu, T Liu… - arXiv preprint arXiv …, 2022 - arxiv.org
Graph neural networks have emerged as a leading architecture for many graph-level tasks,
such as graph classification and graph generation. As an essential component of the …
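
As a point of reference for what more sophisticated pooling replaces, the simplest readout (mean over each graph's nodes) can be sketched as follows; the batching convention and names are illustrative, not taken from the survey.

import torch

def mean_readout(node_embeddings, graph_ids, num_graphs):
    """Pool node embeddings into one vector per graph by averaging."""
    # node_embeddings: (num_nodes, dim); graph_ids: (num_nodes,) graph index per node
    dim = node_embeddings.size(1)
    sums = torch.zeros(num_graphs, dim).index_add_(0, graph_ids, node_embeddings)
    counts = torch.zeros(num_graphs).index_add_(0, graph_ids, torch.ones(len(graph_ids)))
    return sums / counts.clamp(min=1).unsqueeze(1)

# Two graphs in one batch: nodes 0-2 belong to graph 0, nodes 3-4 to graph 1
embeddings = torch.randn(5, 8)
pooled = mean_readout(embeddings, torch.tensor([0, 0, 0, 1, 1]), num_graphs=2)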