A comprehensive survey on deep graph representation learning

W Ju, Z Fang, Y Gu, Z Liu, Q Long, Z Qiao, Y Qin… - Neural Networks, 2024 - Elsevier
Graph representation learning aims to effectively encode high-dimensional sparse graph-
structured data into low-dimensional dense vectors, which is a fundamental task that has …

Transformer for graphs: An overview from architecture perspective

E Min, R Chen, Y Bian, T Xu, K Zhao, W Huang… - arXiv preprint arXiv …, 2022 - arxiv.org
Recently, the Transformer model, which has achieved great success in many artificial
intelligence fields, has demonstrated great potential in modeling graph-structured data …

Recipe for a general, powerful, scalable graph transformer

L Rampášek, M Galkin, VP Dwivedi… - Advances in …, 2022 - proceedings.neurips.cc
We propose a recipe for building a general, powerful, scalable (GPS) graph Transformer
with linear complexity and state-of-the-art results on a diverse set of benchmarks. Graph …

Structure-aware transformer for graph representation learning

D Chen, L O'Bray, K Borgwardt - … Conference on Machine …, 2022 - proceedings.mlr.press
The Transformer architecture has gained growing attention in graph representation learning
recently, as it naturally overcomes several limitations of graph neural networks (GNNs) by …

A generalization of ViT/MLP-Mixer to graphs

X He, B Hooi, T Laurent, A Perold… - International …, 2023 - proceedings.mlr.press
Graph Neural Networks (GNNs) have shown great potential in the field of graph
representation learning. Standard GNNs define a local message-passing mechanism which …

Attending to graph transformers

L Müller, M Galkin, C Morris, L Rampášek - arXiv preprint arXiv …, 2023 - arxiv.org
Recently, transformer architectures for graphs have emerged as an alternative to established
techniques for machine learning with graphs, such as (message-passing) graph neural …

NAGphormer: A tokenized graph transformer for node classification in large graphs

J Chen, K Gao, G Li, K He - arXiv preprint arXiv:2206.04910, 2022 - arxiv.org
The graph Transformer has emerged as a new architecture and has shown superior
performance on various graph mining tasks. In this work, we observe that existing graph …

A knowledge-guided pre-training framework for improving molecular representation learning

H Li, R Zhang, Y Min, D Ma, D Zhao, J Zeng - Nature Communications, 2023 - nature.com
Learning effective molecular feature representation to facilitate molecular property prediction
is of great significance for drug discovery. Recently, there has been a surge of interest in pre …

On the connection between MPNN and graph transformer

C Cai, TS Hy, R Yu, Y Wang - International Conference on …, 2023 - proceedings.mlr.press
The Graph Transformer (GT) has recently emerged as a new paradigm of graph learning
algorithms, outperforming the previously popular Message Passing Neural Network (MPNN) …

Simplifying and empowering transformers for large-graph representations

Q Wu, W Zhao, C Yang, H Zhang… - Advances in …, 2024 - proceedings.neurips.cc
Learning representations on large graphs is a long-standing challenge due to the
interdependence among massive numbers of data points. Transformers, as an emerging class of …