Transformer for graphs: An overview from architecture perspective

E Min, R Chen, Y Bian, T Xu, K Zhao, W Huang… - arXiv preprint arXiv …, 2022 - arxiv.org
Recently, the Transformer model, which has achieved great success in many artificial intelligence fields, has demonstrated great potential in modeling graph-structured data …

Exphormer: Sparse transformers for graphs

H Shirzad, A Velingker… - International …, 2023 - proceedings.mlr.press
Graph transformers have emerged as a promising architecture for a variety of graph learning and representation tasks. Despite their successes, it remains challenging to scale …

Graph representation learning and its applications: a survey

VT Hoang, HJ Jeon, ES You, Y Yoon, S Jung, OJ Lee - Sensors, 2023 - mdpi.com
Graphs are data structures that effectively represent relational data in the real world. Graph
representation learning is a significant task since it could facilitate various downstream …

Hierarchical graph transformer with adaptive node sampling

Z Zhang, Q Liu, Q Hu, CK Lee - Advances in Neural …, 2022 - proceedings.neurips.cc
The Transformer architecture has achieved remarkable success in a number of domains
including natural language processing and computer vision. However, when it comes to …

NAGphormer: A tokenized graph transformer for node classification in large graphs

J Chen, K Gao, G Li, K He - arXiv preprint arXiv:2206.04910, 2022 - arxiv.org
The graph Transformer has emerged as a new architecture and has shown superior performance on various graph mining tasks. In this work, we observe that existing graph …

Simplifying and empowering transformers for large-graph representations

Q Wu, W Zhao, C Yang, H Zhang… - Advances in …, 2024 - proceedings.neurips.cc
Learning representations on large graphs is a long-standing challenge due to the interdependence among massive numbers of data points. Transformers, as an emerging class of …

Graph mamba: Towards learning on graphs with state space models

A Behrouz, F Hashemi - arXiv preprint arXiv:2402.08678, 2024 - arxiv.org
Graph Neural Networks (GNNs) have shown promising potential in graph representation
learning. The majority of GNNs define a local message-passing mechanism, propagating …

GOAT: A global transformer on large-scale graphs

K Kong, J Chen, J Kirchenbauer, R Ni… - International …, 2023 - proceedings.mlr.press
Graph transformers have been competitive on graph classification tasks, but they fail to
outperform Graph Neural Networks (GNNs) on node classification, which is a common task …

Gapformer: Graph Transformer with Graph Pooling for Node Classification

C Liu, Y Zhan, X Ma, L Ding, D Tao, J Wu, W Hu - IJCAI, 2023 - ijcai.org
Graph Transformers (GTs) have proven their advantage in graph-level tasks.
However, existing GTs still perform unsatisfactorily on the node classification task due to 1) …

Hinormer: Representation learning on heterogeneous information networks with graph transformer

Q Mao, Z Liu, C Liu, J Sun - Proceedings of the ACM Web Conference …, 2023 - dl.acm.org
Recent studies have highlighted the limitations of message-passing-based graph neural networks (GNNs), e.g., limited model expressiveness, over-smoothing, over-squashing, etc. …