The heterophilic graph learning handbook: Benchmarks, models, theoretical analysis, applications and challenges

S Luan, C Hua, Q Lu, L Ma, L Wu, X Wang… - arXiv preprint arXiv …, 2024 - arxiv.org
Homophily principle, i.e., nodes with the same labels or similar attributes are more likely to
be connected, has been commonly believed to be the main reason for the superiority of …
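
As a rough illustration of the principle this survey examines, the edge homophily ratio measures the fraction of edges joining same-label nodes. A minimal NumPy sketch (the helper name and the edge_index layout are assumptions of this example, not the survey's notation):

import numpy as np

def edge_homophily(edge_index: np.ndarray, labels: np.ndarray) -> float:
    # edge_index: (2, E) array of source/target node ids.
    # labels:     (N,) array of integer class labels.
    src, dst = edge_index
    return float(np.mean(labels[src] == labels[dst]))

# Toy graph: 4 nodes with labels [0, 0, 1, 1] and three edges.
edges = np.array([[0, 1, 2], [1, 2, 3]])
print(edge_homophily(edges, np.array([0, 0, 1, 1])))  # 2/3 of edges are homophilic

Values near 1 indicate a homophilic graph; values near 0, a heterophilic one.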

Graph Mamba: Towards learning on graphs with state space models

A Behrouz, F Hashemi - Proceedings of the 30th ACM SIGKDD …, 2024 - dl.acm.org
Graph Neural Networks (GNNs) have shown promising potential in graph representation
learning. The majority of GNNs define a local message-passing mechanism, propagating …
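
For context, one round of the local message passing the abstract mentions can be sketched as mean aggregation over neighbors followed by a learned transform. This is a generic GNN layer, not Graph Mamba's state-space mechanism:

import numpy as np

def message_passing_layer(A: np.ndarray, X: np.ndarray, W: np.ndarray) -> np.ndarray:
    # A: (N, N) adjacency, X: (N, F) node features, W: (F, F') weights.
    deg = A.sum(axis=1, keepdims=True).clip(min=1)  # guard isolated nodes
    messages = (A @ X) / deg                        # mean over neighbors
    return np.maximum(messages @ W, 0.0)            # linear map + ReLU

rng = np.random.default_rng(0)
A = np.array([[0, 1, 1], [1, 0, 0], [1, 0, 0]], dtype=float)
H = message_passing_layer(A, rng.normal(size=(3, 4)), rng.normal(size=(4, 8)))
print(H.shape)  # (3, 8)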

A Survey on Learning from Graphs with Heterophily: Recent Advances and Future Directions

C Gong, Y Cheng, J Yu, C Xu, C Shan, S Luo… - arXiv preprint arXiv …, 2024 - arxiv.org
Graphs are structured data that model complex relations between real-world entities.
Heterophilic graphs, where linked nodes are prone to have different labels or dissimilar …
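
A per-node variant of the homophily measure helps locate the heterophilic regions such surveys study. The sketch below (helper name assumed) scores each node by the fraction of its neighbors sharing its label:

import numpy as np

def node_homophily(A: np.ndarray, labels: np.ndarray) -> np.ndarray:
    # Values near 0 mark heterophilic nodes, near 1 homophilic ones.
    same = (labels[None, :] == labels[:, None]) & (A > 0)
    deg = A.sum(axis=1).clip(min=1)
    return same.sum(axis=1) / deg

A = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]], dtype=float)  # triangle
print(node_homophily(A, np.array([0, 1, 1])))  # [0.0, 0.5, 0.5]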

NTFormer: A composite node tokenized graph transformer for node classification

J Chen, S Jiang, K He - arXiv preprint arXiv:2406.19249, 2024 - arxiv.org
Recently, emerging graph Transformers have made significant advances in node
classification on graphs. In most graph Transformers, a crucial step involves transforming the …

NAGphormer+: A Tokenized Graph Transformer With Neighborhood Augmentation for Node Classification in Large Graphs

J Chen, C Liu, K Gao, G Li, K He - IEEE Transactions on Big …, 2024 - ieeexplore.ieee.org
Graph Transformers, emerging as a new architecture for graph representation learning,
suffer from quadratic complexity and can only handle graphs with at most thousands of …
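
NAGphormer's stated approach aggregates multi-hop neighborhoods into a fixed-length token sequence per node, so attention runs over K+1 tokens instead of all N nodes. A simplified sketch of that idea (the row normalization and helper name are assumptions of this example):

import numpy as np

def hop_tokens(A: np.ndarray, X: np.ndarray, K: int) -> np.ndarray:
    # Returns an (N, K+1, F) tensor; token k holds the k-hop aggregated
    # features, so each node's sequence length is K+1 regardless of N.
    deg = A.sum(axis=1).clip(min=1)
    A_norm = A / deg[:, None]           # row-normalized adjacency
    tokens, H = [X], X
    for _ in range(K):
        H = A_norm @ H                  # aggregate one hop further
        tokens.append(H)
    return np.stack(tokens, axis=1)

A = np.eye(5, k=1) + np.eye(5, k=-1)    # path graph on 5 nodes
X = np.random.default_rng(0).normal(size=(5, 3))
print(hop_tokens(A, X, K=2).shape)      # (5, 3, 3): 3 tokens per node

Since each node attends only within its own short token sequence, the cost no longer grows quadratically with the number of nodes.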

Exploring sparsity in graph transformers

C Liu, Y Zhan, X Ma, L Ding, D Tao, J Wu, W Hu, B Du - Neural Networks, 2024 - Elsevier
Graph Transformers (GTs) have achieved impressive results on various graph-related
tasks. However, the huge computational cost of GTs hinders their deployment and …
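
One generic way to exploit sparsity in attention, in the spirit of this line of work (not necessarily the paper's exact pruning scheme), is to keep only each query's top-k scores and mask the rest:

import numpy as np

def topk_sparse_attention(Q, K, V, k):
    # Mask attention logits below each row's k-th largest score.
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    kth = np.partition(scores, -k, axis=-1)[:, -k][:, None]
    scores = np.where(scores >= kth, scores, -np.inf)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

rng = np.random.default_rng(0)
Q, Km, V = (rng.normal(size=(6, 8)) for _ in range(3))
print(topk_sparse_attention(Q, Km, V, k=2).shape)  # (6, 8)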

Graph rewiring and preprocessing for graph neural networks based on effective resistance

X Shen, P Lio, L Yang, R Yuan… - IEEE Transactions on …, 2024 - ieeexplore.ieee.org
Graph neural networks (GNNs) are powerful models for processing graph data and have
demonstrated state-of-the-art performance on many downstream tasks. However, existing …
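
For reference, the effective resistance named in the title has a closed form via the pseudoinverse of the graph Laplacian, R_uv = (e_u - e_v)^T L^+ (e_u - e_v). A small sketch (the rewiring rule that consumes these values is left out):

import numpy as np

def effective_resistance(A: np.ndarray) -> np.ndarray:
    # Pairwise resistances from the Laplacian pseudoinverse.
    L = np.diag(A.sum(axis=1)) - A
    Lp = np.linalg.pinv(L)
    d = np.diag(Lp)
    return d[:, None] + d[None, :] - 2 * Lp

A = np.eye(4, k=1) + np.eye(4, k=-1)         # path graph on 4 nodes
print(np.round(effective_resistance(A), 2))  # resistance grows with distance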

Gradformer: Graph Transformer with Exponential Decay

C Liu, Z Yao, Y Zhan, X Ma, S Pan, W Hu - arXiv preprint arXiv:2404.15729, 2024 - arxiv.org
Graph Transformers (GTs) have demonstrated their advantages across a wide range of
tasks. However, the self-attention mechanism in GTs overlooks the graph's inductive biases …
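
Gradformer's stated idea is to reweight attention by a mask that decays exponentially with graph distance. A simplified sketch, where the value of gamma and the placement of the mask relative to the softmax are assumptions of this example:

import numpy as np
from scipy.sparse.csgraph import shortest_path

def decayed_attention(Q, K, V, A, gamma=0.5):
    # Reweight attention by gamma ** (hop distance), then renormalize.
    spd = shortest_path(A, unweighted=True)
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True)) * gamma ** spd
    return (weights / weights.sum(axis=-1, keepdims=True)) @ V

rng = np.random.default_rng(0)
A = np.eye(4, k=1) + np.eye(4, k=-1)         # path graph on 4 nodes
Q, Km, V = (rng.normal(size=(4, 8)) for _ in range(3))
print(decayed_attention(Q, Km, V, A).shape)  # (4, 8)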

GraphFM: A scalable framework for multi-graph pretraining

D Lachi, M Azabou, V Arora, E Dyer - arXiv preprint arXiv:2407.11907, 2024 - arxiv.org
Graph neural networks are typically trained on individual datasets, often requiring highly
specialized models and extensive hyperparameter tuning. This dataset-specific approach …

Leveraging contrastive learning for enhanced node representations in tokenized graph transformers

J Chen, H Liu, JE Hopcroft, K He - arXiv preprint arXiv:2406.19258, 2024 - arxiv.org
While tokenized graph Transformers have demonstrated strong performance in node
classification tasks, their reliance on a limited subset of nodes with high similarity scores for …
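
The contrastive objective alluded to can be illustrated with a generic InfoNCE loss over two views of node representations; this formulation is an assumption of the example, not necessarily the paper's exact objective:

import numpy as np

def info_nce(z1: np.ndarray, z2: np.ndarray, tau: float = 0.5) -> float:
    # Matching rows of z1/z2 are positives; all other rows are negatives.
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / tau                  # scaled cosine similarities
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return float(-np.mean(np.diag(log_prob)))

rng = np.random.default_rng(0)
z = rng.normal(size=(8, 16))
print(info_nce(z, z + 0.01 * rng.normal(size=(8, 16))))  # low loss: views align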