Recently, transformer architectures for graphs have emerged as an alternative to established techniques for machine learning with graphs, such as (message-passing) graph neural …
C Zhou, X Wang, M Zhang - Advances in Neural …, 2024 - proceedings.neurips.cc
Node-level random walks have been widely used to improve Graph Neural Networks. However, limited attention has been paid to random walks on edges and, more generally, on $k …
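A node-level random walk of the kind this snippet refers to can be sketched in a few lines. This is an illustrative sketch, not the paper's method: the adjacency-list representation, the function name `random_walk`, and the toy graph are all assumptions introduced here.

```python
import random

def random_walk(adj, start, length, seed=None):
    """Sample a node-level random walk of up to `length` steps from `start`.

    `adj` maps each node to a list of its neighbors (illustrative representation).
    """
    rng = random.Random(seed)
    walk = [start]
    for _ in range(length):
        neighbors = adj[walk[-1]]
        if not neighbors:  # dead end: stop the walk early
            break
        walk.append(rng.choice(neighbors))
    return walk

# Toy 4-cycle graph: 0 - 1 - 2 - 3 - 0
adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
walk = random_walk(adj, start=0, length=5, seed=0)
```

Walks sampled this way are typically fed to a skip-gram-style objective or used to define node contexts; edge-level or higher-order walks generalize the state from a node to an edge or a small subgraph.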
Dynamic graph neural networks (DyGNNs) currently struggle with handling distribution shifts that are inherent in dynamic graphs. Existing work on DyGNNs with out-of-distribution …
Graph Transformers (GTs) have proved their advantage in graph-level tasks. However, existing GTs still perform unsatisfactorily on the node classification task due to 1) …
D Bo, X Wang, Y Liu, Y Fang, Y Li, C Shi - arXiv preprint arXiv:2302.05631, 2023 - arxiv.org
Graph neural networks (GNNs) have attracted considerable attention from the research community. It is well established that GNNs are broadly divided into spatial and …
Deep learning's performance has recently received wide recognition. Graph neural networks (GNNs) are designed to handle graph-structured data that classical deep …
K Lu, Y Yu, H Fei, X Li, Z Yang, Z Guo… - Proceedings of the …, 2024 - ojs.aaai.org
In recent years, spectral graph neural networks, characterized by polynomial filters, have garnered increasing attention and have achieved remarkable performance in tasks such as …
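The polynomial filters this snippet mentions can be sketched as follows: the filter is a polynomial in a normalized graph operator applied to the node features. This sketch uses a monomial basis and a GCN-style normalized adjacency purely for brevity; the function name and basis choice are assumptions, and practical spectral GNNs typically use Chebyshev, Bernstein, or other bases.

```python
import numpy as np

def poly_filter(A, X, coeffs):
    """Apply the polynomial filter sum_k coeffs[k] * S^k to features X,
    where S is the symmetrically normalized adjacency with self-loops.

    A: (n, n) dense adjacency matrix; X: (n, d) node features.
    """
    # GCN-style normalization: S = D^{-1/2} (A + I) D^{-1/2}
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    S = D_inv_sqrt @ A_hat @ D_inv_sqrt

    out = np.zeros_like(X, dtype=float)
    Z = X.astype(float)          # Z holds S^k @ X, starting at k = 0
    for c in coeffs:
        out += c * Z
        Z = S @ Z                # advance to the next power of S
    return out
```

In a spectral GNN the coefficients `coeffs` are the learnable parameters: they shape the filter's frequency response on the graph spectrum, and the degree of the polynomial bounds the receptive field in hops.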
Y Liu, D Bo, C Shi - arXiv preprint arXiv:2310.09202, 2023 - arxiv.org
The growing volume of graph data places demands on the efficiency and scalability of graph neural networks (GNNs), despite their effectiveness in various graph-related …