Towards understanding generalization of graph neural networks

H Tang, Y Liu - International Conference on Machine …, 2023 - proceedings.mlr.press
Graph neural networks (GNNs) are widely used in machine learning for graph-structured
data. Even though GNNs have achieved remarkable success in real-world applications …

Approximately equivariant graph networks

N Huang, R Levie, S Villar - Advances in Neural …, 2024 - proceedings.neurips.cc
Graph neural networks (GNNs) are commonly described as being permutation equivariant
with respect to node relabeling in the graph. This symmetry of GNNs is often compared to …
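The permutation equivariance referred to in this entry can be illustrated with a minimal numpy sketch (not taken from the paper): relabeling the nodes, i.e. applying a permutation matrix P to both the adjacency matrix and the feature matrix, permutes the rows of the layer's output in the same way.

```python
import numpy as np

def mean_aggregation_layer(A, X, W):
    """One message-passing layer: average neighbor features, then project."""
    deg = A.sum(axis=1, keepdims=True)
    deg[deg == 0] = 1.0                      # avoid division by zero for isolated nodes
    return np.tanh(((A @ X) / deg) @ W)

rng = np.random.default_rng(0)
n, d = 5, 3
A = (rng.random((n, n)) < 0.4).astype(float)
A = np.triu(A, 1); A = A + A.T               # undirected adjacency, no self-loops
X = rng.normal(size=(n, d))
W = rng.normal(size=(d, d))
P = np.eye(n)[rng.permutation(n)]            # random permutation matrix

out = mean_aggregation_layer(A, X, W)
out_perm = mean_aggregation_layer(P @ A @ P.T, P @ X, W)
print(np.allclose(out_perm, P @ out))        # True: f(PAPᵀ, PX) = P f(A, X)
```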

WL meet VC

C Morris, F Geerts, J Tönshoff… - … Conference on Machine …, 2023 - proceedings.mlr.press
Recently, many works studied the expressive power of graph neural networks (GNNs) by
linking it to the $1$-dimensional Weisfeiler-Leman algorithm ($1\text{-}\mathsf{WL}$) …

Generalization in graph neural networks: Improved PAC-Bayesian bounds on graph diffusion

H Ju, D Li, A Sharma, HR Zhang - … Conference on Artificial …, 2023 - proceedings.mlr.press
Graph neural networks are widely used tools for graph prediction tasks. Motivated by their
empirical performance, prior works have developed generalization bounds for graph neural …

Graph convolution network based recommender systems: Learning guarantee and item mixture powered strategy

L Deng, D Lian, C Wu, E Chen - Advances in Neural …, 2022 - proceedings.neurips.cc
Inspired by their powerful representation ability on graph-structured data, Graph Convolution
Networks (GCNs) have been widely applied to recommender systems, and have shown …

What Improves the Generalization of Graph Transformers? A Theoretical Dive into the Self-attention and Positional Encoding

H Li, M Wang, T Ma, S Liu, Z Zhang… - arXiv preprint arXiv …, 2024 - arxiv.org
Graph Transformers, which incorporate self-attention and positional encoding, have recently
emerged as a powerful architecture for various graph learning tasks. Despite their …
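As a rough illustration of the two ingredients named in this entry, the sketch below (purely illustrative, not the cited architecture) runs single-head self-attention over node features to which a Laplacian-eigenvector positional encoding has been concatenated; the function and variable names are placeholders.

```python
import numpy as np

def laplacian_positional_encoding(A, k):
    d = A.sum(axis=1)
    L = np.diag(d) - A                              # unnormalized graph Laplacian
    _, eigvecs = np.linalg.eigh(L)
    return eigvecs[:, 1:k + 1]                      # skip the trivial constant eigenvector

def self_attention(X, Wq, Wk, Wv):
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[1])
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)   # row-wise softmax
    return weights @ V

rng = np.random.default_rng(0)
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.concatenate([rng.normal(size=(4, 3)),
                    laplacian_positional_encoding(A, 2)], axis=1)
Wq, Wk, Wv = (rng.normal(size=(5, 4)) for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)          # (4, 4)
```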

Empowering simple graph convolutional networks

L Pasa, N Navarin, W Erb… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
Many neural networks for graphs are based on the graph convolution (GC) operator,
proposed more than a decade ago. Since then, many alternative definitions have been …
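For readers unfamiliar with the operator, a minimal sketch of the graph convolution in its common Kipf-Welling form, H' = σ(D̂^{-1/2} Â D̂^{-1/2} H W) with Â = A + I, is given below; this is one widely used definition among the many alternatives the entry alludes to, and the code is illustrative rather than drawn from the paper.

```python
import numpy as np

def gcn_layer(A, H, W):
    A_hat = A + np.eye(A.shape[0])                 # add self-loops
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))         # symmetric degree normalization
    return np.maximum(0.0, D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W)  # ReLU

# Toy usage: a 4-node path graph with 2-dimensional node features.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
H = np.random.default_rng(1).normal(size=(4, 2))
W = np.random.default_rng(2).normal(size=(2, 2))
print(gcn_layer(A, H, W).shape)                    # (4, 2)
```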

Weisfeiler-Leman at the margin: When more expressivity matters

BJ Franks, C Morris, A Velingker, F Geerts - arXiv preprint arXiv …, 2024 - arxiv.org
The Weisfeiler-Leman algorithm ($1$-WL) is a well-studied heuristic for the graph
isomorphism problem. Recently, the algorithm has played a prominent role in understanding …
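A short, self-contained sketch of 1-WL color refinement (illustrative, not from the paper): nodes start with identical colors and are repeatedly recolored by hashing their own color together with the multiset of their neighbors' colors; if two graphs end up with different color histograms they are certainly non-isomorphic, while identical histograms are inconclusive.

```python
def wl_refinement(adj, num_rounds=None):
    """adj: dict mapping each node to a list of its neighbors."""
    n = len(adj)
    colors = {v: 0 for v in adj}                       # uniform initial coloring
    for _ in range(num_rounds or n):
        signatures = {
            v: (colors[v], tuple(sorted(colors[u] for u in adj[v])))
            for v in adj
        }
        relabel = {sig: i for i, sig in enumerate(sorted(set(signatures.values())))}
        new_colors = {v: relabel[signatures[v]] for v in adj}
        if new_colors == colors:                       # stable partition reached
            break
        colors = new_colors
    return colors

# Usage: a triangle and a 3-node path graph get different color histograms.
triangle = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
path = {0: [1], 1: [0, 2], 2: [1]}
print(sorted(wl_refinement(triangle).values()))   # [0, 0, 0]
print(sorted(wl_refinement(path).values()))       # [0, 0, 1]: the middle node stands out
```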

Information-Theoretic Generalization Bounds for Transductive Learning and its Applications

H Tang, Y Liu - arXiv preprint arXiv:2311.04561, 2023 - arxiv.org
In this paper, we develop data-dependent and algorithm-dependent generalization bounds
for transductive learning algorithms in the context of information theory for the first time. We …

Representation power of graph convolutions: Neural tangent kernel analysis

M Sabanayagam, P Esser, D Ghoshdastidar - 2022 - openreview.net
The fundamental principle of Graph Neural Networks (GNNs) is to exploit the structural
information of the data by aggregating the neighboring nodes using a 'graph convolution' …