Despite recent success in using the invariance principle for out-of-distribution (OOD) generalization on Euclidean data (e.g., images), studies on graph data are still limited …
Designing expressive Graph Neural Networks (GNNs) is a central topic in learning graph-structured data. While numerous approaches have been proposed to improve GNNs in …
Recently, transformer architectures for graphs emerged as an alternative to established techniques for machine learning with graphs, such as (message-passing) graph neural …
Designing machine learning architectures for processing neural networks in their raw weight matrix form is a newly introduced research direction. Unfortunately, the unique symmetry …
Numerous subgraph-enhanced graph neural networks (GNNs) have emerged recently, provably boosting the expressive power of standard (message-passing) GNNs. However …
Graph Neural Networks (GNNs) are inherently limited in their expressive power. Recent seminal works (Xu et al., 2019; Morris et al., 2019b) introduced the Weisfeiler …
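The Weisfeiler-Leman (1-WL) test referenced in this snippet is iterative color refinement: each node's color is repeatedly replaced by a hash of its own color together with the multiset of its neighbors' colors, and two graphs with different stable color histograms are provably non-isomorphic. A minimal sketch (an illustration of the classical test, not code from the cited papers):

```python
from collections import Counter

def wl_colors(adj, rounds=3):
    """1-WL color refinement. `adj` maps each node to its neighbor list.
    Each round, a node's new color encodes (own color, sorted multiset of
    neighbor colors); signatures are relabeled with small integers."""
    n = len(adj)
    colors = [0] * n  # start from a uniform coloring
    for _ in range(rounds):
        signatures = [
            (colors[v], tuple(sorted(colors[u] for u in adj[v])))
            for v in range(n)
        ]
        palette = {}  # injective relabeling of signatures
        colors = [palette.setdefault(sig, len(palette)) for sig in signatures]
    return colors
```

The standard failure case motivating the "limited expressive power" claim: a 6-cycle and two disjoint triangles are both 2-regular, so 1-WL assigns them identical color histograms even though they are not isomorphic.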
An effective aggregation of node features into a graph-level representation via readout functions is an essential step in numerous learning tasks involving graph neural networks …
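The readout functions this snippet refers to are permutation-invariant aggregations that collapse a variable-size set of node embeddings into one fixed-size graph vector. A minimal sketch of the three standard choices (illustrative only; the cited work studies readouts beyond these):

```python
import numpy as np

def readout(node_feats, mode="sum"):
    """Aggregate node features of shape (n, d) into a graph-level
    vector of shape (d,). All three modes are invariant to the order
    in which nodes are listed."""
    if mode == "sum":
        return node_feats.sum(axis=0)
    if mode == "mean":
        return node_feats.mean(axis=0)
    if mode == "max":
        return node_feats.max(axis=0)
    raise ValueError(f"unknown readout mode: {mode}")
```

Sum readout preserves information about graph size, mean does not, and max is insensitive to multiplicities; the choice interacts with the expressiveness questions raised in the surrounding snippets.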
P Veličković - arXiv preprint arXiv:2202.11097, 2022 - arxiv.org
The message passing framework is the foundation of the immense success enjoyed by graph neural networks (GNNs) in recent years. In spite of its elegance, there exist many …
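The message passing framework named here updates each node by combining its own representation with an aggregate of its neighbors' representations. A minimal one-layer sketch in NumPy (the weight names `W_self` and `W_nbr` and the sum aggregator are illustrative assumptions, not the paper's formulation):

```python
import numpy as np

def message_passing_layer(adj, h, W_self, W_nbr):
    """One round of message passing on a graph with adjacency matrix
    `adj` (n, n) and node features `h` (n, d):
        h_v' = relu(W_self h_v + W_nbr * sum_{u in N(v)} h_u)
    """
    agg = adj @ h                       # sum of each node's neighbor features
    out = h @ W_self.T + agg @ W_nbr.T  # combine self and neighborhood terms
    return np.maximum(out, 0.0)         # ReLU nonlinearity
```

Because the aggregation is a sum over neighbors, the layer is permutation-equivariant: relabeling the nodes permutes the output rows in the same way, which is the structural property the framework is built around.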
Filters are fundamental in extracting information from data. For time series and image data that reside on Euclidean domains, filters are the crux of many signal processing and …
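The graph analogue of the convolutional filters this snippet alludes to is a polynomial in a graph shift operator S (e.g., the adjacency matrix or Laplacian): y = Σ_k h_k S^k x. A minimal sketch under that standard definition (an illustration, not the cited work's construction):

```python
import numpy as np

def graph_filter(S, x, coeffs):
    """Polynomial graph filter y = sum_k h_k S^k x, where S is a graph
    shift operator and `coeffs` holds the filter taps h_0, h_1, ..."""
    y = np.zeros_like(x, dtype=float)
    Skx = x.astype(float)   # running power: S^k x, starting at k = 0
    for h_k in coeffs:
        y += h_k * Skx
        Skx = S @ Skx       # advance to the next power of S
    return y
```

Each tap h_k mixes information from nodes exactly k hops away, which is why such filters generalize the local, shift-invariant filtering familiar from time series and images.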