Graph neural networks for materials science and chemistry

P Reiser, M Neubert, A Eberhard, L Torresi… - Communications …, 2022 - nature.com
Machine learning plays an increasingly important role in many areas of chemistry
and materials science, being used to predict materials properties, accelerate simulations …

Everything is connected: Graph neural networks

P Veličković - Current Opinion in Structural Biology, 2023 - Elsevier
In many ways, graphs are the main modality of data we receive from nature. This is
because most of the patterns we see, both in natural and artificial systems, are elegantly …

Recipe for a general, powerful, scalable graph transformer

L Rampášek, M Galkin, VP Dwivedi… - Advances in …, 2022 - proceedings.neurips.cc
We propose a recipe for building a general, powerful, scalable (GPS) graph Transformer
with linear complexity and state-of-the-art results on a diverse set of benchmarks. Graph …
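The combination the recipe describes can be pictured as one layer that runs a local neighbourhood aggregation and a global attention module side by side. The sketch below is only an illustration of that idea under assumed names (gps_layer, W_local, W_q, W_k, W_v); it uses dense O(n^2) self-attention and a plain sum of the two branches, whereas the paper's recipe adds positional/structural encodings, residuals, normalisation, and linear-complexity attention variants.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def gps_layer(H, A, W_local, W_q, W_k, W_v):
    """One hybrid layer: local 1-hop aggregation plus global self-attention.

    H: (n, d) node features, A: (n, n) 0/1 adjacency matrix.
    Parameter names and the sum-combination are illustrative assumptions.
    """
    # Local branch: mean over 1-hop neighbours, then a linear map.
    deg = A.sum(axis=1, keepdims=True).clip(min=1)
    local = ((A @ H) / deg) @ W_local

    # Global branch: dense all-pairs self-attention. This is O(n^2);
    # the paper's recipe swaps in linear-complexity attention instead.
    Q, K, V = H @ W_q, H @ W_k, H @ W_v
    attn = softmax((Q @ K.T) / np.sqrt(K.shape[1]))
    global_branch = attn @ V

    # Combine both branches (the full recipe also adds residual connections,
    # normalisation, and an MLP on top of this combination).
    return np.tanh(local + global_branch)
```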

Long range graph benchmark

VP Dwivedi, L Rampášek, M Galkin… - Advances in …, 2022 - proceedings.neurips.cc
Graph Neural Networks (GNNs) that are based on the message passing (MP)
paradigm generally exchange information between 1-hop neighbors to build node …
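To make the 1-hop exchange concrete, here is a minimal, generic message-passing layer in NumPy: each node averages its neighbours' features and passes the result through a learned update. It is a sketch of the paradigm the benchmark targets, not any specific model from the paper; after k such layers a node has only seen its k-hop neighbourhood, which is exactly why long-range dependencies are hard for plain MP-GNNs.

```python
import numpy as np

def mp_layer(H, A, W):
    """One message-passing step: every node aggregates its 1-hop
    neighbours' features (a simple mean here) and updates its own state.
    H: (n, d) node features, A: (n, n) adjacency matrix, W: (d, d') weights.
    """
    deg = A.sum(axis=1, keepdims=True).clip(min=1)
    messages = (A @ H) / deg       # mean over 1-hop neighbours
    return np.tanh(messages @ W)   # node update

# Toy usage: random graph with 5 nodes and 8-dimensional features.
H = np.random.randn(5, 8)
A = (np.random.rand(5, 5) > 0.5).astype(float)
W = np.random.randn(8, 8)
H1 = mp_layer(H, A, W)
```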

Nodeformer: A scalable graph structure learning transformer for node classification

Q Wu, W Zhao, Z Li, DP Wipf… - Advances in Neural …, 2022 - proceedings.neurips.cc
Graph neural networks have been extensively studied for learning with interconnected data.
Despite this, recent evidence has revealed GNNs' deficiencies related to over-squashing …

Structure-aware transformer for graph representation learning

D Chen, L O'Bray, K Borgwardt - … Conference on Machine …, 2022 - proceedings.mlr.press
The Transformer architecture has attracted growing attention in graph representation learning
recently, as it naturally overcomes several limitations of graph neural networks (GNNs) by …

How attentive are graph attention networks?

S Brody, U Alon, E Yahav - arXiv preprint arXiv:2105.14491, 2021 - arxiv.org
Graph Attention Networks (GATs) are one of the most popular GNN architectures and are
considered the state-of-the-art architecture for representation learning with graphs. In …
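The paper's core observation concerns where the nonlinearity sits in the attention score. The sketch below contrasts the two per-edge scoring functions (computed before the softmax over each node's neighbours); variable names and shapes are illustrative, and multi-head attention is omitted.

```python
import numpy as np

def leaky_relu(x, slope=0.2):
    return np.where(x > 0, x, slope * x)

def gat_score(h_i, h_j, W, a):
    # Original GAT: e_ij = LeakyReLU(a^T [W h_i || W h_j]).
    # The ranking of neighbours j is the same for every query node i
    # ("static" attention).
    z = np.concatenate([W @ h_i, W @ h_j])   # W: (d_out, d_in), a: (2*d_out,)
    return leaky_relu(a @ z)

def gatv2_score(h_i, h_j, W, a):
    # GATv2: e_ij = a^T LeakyReLU(W [h_i || h_j]).
    # The nonlinearity sits between W and a, so which neighbour gets the most
    # attention can depend on the query node ("dynamic" attention).
    z = np.concatenate([h_i, h_j])            # W: (d_out, 2*d_in), a: (d_out,)
    return a @ leaky_relu(W @ z)
```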

Data augmentation for deep graph learning: A survey

K Ding, Z Xu, H Tong, H Liu - ACM SIGKDD Explorations Newsletter, 2022 - dl.acm.org
Graph neural networks, a powerful deep learning tool to model graph-structured data, have
demonstrated remarkable performance on numerous graph learning tasks. To address the …

Understanding over-squashing and bottlenecks on graphs via curvature

J Topping, F Di Giovanni, BP Chamberlain… - arXiv preprint arXiv …, 2021 - arxiv.org
Most graph neural networks (GNNs) use the message passing paradigm, in which node
features are propagated on the input graph. Recent works have pointed to the distortion of …
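The distortion at issue is usually explained by a counting argument: the number of distant nodes whose messages must be squeezed into one fixed-size vector grows exponentially with the number of hops. The toy loop below only illustrates that intuition (the constants are arbitrary); the paper's curvature-based analysis and rewiring are not reproduced here.

```python
# Receptive field of a node on a full binary tree vs. a fixed embedding size:
# everything reachable within `depth` hops must be compressed into one vector.
embedding_dim = 64  # arbitrary illustrative width
for depth in range(1, 11):
    sources = 2 ** depth  # nodes at exactly `depth` hops from the root
    print(f"depth {depth:2d}: {sources:5d} distant nodes -> one {embedding_dim}-dim vector")
```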

On over-squashing in message passing neural networks: The impact of width, depth, and topology

F Di Giovanni, L Giusti, F Barbero… - International …, 2023 - proceedings.mlr.press
Message Passing Neural Networks (MPNNs) are instances of Graph Neural
Networks that leverage the graph to send messages over the edges. This inductive bias …