EXACT: Scalable graph neural networks training via extreme activation compression

Z Liu, K Zhou, F Yang, L Li, R Chen… - … Conference on Learning …, 2021 - openreview.net
Training Graph Neural Networks (GNNs) on large graphs is a fundamental challenge due to
the high memory usage, which is mainly occupied by activations (e.g., node embeddings) …

What makes graph neural networks miscalibrated?

HHH Hsu, Y Shen, C Tomani… - Advances in Neural …, 2022 - proceedings.neurips.cc
Given the importance of getting calibrated predictions and reliable uncertainty estimations,
various post-hoc calibration methods have been developed for neural networks on standard …

Submix: Learning to mix graph sampling heuristics

S Abu-El-Haija, JV Dillon, B Fatemi… - Uncertainty in …, 2023 - proceedings.mlr.press
Sampling subgraphs for training Graph Neural Networks (GNNs) is receiving much attention
from the GNN community. While a variety of methods have been proposed, each method …

Improving graph neural networks with learnable propagation operators

M Eliasof, L Ruthotto, E Treister - … Conference on Machine …, 2023 - proceedings.mlr.press
Graph Neural Networks (GNNs) are limited in their propagation operators. In many
cases, these operators contain non-negative elements only and are shared across …

Tackling over-smoothing for general graph convolutional networks

W Huang, Y Rong, T Xu, F Sun, J Huang - arXiv preprint arXiv:2008.09864, 2020 - arxiv.org
Increasing the depth of GCN, which is expected to permit more expressivity, is shown to
incur performance degradation, especially on node classification. The main cause of this lies in …

Learning neural network subspaces

M Wortsman, MC Horton, C Guestrin… - International …, 2021 - proceedings.mlr.press
Recent observations have advanced our understanding of the neural network optimization
landscape, revealing the existence of (1) paths of high accuracy containing diverse solutions …

Grand: Graph neural diffusion

B Chamberlain, J Rowbottom… - International …, 2021 - proceedings.mlr.press
We present Graph Neural Diffusion (GRAND) that approaches deep learning on
graphs as a continuous diffusion process and treats Graph Neural Networks (GNNs) as …

Certified edge unlearning for graph neural networks

K Wu, J Shen, Y Ning, T Wang, WH Wang - Proceedings of the 29th ACM …, 2023 - dl.acm.org
The emergence of evolving data privacy policies and regulations has sparked a growing
interest in the concept of "machine unlearning", which involves enabling machine learning …

Generalization guarantees for neural networks via harnessing the low-rank structure of the Jacobian

S Oymak, Z Fabian, M Li, M Soltanolkotabi - arXiv preprint arXiv …, 2019 - arxiv.org
Modern neural network architectures often generalize well despite containing many more
parameters than the size of the training dataset. This paper explores the generalization …

Analyzing the expressive power of graph neural networks in a spectral perspective

M Balcilar, G Renton, P Héroux, B Gaüzère… - International …, 2021 - openreview.net
In the recent literature of Graph Neural Networks (GNN), the expressive power of models
has been studied through their capability to distinguish if two given graphs are isomorphic or …