Unleashing the power of graph data augmentation on covariate distribution shift

Y Sui, Q Wu, J Wu, Q Cui, L Li, J Zhou… - Advances in Neural …, 2024 - proceedings.neurips.cc
The issue of distribution shifts is emerging as a critical concern in graph representation
learning. From the perspective of invariant learning and stable learning, a recently well …

Does invariant graph learning via environment augmentation learn invariance?

Y Chen, Y Bian, K Zhou, B Xie… - Advances in Neural …, 2024 - proceedings.neurips.cc
Invariant graph representation learning aims to learn the invariance among data from
different environments for out-of-distribution generalization on graphs. As the graph …

Learning invariant graph representations for out-of-distribution generalization

H Li, Z Zhang, X Wang, W Zhu - Advances in Neural …, 2022 - proceedings.neurips.cc
Graph representation learning has shown effectiveness when testing and training graph
data come from the same distribution, but most existing approaches fail to generalize under …

Learning causally invariant representations for out-of-distribution generalization on graphs

Y Chen, Y Zhang, Y Bian, H Yang… - Advances in …, 2022 - proceedings.neurips.cc
Despite recent success in using the invariance principle for out-of-distribution (OOD)
generalization on Euclidean data (e.g., images), studies on graph data are still limited …

Handling distribution shifts on graphs: An invariance perspective

Q Wu, H Zhang, J Yan, D Wipf - arXiv preprint arXiv:2202.02466, 2022 - arxiv.org
There is increasing evidence of neural networks' sensitivity to distribution shifts, so research
on out-of-distribution (OOD) generalization has come into the spotlight …

Size-invariant graph representations for graph classification extrapolations

B Bevilacqua, Y Zhou, B Ribeiro - … Conference on Machine …, 2021 - proceedings.mlr.press
In general, graph representation learning methods assume that the train and test data come
from the same distribution. In this work we consider an underexplored area of an otherwise …

FLOOD: A flexible invariant learning framework for out-of-distribution generalization on graphs

Y Liu, X Ao, F Feng, Y Ma, K Li, TS Chua… - Proceedings of the 29th …, 2023 - dl.acm.org
Graph Neural Networks (GNNs) have achieved remarkable success in various domains, but
most of them are developed under the in-distribution assumption. Under out-of-distribution …

Augmentation-free graph contrastive learning of invariant-discriminative representations

H Li, J Cao, J Zhu, Q Luo, S He… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
Graph contrastive learning (GCL) is a promising direction toward alleviating the label
dependence, poor generalization, and weak robustness of graph neural networks, learning …

Environment-aware dynamic graph learning for out-of-distribution generalization

H Yuan, Q Sun, X Fu, Z Zhang, C Ji… - Advances in Neural …, 2024 - proceedings.neurips.cc
Dynamic graph neural networks (DGNNs) are increasingly pervasive in exploiting spatio-
temporal patterns on dynamic graphs. However, existing works fail to generalize under …

GraphDE: A generative framework for debiased learning and out-of-distribution detection on graphs

Z Li, Q Wu, F Nie, J Yan - Advances in Neural Information …, 2022 - proceedings.neurips.cc
Despite the remarkable success of graph neural networks (GNNs) for graph representation
learning, they are generally built on the (unreliable) i.i.d. assumption across training and …