A comprehensive survey on deep graph representation learning

W Ju, Z Fang, Y Gu, Z Liu, Q Long, Z Qiao, Y Qin… - Neural Networks, 2024 - Elsevier
Graph representation learning aims to effectively encode high-dimensional sparse graph-
structured data into low-dimensional dense vectors, which is a fundamental task that has …

A survey on graph kernels

NM Kriege, FD Johansson, C Morris - Applied Network Science, 2020 - Springer
Graph kernels have become an established and widely-used technique for solving
classification tasks on graphs. This survey gives a comprehensive overview of techniques …

Graph neural networks: foundation, frontiers and applications

L Wu, P Cui, J Pei, L Zhao, X Guo - … of the 28th ACM SIGKDD Conference …, 2022 - dl.acm.org
The field of graph neural networks (GNNs) has seen rapid and incredible strides over the
recent years. Graph neural networks, also known as deep learning on graphs, graph …

Identity-aware graph neural networks

J You, JM Gomes-Selman, R Ying… - Proceedings of the AAAI …, 2021 - ojs.aaai.org
Message passing Graph Neural Networks (GNNs) provide a powerful modeling
framework for relational data. However, the expressive power of existing GNNs is upper …

Substructure aware graph neural networks

D Zeng, W Liu, W Chen, L Zhou, M Zhang… - Proceedings of the AAAI …, 2023 - ojs.aaai.org
Despite the great achievements of Graph Neural Networks (GNNs) in graph learning,
conventional GNNs struggle to break through the upper limit of the expressiveness of first …

A new perspective on "How graph neural networks go beyond Weisfeiler-Lehman?"

A Wijesinghe, Q Wang - International Conference on Learning …, 2022 - openreview.net
We propose a new perspective on designing powerful Graph Neural Networks (GNNs). In a
nutshell, this enables a general solution to inject structural properties of graphs into a …

Graph neural tangent kernel: Fusing graph neural networks with graph kernels

SS Du, K Hou, RR Salakhutdinov… - Advances in neural …, 2019 - proceedings.neurips.cc
While graph kernels (GKs) are easy to train and enjoy provable theoretical guarantees, their
practical performances are limited by their expressive power, as the kernel function often …

Facilitating graph neural networks with random walk on simplicial complexes

C Zhou, X Wang, M Zhang - Advances in Neural …, 2024 - proceedings.neurips.cc
Node-level random walk has been widely used to improve Graph Neural Networks.
However, there is limited attention to random walk on edge and, more generally, on $ k …
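To make the notion of a node-level random walk concrete, here is a minimal generic sketch (an assumption for illustration, not the paper's simplicial-complex walk): starting from a node, repeatedly hop to a uniformly random neighbor.

```python
import random

# Toy undirected graph as an adjacency list (hypothetical example data).
adj = {0: [1, 2], 1: [0], 2: [0, 3], 3: [2]}

def random_walk(adj, start, length, seed=0):
    """Return a node-level random walk of the given length from `start`."""
    rng = random.Random(seed)
    walk = [start]
    for _ in range(length):
        # Hop to a uniformly random neighbor of the current node.
        walk.append(rng.choice(adj[walk[-1]]))
    return walk

walk = random_walk(adj, start=0, length=5)
# Every consecutive pair in the walk is an edge of the graph.
assert all(b in adj[a] for a, b in zip(walk, walk[1:]))
```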

Sign and basis invariant networks for spectral graph representation learning

D Lim, J Robinson, L Zhao, T Smidt, S Sra… - arXiv preprint arXiv …, 2022 - arxiv.org
We introduce SignNet and BasisNet, new neural architectures that are invariant to two key
symmetries displayed by eigenvectors: (i) sign flips, since if $v$ is an eigenvector then so is …
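The sign ambiguity the snippet refers to is easy to verify numerically: for a symmetric matrix such as a graph Laplacian, if $v$ is an eigenvector then $-v$ is an equally valid eigenvector for the same eigenvalue, so a model consuming raw eigenvectors sees two different inputs for the same graph. A small sketch (hypothetical 3-node path graph):

```python
import numpy as np

# Adjacency matrix of a 3-node path graph (example data).
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A  # combinatorial graph Laplacian

vals, vecs = np.linalg.eigh(L)  # eigendecomposition of a symmetric matrix
v = vecs[:, 1]                  # pick any eigenvector

# Both v and -v satisfy L v = lambda v for the same eigenvalue,
# so the eigenvector is only defined up to a sign flip.
assert np.allclose(L @ v, vals[1] * v)
assert np.allclose(L @ (-v), vals[1] * (-v))
```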

Random walk graph neural networks

G Nikolentzos, M Vazirgiannis - Advances in Neural …, 2020 - proceedings.neurips.cc
In recent years, graph neural networks (GNNs) have become the de facto tool for performing
machine learning tasks on graphs. Most GNNs belong to the family of message passing …
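The message passing family mentioned here can be sketched in a few lines (a generic mean-aggregation round for illustration, not any single paper's exact scheme): each node updates its representation by aggregating its neighbors' features.

```python
import numpy as np

# Hypothetical 3-node graph: node 0 connected to nodes 1 and 2.
A = np.array([[0, 1, 1],
              [1, 0, 0],
              [1, 0, 0]], dtype=float)  # adjacency matrix
X = np.array([[1.0], [2.0], [3.0]])     # one scalar feature per node

deg = A.sum(axis=1, keepdims=True)      # node degrees
H = (A @ X) / deg                       # one round: mean of neighbor features
# Node 0 averages nodes 1 and 2: (2 + 3) / 2 = 2.5
```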