State-of-the-art Graph Neural Networks (GNNs) have limited scalability with respect to the graph and model sizes. On large graphs, increasing the model depth often means …

A Brutzkus, A Globerson - International Conference on …, 2019 - proceedings.mlr.press
Empirical evidence suggests that neural networks with ReLU activations generalize better with over-parameterization. However, there is currently no theoretical analysis that explains …

Several recent works use positional encodings to extend the receptive fields of graph neural network (GNN) layers equipped with attention mechanisms. These techniques, however …

CY Chuang, S Jegelka - Advances in Neural Information …, 2022 - proceedings.neurips.cc
Understanding generalization and robustness of machine learning models fundamentally relies on assuming an appropriate metric on the data space. Identifying such a metric is …

D Buffelli, P Liò, F Vandin - Advances in Neural Information …, 2022 - proceedings.neurips.cc
In the past few years, graph neural networks (GNNs) have become the de facto model of choice for graph classification. While, from the theoretical viewpoint, most GNNs can operate …

Graph neural networks (GNNs) have become a popular approach to integrating structural inductive biases into NLP models. However, there has been little work on interpreting them …

W Cong, M Ramezani… - Advances in Neural …, 2021 - proceedings.neurips.cc
Graph Convolutional Networks (GCNs) are known to suffer from performance degradation as the number of layers increases, which is usually attributed to over …

F Diehl - arXiv preprint arXiv:1905.10990, 2019 - arxiv.org
Graph Neural Network (GNN) research has concentrated on improving convolutional layers, with little attention paid to developing graph pooling layers. Yet pooling layers can enable …

Normalization is known to help the optimization of deep neural networks. Curiously, different architectures require specialized normalization methods. In this paper, we study what …