Extracting low-/high-frequency knowledge from graph neural networks and injecting it into MLPs: An effective GNN-to-MLP distillation framework

L Wu, H Lin, Y Huang, T Fan, SZ Li - … of the AAAI Conference on Artificial …, 2023 - ojs.aaai.org
Recent years have witnessed the great success of Graph Neural Networks (GNNs) in
handling graph-related tasks. However, MLPs remain the primary workhorse for practical …

Graph-based knowledge distillation: A survey and experimental evaluation

J Liu, T Zheng, G Zhang, Q Hao - arXiv preprint arXiv:2302.14643, 2023 - arxiv.org
Graphs, such as citation networks, social networks, and transportation networks, are
prevalent in the real world. Graph Neural Networks (GNNs) have gained widespread …

Graph Decipher: A transparent dual-attention graph neural network to understand the message-passing mechanism for the node classification

Y Pang, T Huang, Z Wang, J Li… - … Journal of Intelligent …, 2022 - Wiley Online Library
Graph neural networks (GNNs) can be effectively applied to solve many real‐world
problems across widely diverse fields. Their success is inseparable from the message …

Learning to Model Graph Structural Information on MLPs via Graph Structure Self-Contrasting

L Wu, H Lin, G Zhao, C Tan, SZ Li - IEEE Transactions on …, 2024 - ieeexplore.ieee.org
Recent years have witnessed great success in handling graph-related tasks with graph
neural networks (GNNs). However, most existing GNNs are based on message passing to …

Decoupling Weighing and Selecting for Integrating Multiple Graph Pre-training Tasks

T Fan, L Wu, Y Huang, H Lin, C Tan, Z Gao… - arXiv preprint arXiv …, 2024 - arxiv.org
Recent years have witnessed the great success of graph pre-training for graph
representation learning. With hundreds of graph pre-training tasks proposed, integrating …

Clarify confused nodes via separated learning

J Zhou, S Gong, X Chen, C Xie, S Yu… - … on Pattern Analysis …, 2025 - ieeexplore.ieee.org
Graph neural networks (GNNs) have achieved remarkable advances in graph-oriented
tasks. However, real-world graphs invariably contain a certain proportion of heterophilous …

Teaching yourself: Graph self-distillation on neighborhood for node classification

L Wu, J Xia, H Lin, Z Gao, Z Liu, G Zhao… - arXiv preprint arXiv …, 2022 - arxiv.org
Recent years have witnessed great success in handling graph-related tasks with Graph
Neural Networks (GNNs). Despite their great academic success, Multi-Layer Perceptrons …

Automated graph self-supervised learning via multi-teacher knowledge distillation

L Wu, Y Huang, H Lin, Z Liu, T Fan, SZ Li - arXiv preprint arXiv:2210.02099, 2022 - arxiv.org
Self-supervised learning on graphs has recently achieved remarkable success in graph
representation learning. With hundreds of self-supervised pretext tasks proposed over the …

A Teacher-Free Graph Knowledge Distillation Framework with Dual Self-Distillation

L Wu, H Lin, Z Gao, G Zhao, SZ Li - IEEE Transactions on …, 2024 - ieeexplore.ieee.org
Recent years have witnessed great success in handling graph-related tasks with Graph
Neural Networks (GNNs). Despite their great academic success, Multi-Layer Perceptrons …

MS-GDA: Improving Heterogeneous Recipe Representation via Multinomial Sampling Graph Data Augmentation

L Chen, W Li, X Cui, Z Wang, S Berretti… - ACM Transactions on …, 2024 - dl.acm.org
We study the problem of classifying different cooking styles based on recipes. The
difficulty is that the same food ingredients, seasoning, and very similar instructions result …