Large language models on graphs: A comprehensive survey

B Jin, G Liu, C Han, M Jiang, H Ji, J Han - arXiv preprint arXiv:2312.02783, 2023 - arxiv.org
Large language models (LLMs), such as ChatGPT and LLaMA, are creating significant
advancements in natural language processing due to their strong text encoding/decoding …

A comprehensive survey on trustworthy recommender systems

W Fan, X Zhao, X Chen, J Su, J Gao, L Wang… - arXiv preprint arXiv …, 2022 - arxiv.org
As one of the most successful AI-powered applications, recommender systems aim to help
people make appropriate decisions effectively and efficiently by providing …

GraphFormers: GNN-nested transformers for representation learning on textual graph

J Yang, Z Liu, S Xiao, C Li, D Lian… - Advances in Neural Information Processing Systems, 2021 - proceedings.neurips.cc
Representation learning on textual graphs aims to generate low-dimensional embeddings for
nodes based on their individual textual features and neighbourhood information …

Efficiently leveraging multi-level user intent for session-based recommendation via atten-mixer network

P Zhang, J Guo, C Li, Y Xie, JB Kim, Y Zhang… - Proceedings of the …, 2023 - dl.acm.org
Session-based recommendation (SBR) aims to predict the user's next action based on short
and dynamic sessions. Recently, there has been increasing interest in utilizing various …

GraphText: Graph reasoning in text space

J Zhao, L Zhuo, Y Shen, M Qu, K Liu… - arXiv preprint arXiv …, 2023 - arxiv.org
Large Language Models (LLMs) have gained the ability to assimilate human knowledge and
facilitate natural language interactions with both humans and other LLMs. However, despite …

A comprehensive study on text-attributed graphs: Benchmarking and rethinking

H Yan, C Li, R Long, C Yan, J Zhao… - Advances in Neural Information Processing Systems, 2023 - proceedings.neurips.cc
Text-attributed graphs (TAGs) are prevalent in various real-world scenarios, where each
node is associated with a text description. The cornerstone of representation learning on …

Train your own GNN teacher: Graph-aware distillation on textual graphs

C Mavromatis, VN Ioannidis, S Wang, D Zheng… - Joint European Conference on Machine Learning and Knowledge Discovery in Databases, 2023 - Springer
How can we learn effective node representations on textual graphs? Graph Neural Networks
(GNNs) that use Language Models (LMs) to encode textual information of graphs achieve …

Edgeformers: Graph-empowered transformers for representation learning on textual-edge networks

B Jin, Y Zhang, Y Meng, J Han - arXiv preprint arXiv:2302.11050, 2023 - arxiv.org
Edges in many real-world social/information networks are associated with rich text
information (e.g., user-user communications or user-product reviews). However, mainstream …

AdaMCT: Adaptive mixture of CNN-transformer for sequential recommendation

J Jiang, P Zhang, Y Luo, C Li, JB Kim, K Zhang… - Proceedings of the …, 2023 - dl.acm.org
Sequential recommendation (SR) aims to model users' dynamic preferences from a series of
interactions. A pivotal challenge in user modeling for SR lies in the inherent variability of …

Graph-aware language model pre-training on a large graph corpus can help multiple graph applications

H Xie, D Zheng, J Ma, H Zhang, VN Ioannidis… - Proceedings of the 29th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, 2023 - dl.acm.org
Model pre-training on large text corpora has proven effective for various
downstream applications in the NLP domain. In the graph mining domain, a similar analogy …