A comprehensive survey on pretrained foundation models: A history from BERT to ChatGPT

C Zhou, Q Li, C Li, J Yu, Y Liu, G Wang… - International Journal of …, 2024 - Springer
Abstract Pretrained Foundation Models (PFMs) are regarded as the foundation for various
downstream tasks across different data modalities. A PFM (e.g., BERT, ChatGPT, GPT-4) is …

Contrastive self-supervised learning: review, progress, challenges and future research directions

P Kumar, P Rawat, S Chauhan - International Journal of Multimedia …, 2022 - Springer
In the last decade, deep supervised learning has had tremendous success. However, its
flaws, such as its dependency on manual and costly annotations on large datasets and …

GraphGPT: Graph instruction tuning for large language models

J Tang, Y Yang, W Wei, L Shi, L Su, S Cheng… - Proceedings of the 47th …, 2024 - dl.acm.org
Graph Neural Networks (GNNs) have evolved to understand graph structures through
recursive exchanges and aggregations among nodes. To enhance robustness, self …

GraphMAE: Self-supervised masked graph autoencoders

Z Hou, X Liu, Y Cen, Y Dong, H Yang, C Wang… - Proceedings of the 28th …, 2022 - dl.acm.org
Self-supervised learning (SSL) has been extensively explored in recent years. Particularly,
generative SSL has seen emerging success in natural language processing and other …

Data augmentation for deep graph learning: A survey

K Ding, Z Xu, H Tong, H Liu - ACM SIGKDD Explorations Newsletter, 2022 - dl.acm.org
Graph neural networks, a powerful deep learning tool to model graph-structured data, have
demonstrated remarkable performance on numerous graph learning tasks. To address the …

SimGRACE: A simple framework for graph contrastive learning without data augmentation

J Xia, L Wu, J Chen, B Hu, SZ Li - … of the ACM Web Conference 2022, 2022 - dl.acm.org
Graph contrastive learning (GCL) has emerged as a dominant technique for graph
representation learning which maximizes the mutual information between paired graph …

Graph self-supervised learning: A survey

Y Liu, M Jin, S Pan, C Zhou, Y Zheng… - IEEE Transactions on …, 2022 - ieeexplore.ieee.org
Deep learning on graphs has attracted significant interest recently. However, most of the
works have focused on (semi-) supervised learning, resulting in shortcomings including …

Pre-training molecular graph representation with 3D geometry

S Liu, H Wang, W Liu, J Lasenby, H Guo… - arXiv preprint arXiv …, 2021 - arxiv.org
Molecular graph representation learning is a fundamental problem in modern drug and
material discovery. Molecular graphs are typically modeled by their 2D topological …

GraphPrompt: Unifying pre-training and downstream tasks for graph neural networks

Z Liu, X Yu, Y Fang, X Zhang - Proceedings of the ACM Web Conference …, 2023 - dl.acm.org
Graphs can model complex relationships between objects, enabling a myriad of Web
applications such as online page/article classification and social recommendation. While …

3D Infomax improves GNNs for molecular property prediction

H Stärk, D Beaini, G Corso, P Tossou… - International …, 2022 - proceedings.mlr.press
Molecular property prediction is one of the fastest-growing applications of deep learning with
critical real-world impacts. Although the 3D molecular graph structure is necessary for …