Deep learning on graphs has attracted significant interest recently. However, most work has focused on (semi-)supervised learning, resulting in shortcomings including …
N Lee, J Lee, C Park - Proceedings of the AAAI conference on artificial …, 2022 - ojs.aaai.org
Inspired by the recent success of self-supervised methods applied on images, self-supervised learning on graph-structured data has seen rapid growth, especially centered on …
Self-supervised learning (SSL) has been demonstrated to be effective in pre-training models that can be generalized to various downstream tasks. Graph Autoencoder (GAE), an …
Graph contrastive learning (GCL) improves graph representation learning, leading to state-of-the-art results on various downstream tasks. The graph augmentation step is a vital but scarcely studied …
Graph contrastive learning (GCL) alleviates the heavy reliance on label information for graph representation learning (GRL) via self-supervised learning schemes. The core idea is …
Although augmentations (e.g., perturbation of graph edges, image crops) boost the efficiency of Contrastive Learning (CL), feature-level augmentation is another plausible …
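The two augmentation families mentioned in these excerpts can be sketched in a few lines. This is a minimal illustrative example, not the method of any cited paper: `drop_edges` (topology-level perturbation) and `mask_features` (feature-level augmentation) are hypothetical helper names, and the graph is a toy edge list with dense node features.

```python
import random

def drop_edges(edges, p=0.2, seed=0):
    """Topology augmentation: randomly remove each edge with probability p."""
    rng = random.Random(seed)
    return [e for e in edges if rng.random() >= p]

def mask_features(features, p=0.3, seed=0):
    """Feature-level augmentation: zero out each feature dimension with probability p."""
    rng = random.Random(seed)
    dims = len(features[0])
    mask = [0.0 if rng.random() < p else 1.0 for _ in range(dims)]
    return [[x * m for x, m in zip(row, mask)] for row in features]

# Toy graph: 4 nodes, an edge list, and a 3-dimensional feature per node.
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
feats = [[1.0, 0.5, 0.2], [0.3, 0.8, 0.1], [0.9, 0.4, 0.7], [0.2, 0.6, 0.5]]

# Two stochastic views of the same graph form a positive pair for a
# contrastive objective; views of different graphs would serve as negatives.
view1 = (drop_edges(edges, seed=1), mask_features(feats, seed=1))
view2 = (drop_edges(edges, seed=2), mask_features(feats, seed=2))
```

In a full GCL pipeline, both views would be encoded by a shared GNN and trained with a contrastive loss (e.g., InfoNCE) that pulls the two views of each node or graph together.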
Contrastive learning has recently attracted plenty of attention in deep graph clustering due to its promising performance. However, complicated data augmentations and time-consuming …
Data augmentation has recently seen increased interest in graph machine learning given its demonstrated ability to improve model performance and generalization by added training …
Knowledge graph embedding (KGE) aims at learning powerful representations to benefit various artificial intelligence applications. Meanwhile, contrastive learning has been widely …