An empirical study of graph contrastive learning

Y Zhu, Y Xu, Q Liu, S Wu - arXiv preprint arXiv:2109.01116, 2021 - arxiv.org
Graph Contrastive Learning (GCL) establishes a new paradigm for learning graph
representations without human annotations. Although remarkable progress has been …

Is a caption worth a thousand images? A controlled study for representation learning

S Santurkar, Y Dubois, R Taori, P Liang… - arXiv preprint arXiv …, 2022 - arxiv.org
The development of CLIP [Radford et al., 2021] has sparked a debate on whether language
supervision can result in vision models with more transferable representations than …

Contrasting contrastive self-supervised representation learning pipelines

K Kotar, G Ilharco, L Schmidt… - Proceedings of the …, 2021 - openaccess.thecvf.com
In the past few years, we have witnessed remarkable breakthroughs in self-supervised
representation learning. Despite the success and adoption of representations learned …

Is self-supervised learning more robust than supervised learning?

Y Zhong, H Tang, J Chen, J Peng, YX Wang - arXiv preprint arXiv …, 2022 - arxiv.org
Self-supervised contrastive learning is a powerful tool to learn visual representation without
labels. Prior work has primarily focused on evaluating the recognition accuracy of various …

Beyond supervised vs. unsupervised: Representative benchmarking and analysis of image representation learning

M Gwilliam, A Shrivastava - … of the IEEE/CVF Conference on …, 2022 - openaccess.thecvf.com
By leveraging contrastive learning, clustering, and other pretext tasks, unsupervised
methods for learning image representations have reached impressive results on standard …

Learning representations with contrastive self-supervised learning for histopathology applications

K Stacke, J Unger, C Lundström, G Eilertsen - arXiv preprint arXiv …, 2021 - arxiv.org
Unsupervised learning has made substantial progress over the last few years, especially by
means of contrastive self-supervised learning. The dominating dataset for benchmarking self …

Contrastive Learning Relies More on Spatial Inductive Bias Than Supervised Learning: An Empirical Study

Y Zhong, H Tang, JK Chen… - Proceedings of the IEEE …, 2023 - openaccess.thecvf.com
Though self-supervised contrastive learning (CL) has shown its potential to achieve state-of-
the-art accuracy without any supervision, its behavior still remains under-investigated by …

An empirical analysis for zero-shot multi-label classification on COVID-19 CT scans and uncurated reports

E Dack, L Brigato, M McMurray… - Proceedings of the …, 2023 - openaccess.thecvf.com
The pandemic resulted in vast repositories of unstructured data, including radiology reports,
due to increased medical examinations. Previous research on automated diagnosis of …

Enhancing 2D Representation Learning with a 3D Prior

M Aygun, P Dhar, Z Yan… - Proceedings of the …, 2024 - openaccess.thecvf.com
Learning robust and effective representations of visual data is a fundamental task in
computer vision. Traditionally this is achieved by training models with expensive-to-obtain …

Feature dropout: Revisiting the role of augmentations in contrastive learning

A Tamkin, M Glasgow, X He… - Advances in Neural …, 2024 - proceedings.neurips.cc
What role do augmentations play in contrastive learning? Recent work suggests that good
augmentations are label-preserving with respect to a specific downstream task. We …