Let Go of Your Labels with Unsupervised Transfer

A Gadetsky, Y Jiang, M Brbić - arXiv preprint arXiv:2406.07236, 2024 - arxiv.org
Foundation vision-language models have enabled remarkable zero-shot transferability of
the pre-trained representations to a wide range of downstream tasks. However, to solve a …

Cross-domain Open-world Discovery

S Wen, M Brbić - arXiv preprint arXiv:2406.11422, 2024 - arxiv.org
In many real-world applications, test data may commonly exhibit categorical shifts,
characterized by the emergence of novel classes, as well as distribution shifts arising from …

Fine-grained Classes and How to Find Them

M Grcić, A Gadetsky, M Brbić - arXiv preprint arXiv:2406.11070, 2024 - arxiv.org; also Forty-first International Conference on … - openreview.net
In many practical applications, coarse-grained labels are readily available compared to fine-grained labels that reflect subtle differences between classes. However, existing methods …
Self-Supervised Learning for Unsupervised Image Classification and Supervised Localization Tasks

M Baydar - 2024 - search.proquest.com
Recent self-supervised learning methods, in which instance discrimination serves as a fundamental pretraining task for convolutional neural networks (CNNs), excel in transfer learning. While …