Generalization to out-of-distribution (OOD) data is a capability natural to humans yet challenging for machines to reproduce. This is because most learning algorithms strongly …
L Hoyer, D Dai, H Wang… - Proceedings of the IEEE …, 2023 - openaccess.thecvf.com
In unsupervised domain adaptation (UDA), a model trained on source data (e.g., synthetic) is adapted to target data (e.g., real-world) without access to target annotation. Most previous …
Test-time adaptation is a special setting of unsupervised domain adaptation in which a model trained on the source domain must adapt to the target domain without accessing source …
Recent text-to-image generation models have shown promising results in generating high-fidelity photo-realistic images. Though the results are astonishing to human eyes, how …
Y Liu, P Kothari, B Van Delft… - Advances in …, 2021 - proceedings.neurips.cc
Test-time training (TTT) through self-supervised learning (SSL) is an emerging paradigm to tackle distributional shifts. Despite encouraging results, it remains unclear when this …
M Zhang, S Levine, C Finn - Advances in neural information …, 2022 - proceedings.neurips.cc
While deep neural networks can attain good accuracy on in-distribution test points, many applications require robustness even in the face of unexpected perturbations in the input …
Domain adaptation (DA) aims to transfer the knowledge learned from a source domain to an unlabeled target domain. Some recent works tackle source-free domain …
Massive web datasets play a key role in the success of large vision-language models like CLIP and Flamingo. However, the raw web data is noisy, and existing filtering methods to …
Unsupervised domain adaptation (UDA) aims to transfer knowledge learned from a labeled source domain to a different, unlabeled target domain. Most existing UDA methods focus on …