Pseudo labels for unsupervised domain adaptation: A review

Y Li, L Guo, Y Ge - Electronics, 2023 - mdpi.com
Conventional machine learning relies on two assumptions: (1) the training and testing
datasets are drawn independently from the same distribution, and (2) an adequate quantity of samples …
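
As a point of reference for the pseudo-labeling family this review surveys, the sketch below shows the basic self-training loop most pseudo-label UDA methods build on: fit a model on labeled source data, keep only confident target predictions as pseudo labels, and refit on the combined set. The classifier choice, confidence threshold, and single refit iteration are illustrative assumptions, not the review's prescription.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def pseudo_label_adapt(Xs, ys, Xt, threshold=0.9):
    """Minimal self-training loop for unsupervised domain adaptation.

    Xs, ys: labeled source data; Xt: unlabeled target data.
    The classifier, threshold, and single refit are illustrative choices."""
    clf = LogisticRegression(max_iter=1000).fit(Xs, ys)

    # Keep only target predictions the source model is confident about.
    proba = clf.predict_proba(Xt)
    keep = proba.max(axis=1) >= threshold
    pseudo_y = clf.classes_[proba.argmax(axis=1)[keep]]

    # Refit on source data plus confidently pseudo-labeled target data.
    X_aug = np.vstack([Xs, Xt[keep]])
    y_aug = np.concatenate([ys, pseudo_y])
    return LogisticRegression(max_iter=1000).fit(X_aug, y_aug)
```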

Semi-supervised domain adaptation with source label adaptation

YC Yu, HT Lin - Proceedings of the IEEE/CVF Conference …, 2023 - openaccess.thecvf.com
Semi-Supervised Domain Adaptation (SSDA) involves learning to classify unseen
target data from a few labeled and many unlabeled target samples, along with many labeled …

Jaws: Auditing predictive uncertainty under covariate shift

D Prinster, A Liu, S Saria - Advances in Neural Information …, 2022 - proceedings.neurips.cc
We propose JAWS, a series of wrapper methods for distribution-free
uncertainty quantification tasks under covariate shift, centered on the core method …
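
JAWS itself is built around a jackknife+ estimator reweighted by likelihood ratios; as a simpler, related illustration, the sketch below implements a weighted split-conformal interval under covariate shift, assuming the likelihood ratios p_target(x)/p_source(x) are estimated elsewhere and passed in. It is not the authors' exact procedure, and the function names are mine.

```python
import numpy as np

def weighted_quantile(scores, weights, q):
    """Smallest score whose cumulative normalized weight reaches level q."""
    order = np.argsort(scores)
    scores, weights = scores[order], weights[order]
    cum = np.cumsum(weights) / np.sum(weights)
    idx = np.searchsorted(cum, q)
    return scores[min(idx, len(scores) - 1)]

def weighted_split_conformal(mu_cal, y_cal, w_cal, mu_test, w_test, alpha=0.1):
    """Split-conformal prediction intervals reweighted for covariate shift.

    mu_cal / mu_test: point predictions of any fitted regressor on the
    calibration and test inputs; w_cal / w_test: likelihood ratios
    p_target(x) / p_source(x), assumed to be estimated separately."""
    scores = np.abs(y_cal - mu_cal)              # nonconformity scores
    lower, upper = [], []
    for m, wt in zip(mu_test, w_test):
        # The test point contributes an infinite score with its own weight,
        # which can yield an infinite interval when the ratios are extreme.
        s = np.append(scores, np.inf)
        w = np.append(w_cal, wt)
        q = weighted_quantile(s, w, 1.0 - alpha)
        lower.append(m - q)
        upper.append(m + q)
    return np.array(lower), np.array(upper)
```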

Contrasting augmented features for domain adaptation with limited target domain data

X Yu, X Gu, J Sun - Pattern Recognition, 2024 - Elsevier
Domain adaptation aims to alleviate distribution gaps between source and target
domains. However, when the available target domain data are scarce for training, learning …
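
As background for the contrastive component named in the title, the sketch below computes a standard NT-Xent/InfoNCE loss between two augmented views of the same batch of features; the paper's actual feature-augmentation and pairing strategy for scarce target data is not reproduced here.

```python
import torch
import torch.nn.functional as F

def info_nce(z1, z2, temperature=0.1):
    """NT-Xent contrastive loss between two augmented views z1, z2 of shape [N, d].
    Matching rows are positives; all other rows in the batch act as negatives."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature          # scaled cosine similarities
    targets = torch.arange(z1.size(0), device=z1.device)
    # Symmetric form: each view retrieves its counterpart in the other view.
    return 0.5 * (F.cross_entropy(logits, targets) +
                  F.cross_entropy(logits.t(), targets))
```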

Carlane: A lane detection benchmark for unsupervised domain adaptation from simulation to multiple real-world domains

B Stuhr, J Haselberger… - Advances in Neural …, 2022 - proceedings.neurips.cc
Unsupervised Domain Adaptation demonstrates great potential to mitigate domain
shifts by transferring models from labeled source domains to unlabeled target domains …

Class-wise Prototype Guided Alignment Network for Cross-Scene Hyperspectral Image Classification

Z Xie, P Duan, X Kang, W Liu… - IEEE Transactions on …, 2024 - ieeexplore.ieee.org
In the past few years, there has been significant progress in hyperspectral image
classification (HSIC). However, when the classifier trained on the source scene is directly …
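
A common building block behind prototype-guided alignment is sketched below under generic assumptions: compute per-class feature means (prototypes) in each domain, using ground-truth labels on the source and pseudo labels on the target, and penalize the distance between matching prototypes. The specific losses and guidance used by this network are not reproduced, and the function names are illustrative.

```python
import torch

def class_prototypes(features, labels, num_classes):
    """Mean feature vector (prototype) per class; zero vector if a class is absent."""
    protos = torch.zeros(num_classes, features.size(1), device=features.device)
    for c in range(num_classes):
        mask = labels == c
        if mask.any():
            protos[c] = features[mask].mean(dim=0)
    return protos

def prototype_alignment_loss(src_feat, src_y, tgt_feat, tgt_pseudo_y, num_classes):
    """Squared L2 distance between matching class-wise prototypes of the two domains."""
    p_src = class_prototypes(src_feat, src_y, num_classes)
    p_tgt = class_prototypes(tgt_feat, tgt_pseudo_y, num_classes)
    return ((p_src - p_tgt) ** 2).sum(dim=1).mean()
```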

Unexplainable explanations: Towards interpreting tSNE and UMAP embeddings

A Draganov, S Dohn - arXiv preprint arXiv:2306.11898, 2023 - arxiv.org
It has become standard to explain neural network latent spaces with attraction/repulsion
dimensionality reduction (ARDR) methods like tSNE and UMAP. This relies on the premise …
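
For context, the snippet below shows the standard workflow the paper scrutinizes: projecting latent features to 2-D with tSNE and reading the resulting clusters as an explanation of the latent space. The random features and parameter values are stand-ins, not anything taken from the paper.

```python
import numpy as np
from sklearn.manifold import TSNE

# Stand-in latent features; in practice these would be penultimate-layer
# activations extracted from a trained network.
rng = np.random.default_rng(0)
latents = rng.normal(size=(500, 64))

# The 2-D embedding whose visual clusters are commonly read as "explanations"
# of the latent space -- the practice the paper examines.
embedding = TSNE(n_components=2, perplexity=30, init="pca",
                 random_state=0).fit_transform(latents)
print(embedding.shape)  # (500, 2)
```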

Exploiting inter-sample affinity for knowability-aware universal domain adaptation

Y Wang, L Zhang, R Song, H Li, PL Rosin… - International Journal of …, 2024 - Springer
Universal domain adaptation aims to transfer the knowledge of common classes from the
source domain to the target domain without any prior knowledge of the label set, which …

Revisiting unsupervised domain adaptation models: A smoothness perspective

X Wang, J Zhuo, M Zhang, S Wang… - Proceedings of the …, 2022 - openaccess.thecvf.com
Unsupervised Domain Adaptation (UDA) aims to leverage the labeled source data
and unlabeled target data to generalize better in the target domain. UDA methods utilize …
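
The "smoothness" view is usually operationalized as a consistency penalty: the model should give similar predictions for a target sample and a mildly perturbed copy of it. The sketch below shows one generic form of such a penalty; it illustrates the idea rather than the particular regularizer this paper analyzes, and the noise scale is an assumed parameter.

```python
import torch
import torch.nn.functional as F

def consistency_loss(model, x_target, noise_std=0.1):
    """Penalize prediction changes under a small input perturbation
    (KL divergence between the clean and perturbed softmax outputs)."""
    with torch.no_grad():
        p_clean = F.softmax(model(x_target), dim=1)
    x_noisy = x_target + noise_std * torch.randn_like(x_target)
    log_p_noisy = F.log_softmax(model(x_noisy), dim=1)
    return F.kl_div(log_p_noisy, p_clean, reduction="batchmean")
```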

A Virtual-Label-Based Hierarchical Domain Adaptation Method for Time-Series Classification

W Yang, L Cheng, M Ragab, M Wu… - IEEE Transactions on …, 2024 - ieeexplore.ieee.org
Unsupervised domain adaptation (UDA) is becoming a prominent solution for the domain-
shift problem in many time-series classification tasks. With sequence properties, time-series …