Self-training with weak supervision

G Karamanolakis, S Mukherjee, G Zheng… - arXiv preprint arXiv …, 2021 - arxiv.org
State-of-the-art deep neural networks require large-scale labeled training data that is often
expensive to obtain or unavailable for many tasks. Weak supervision in the form of domain …
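
For context, self-training is the generic loop this line of work extends: fit a model on a small labeled seed set, pseudo-label the unlabeled pool where the model is confident, and retrain on the union. A minimal sketch follows (a toy illustration with an assumed 0.95 confidence threshold and a fixed round count, not the authors' specific weak-supervision framework):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

# Pretend only 25 gold labels per class exist; the rest are "unlabeled".
labeled = np.zeros(len(y), dtype=bool)
for c in (0, 1):
    labeled[np.flatnonzero(y == c)[:25]] = True
y_train = np.where(labeled, y, -1)        # -1 marks "no label yet"

model = LogisticRegression(max_iter=1000)
for _ in range(5):                        # fixed number of rounds (assumed)
    model.fit(X[labeled], y_train[labeled])
    proba = model.predict_proba(X[~labeled])
    confident = proba.max(axis=1) > 0.95  # confidence threshold (assumed)
    if not confident.any():
        break
    # Map confident unlabeled points back to their original indices and
    # promote their pseudo-labels into the training set.
    idx = np.flatnonzero(~labeled)[confident]
    y_train[idx] = model.classes_[proba[confident].argmax(axis=1)]
    labeled[idx] = True
```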

Why Is Prompt Tuning for Vision-Language Models Robust to Noisy Labels?

CE Wu, Y Tian, H Yu, H Wang… - Proceedings of the …, 2023 - openaccess.thecvf.com
Vision-language models such as CLIP learn a generic text-image embedding from large-
scale training data. A vision-language model can be adapted to a new classification task …

Noisy correspondence learning with meta similarity correction

H Han, K Miao, Q Zheng, M Luo - Proceedings of the IEEE …, 2023 - openaccess.thecvf.com
Despite the success of multimodal learning in cross-modal retrieval tasks, the remarkable
progress relies on the correct correspondence among multimedia data. However, collecting …

Coupled confusion correction: Learning from crowds with sparse annotations

H Zhang, S Li, D Zeng, C Yan, S Ge - Proceedings of the AAAI …, 2024 - ojs.aaai.org
As datasets grow larger, accurately annotating them becomes increasingly impractical
due to the expense in both time and money. Therefore, crowd …

ProMix: Combating label noise via maximizing clean sample utility

H Wang, R Xiao, Y Dong, L Feng, J Zhao - arXiv preprint arXiv:2207.10276, 2022 - arxiv.org
The ability to train deep neural networks under label noise is appealing, as imperfectly
annotated data are relatively cheap to obtain. State-of-the-art approaches are based on …

Annotation error detection: Analyzing the past and present for a more coherent future

JC Klie, B Webber, I Gurevych - Computational Linguistics, 2023 - direct.mit.edu
Annotated data is an essential ingredient in natural language processing for training and
evaluating machine learning models. It is therefore very desirable for the annotations to be …

Estimating instance-dependent Bayes-label transition matrix using a deep neural network

S Yang, E Yang, B Han, Y Liu, M Xu… - International …, 2022 - proceedings.mlr.press
In label-noise learning, estimating the transition matrix is a hot topic as the matrix plays an
important role in building statistically consistent classifiers. Traditionally, the transition from …
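
For background, the transition matrix in label-noise learning links the clean-class posterior to the observed noisy-label posterior; a common instance-dependent formulation (notation ours; per its title, the paper instead conditions on the Bayes optimal label rather than the clean label) is

\[
p(\tilde{y}=j \mid \mathbf{x}) = \sum_{i=1}^{C} T_{ij}(\mathbf{x})\, p(y=i \mid \mathbf{x}),
\qquad
T_{ij}(\mathbf{x}) = p(\tilde{y}=j \mid y=i, \mathbf{x}),
\]

where \(\tilde{y}\) is the observed noisy label, \(y\) the clean label, and \(C\) the number of classes.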

Learning from noisy labels with decoupled meta label purifier

Y Tu, B Zhang, Y Li, L Liu, J Li… - Proceedings of the …, 2023 - openaccess.thecvf.com
Training deep neural networks (DNNs) with noisy labels is challenging since DNNs can easily
memorize inaccurate labels, leading to poor generalization ability. Recently, the meta …

From instance to metric calibration: A unified framework for open-world few-shot learning

Y An, H Xue, X Zhao, J Wang - IEEE Transactions on Pattern …, 2023 - ieeexplore.ieee.org
Robust few-shot learning (RFSL), which aims to address noisy labels in few-shot learning,
has recently gained considerable attention. Existing RFSL methods are based on the …

From noisy prediction to true label: Noisy prediction calibration via generative model

HS Bae, S Shin, B Na, JH Jang… - International …, 2022 - proceedings.mlr.press
Noisy labels are inevitable yet problematic in machine learning. They ruin the
generalization of a classifier by causing it to over-fit to the noisy labels. Existing …