DISC: Learning from noisy labels via dynamic instance-specific selection and correction

Y Li, H Han, S Shan, X Chen - Proceedings of the IEEE/CVF …, 2023 - openaccess.thecvf.com
Existing studies indicate that deep neural networks (DNNs) can eventually memorize the
label noise. We observe that the memorization strength of DNNs towards each instance is …

PivoTAL: Prior-driven supervision for weakly-supervised temporal action localization

MN Rizve, G Mittal, Y Yu, M Hall… - Proceedings of the …, 2023 - openaccess.thecvf.com
Weakly-supervised Temporal Action Localization (WTAL) attempts to localize the
actions in untrimmed videos using only video-level supervision. Most recent works approach …

OpenCon: Open-world contrastive learning

Y Sun, Y Li - arXiv preprint arXiv:2208.02764, 2022 - arxiv.org
Machine learning models deployed in the wild naturally encounter unlabeled samples from
both known and novel classes. Challenges arise in learning from both the labeled and …

Graph matching with bi-level noisy correspondence

Y Lin, M Yang, J Yu, P Hu… - Proceedings of the …, 2023 - openaccess.thecvf.com
In this paper, we study a novel and widely existing problem in graph matching (GM), namely,
Bi-level Noisy Correspondence (BNC), which refers to node-level noisy correspondence …

RankMatch: Fostering confidence and consistency in learning with noisy labels

Z Zhang, W Chen, C Fang, Z Li… - Proceedings of the …, 2023 - openaccess.thecvf.com
Learning with noisy labels (LNL) is one of the most important and challenging problems in
weakly-supervised learning. Recent advances adopt the sample selection strategy to …

Late stopping: Avoiding confidently learning from mislabeled examples

S Yuan, L Feng, T Liu - Proceedings of the IEEE/CVF …, 2023 - openaccess.thecvf.com
Sample selection is a prevalent method in learning with noisy labels, where small-loss data
are typically considered as correctly labeled data. However, this method may not effectively …

Learning pseudo-relations for cross-domain semantic segmentation

D Zhao, S Wang, Q Zang, D Quan… - Proceedings of the …, 2023 - openaccess.thecvf.com
Domain adaptive semantic segmentation aims to adapt a model trained on labeled
source domain to the unlabeled target domain. Self-training shows competitive potential in …

ProMix: Combating label noise via maximizing clean sample utility

H Wang, R Xiao, Y Dong, L Feng, J Zhao - arXiv preprint arXiv:2207.10276, 2022 - arxiv.org
The ability to train deep neural networks under label noise is appealing, as imperfectly
annotated data are relatively cheaper to obtain. State-of-the-art approaches are based on …

CSOT: Curriculum and structure-aware optimal transport for learning with noisy labels

W Chang, Y Shi, J Wang - Advances in Neural Information …, 2023 - proceedings.neurips.cc
Learning with noisy labels (LNL) poses a significant challenge in training a well-generalized
model while avoiding overfitting to corrupted labels. Recent advances have achieved …

Adaptive integration of partial label learning and negative learning for enhanced noisy label learning

M Sheng, Z Sun, Z Cai, T Chen, Y Zhou… - Proceedings of the AAAI …, 2024 - ojs.aaai.org
There has been significant attention devoted to the effectiveness of various domains, such
as semi-supervised learning, contrastive learning, and meta-learning, in enhancing the …