Instance-dependent noisy label learning via graphical modelling

A Garg, C Nguyen, R Felix, TT Do… - Proceedings of the …, 2023 - openaccess.thecvf.com
Noisy labels are unavoidable yet troublesome in the ecosystem of deep learning because
models can easily overfit them. There are many types of label noise, such as symmetric …

Weak proxies are sufficient and preferable for fairness with missing sensitive attributes

Z Zhu, Y Yao, J Sun, H Li, Y Liu - … Conference on Machine …, 2023 - proceedings.mlr.press
Evaluating fairness can be challenging in practice because the sensitive attributes of data
are often inaccessible due to privacy constraints. The go-to approach that the industry …

Noise-robust fine-tuning of pretrained language models via external guidance

S Wang, Z Tan, R Guo, J Li - arXiv preprint arXiv:2311.01108, 2023 - arxiv.org
Adopting a two-stage paradigm of pretraining followed by fine-tuning, Pretrained Language
Models (PLMs) have achieved substantial advancements in the field of natural language …

Explaining generalization power of a DNN using interactive concepts

H Zhou, H Zhang, H Deng, D Liu, W Shen… - Proceedings of the …, 2024 - ojs.aaai.org
This paper explains the generalization power of a deep neural network (DNN) from the
perspective of interactions. Although there is no universally accepted definition of the …

Listwise generative retrieval models via a sequential learning process

Y Tang, R Zhang, J Guo, M de Rijke, W Chen… - ACM Transactions on …, 2024 - dl.acm.org
Recently, a novel generative retrieval (GR) paradigm has been proposed, where a single
sequence-to-sequence model is learned to directly generate a list of relevant document …

Regularly truncated M-estimators for learning with noisy labels

X Xia, P Lu, C Gong, B Han, J Yu… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
The sample selection approach is very popular in learning with noisy labels. As deep
networks “learn patterns first”, prior methods built on sample selection share a similar training …

Hide and seek in noise labels: Noise-robust collaborative active learning with LLMs-powered assistance

B Yuan, Y Chen, Y Zhang, W Jiang - Proceedings of the 62nd …, 2024 - aclanthology.org
Learning from noisy labels (LNL) is a challenge that arises in many real-world scenarios
where collected training data can contain incorrect or corrupted labels. Most existing …

Dygen: Learning from noisy labels via dynamics-enhanced generative modeling

Y Zhuang, Y Yu, L Kong, X Chen, C Zhang - Proceedings of the 29th …, 2023 - dl.acm.org
Learning from noisy labels is a challenge that arises in many real-world applications where
training data can contain incorrect or corrupted labels. When fine-tuning language models …

Dissecting sample hardness: A fine-grained analysis of hardness characterization methods for data-centric AI

N Seedat, F Imrie, M van der Schaar - arXiv preprint arXiv:2403.04551, 2024 - arxiv.org
Characterizing samples that are difficult to learn from is crucial to developing highly
performant ML models. This has led to numerous Hardness Characterization Methods …

Dynamic selection for reconstructing instance-dependent noisy labels

J Yang, X Niu, Y Xu, Z Zhang, G Guo, S Drew… - Pattern Recognition, 2024 - Elsevier
As an inevitable issue in annotating large-scale datasets, instance-dependent label noise
(IDN) can cause serious overfitting in neural networks. To combat IDN, label reconstruction …