Long-tail learning via logit adjustment

AK Menon, S Jayasumana, AS Rawat, H Jain… - arXiv preprint arXiv …, 2020 - arxiv.org
Real-world classification problems typically exhibit an imbalanced or long-tailed label
distribution, wherein many labels are associated with only a few samples. This poses a …

Robust long-tailed learning under label noise

T Wei, JX Shi, WW Tu, YF Li - arXiv preprint arXiv:2108.11569, 2021 - arxiv.org
Long-tailed learning has attracted much attention recently, with the goal of improving
generalisation for tail classes. Most existing works use supervised learning without …

No one left behind: Improving the worst categories in long-tailed learning

Y Du, J Wu - Proceedings of the IEEE/CVF conference on …, 2023 - openaccess.thecvf.com
Unlike the case when using a balanced training dataset, the per-class recall (i.e., accuracy) of
neural networks trained with an imbalanced dataset is known to vary a lot from category to …

Balanced product of calibrated experts for long-tailed recognition

ES Aimar, A Jonnarth, M Felsberg… - Proceedings of the …, 2023 - openaccess.thecvf.com
Many real-world recognition problems are characterized by long-tailed label distributions.
These distributions make representation learning highly challenging due to limited …

Long-tailed partial label learning via dynamic rebalancing

F Hong, J Yao, Z Zhou, Y Zhang, Y Wang - arXiv preprint arXiv …, 2023 - arxiv.org
Real-world data usually couples the label ambiguity and heavy imbalance, challenging the
algorithmic robustness of partial label learning (PLL) and long-tailed learning (LT). The …

Towards calibrated model for long-tailed visual recognition from prior perspective

Z Xu, Z Chai, C Yuan - Advances in Neural Information …, 2021 - proceedings.neurips.cc
Real-world data universally confronts a severe class-imbalance problem and exhibits a long-
tailed distribution, i.e., most labels are associated with limited instances. The naïve models …

Promix: Combating label noise via maximizing clean sample utility

R Xiao, Y Dong, H Wang, L Feng, R Wu… - arXiv preprint arXiv …, 2022 - arxiv.org
Learning with Noisy Labels (LNL) has become an appealing topic, as imperfectly annotated
data are relatively cheaper to obtain. Recent state-of-the-art approaches employ specific …

Robust early-learning: Hindering the memorization of noisy labels

X Xia, T Liu, B Han, C Gong, N Wang… - International …, 2020 - drive.google.com
The memorization effects of deep networks show that they will first memorize training data
with clean labels and then those with noisy labels. The early stopping method therefore can …

A survey of label-noise representation learning: Past, present and future

B Han, Q Yao, T Liu, G Niu, IW Tsang, JT Kwok… - arXiv preprint arXiv …, 2020 - arxiv.org
Classical machine learning implicitly assumes that labels of the training data are sampled
from a clean distribution, which can be too restrictive for real-world scenarios. However …

Sample selection with uncertainty of losses for learning with noisy labels

X Xia, T Liu, B Han, M Gong, J Yu, G Niu… - arXiv preprint arXiv …, 2021 - arxiv.org
In learning with noisy labels, the sample selection approach is very popular, which regards
small-loss data as correctly labeled during training. However, losses are generated on-the …