Open-sampling: Exploring out-of-distribution data for re-balancing long-tailed datasets

H Wei, L Tao, R Xie, L Feng… - … Conference on Machine …, 2022 - proceedings.mlr.press
Deep neural networks usually perform poorly when the training dataset suffers from extreme
class imbalance. Recent studies found that directly training with out-of-distribution data (i.e., …

Simplifying neural network training under class imbalance

R Shwartz-Ziv, M Goldblum, Y Li… - Advances in Neural …, 2024 - proceedings.neurips.cc
Real-world datasets are often highly class-imbalanced, which can adversely impact the
performance of deep learning models. The majority of research on training neural networks …

M2m: Imbalanced classification via major-to-minor translation

J Kim, J Jeong, J Shin - … of the IEEE/CVF Conference on …, 2020 - openaccess.thecvf.com
In most real-world scenarios, labeled training datasets are highly class-imbalanced, where
deep neural networks suffer from generalizing to a balanced testing criterion. In this paper …

Rethinking the value of labels for improving class-imbalanced learning

Y Yang, Z Xu - Advances in Neural Information Processing …, 2020 - proceedings.neurips.cc
Real-world data often exhibits long-tailed distributions with heavy class imbalance, posing
great challenges for deep recognition models. We identify a persisting dilemma on the value …

Cuda: Curriculum of data augmentation for long-tailed recognition

S Ahn, J Ko, SY Yun - arXiv preprint arXiv:2302.05499, 2023 - arxiv.org
Class imbalance problems frequently occur in real-world tasks, and conventional deep
learning algorithms are well known for performance degradation on imbalanced training …

Procrustean training for imbalanced deep learning

HJ Ye, DC Zhan, WL Chao - Proceedings of the IEEE/CVF …, 2021 - openaccess.thecvf.com
Neural networks trained with class-imbalanced data are known to perform poorly on minor
classes of scarce training data. Several recent works attribute this to over-fitting to minor …

Distribution alignment optimization through neural collapse for long-tailed classification

J Gao, H Zhao, D Guo, H Zha - Forty-first International …, 2024 - openreview.net
A well-trained deep neural network on balanced datasets usually exhibits the Neural
Collapse (NC) phenomenon, which is an informative indicator of the model achieving good …

Escaping saddle points for effective generalization on class-imbalanced data

H Rangwani, SK Aithal… - Advances in Neural …, 2022 - proceedings.neurips.cc
Real-world datasets exhibit imbalances of varying types and degrees. Several techniques
based on re-weighting and margin adjustment of loss are often used to enhance the …

Inducing neural collapse in deep long-tailed learning

X Liu, J Zhang, T Hu, H Cao, Y Yao… - … Conference on Artificial …, 2023 - proceedings.mlr.press
Although deep neural networks achieve tremendous success on various classification tasks,
their generalization ability drops sharply when training datasets exhibit long-tailed distributions …

Feature space augmentation for long-tailed data

P Chu, X Bian, S Liu, H Ling - … Conference, Glasgow, UK, August 23–28 …, 2020 - Springer
Real-world data often follow a long-tailed distribution as the frequency of each class is
typically different. For example, a dataset can have a large number of under-represented …