Class-balanced loss based on effective number of samples

Y Cui, M Jia, TY Lin, Y Song… - Proceedings of the …, 2019 - openaccess.thecvf.com
With the rapid increase of large-scale, real-world datasets, it becomes critical to address the
problem of long-tailed data distribution (i.e., a few classes account for most of the data, while …
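The effective number of samples in the title is defined in this paper as E_n = (1 − β^n)/(1 − β), and the loss weights each class by 1/E_n. A minimal sketch of those weights (the class counts below are illustrative, not from the paper):

```python
import numpy as np

def class_balanced_weights(counts, beta=0.999):
    """Per-class weights proportional to 1 / E_n, where
    E_n = (1 - beta^n) / (1 - beta) is the effective number of samples."""
    counts = np.asarray(counts, dtype=np.float64)
    effective_num = (1.0 - np.power(beta, counts)) / (1.0 - beta)
    weights = 1.0 / effective_num
    # Normalize so the weights sum to the number of classes.
    return weights * len(counts) / weights.sum()

# Illustrative long-tailed counts: head class has 1000x the tail's samples.
w = class_balanced_weights([5000, 500, 50, 5])
```

As β → 1 this recovers inverse-frequency weighting; as β → 0 it degenerates to uniform weights.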

Feature space augmentation for long-tailed data

P Chu, X Bian, S Liu, H Ling - … Conference, Glasgow, UK, August 23–28 …, 2020 - Springer
Real-world data often follow a long-tailed distribution as the frequency of each class is
typically different. For example, a dataset can have a large number of under-represented …

Area: adaptive reweighting via effective area for long-tailed classification

X Chen, Y Zhou, D Wu, C Yang, B Li… - Proceedings of the …, 2023 - openaccess.thecvf.com
Large-scale data from the real world usually follow a long-tailed distribution (i.e., a few majority
classes occupy plentiful training data, while most minority classes have few samples) …

Auxiliary training: Towards accurate and robust models

L Zhang, M Yu, T Chen, Z Shi… - Proceedings of the …, 2020 - openaccess.thecvf.com
The training process is crucial for deploying a network in applications that have
strict requirements on both accuracy and robustness. However, most existing approaches …

RSG: A simple but effective module for learning imbalanced datasets

J Wang, T Lukasiewicz, X Hu… - Proceedings of the …, 2021 - openaccess.thecvf.com
Imbalanced datasets widely exist in practice and are a great challenge for training deep
neural models with a good generalization on infrequent classes. In this work, we propose a …

Maximum class separation as inductive bias in one matrix

T Kasarla, G Burghouts… - Advances in neural …, 2022 - proceedings.neurips.cc
Maximizing the separation between classes constitutes a well-known inductive bias in
machine learning and a pillar of many traditional algorithms. By default, deep networks are …
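The "one matrix" in the title is a fixed matrix of maximally separated class vectors. One common closed form for such vectors is a regular simplex with pairwise cosine similarity −1/(C − 1); the sketch below uses that construction for illustration only, not the paper's exact recursion:

```python
import numpy as np

def simplex_prototypes(num_classes):
    """Unit-norm class vectors whose pairwise cosine similarity is the
    minimum achievable for equiangular vectors, -1/(num_classes - 1)."""
    c = num_classes
    centered = np.eye(c) - np.ones((c, c)) / c   # center the one-hot basis
    return centered * np.sqrt(c / (c - 1.0))     # rescale rows to unit norm

# Ten maximally separated class prototypes (rows of P).
P = simplex_prototypes(10)
```

Fixing such prototypes up front removes the need for the network to learn class separation; it only has to map features onto the predefined directions.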

Learning imbalanced datasets with label-distribution-aware margin loss

K Cao, C Wei, A Gaidon… - Advances in neural …, 2019 - proceedings.neurips.cc
Deep learning algorithms can fare poorly when the training dataset suffers from heavy class
imbalance but the testing criterion requires good generalization on less frequent classes …
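The label-distribution-aware margin in this paper gives rarer classes a larger margin, Δ_j ∝ n_j^{−1/4}. A minimal sketch of the margins and the resulting margin-adjusted cross-entropy (the scaling constant and counts here are illustrative):

```python
import numpy as np

def ldam_margins(counts, max_margin=0.5):
    """Per-class margins Delta_j proportional to 1 / n_j^{1/4}, scaled so the
    rarest class gets max_margin."""
    counts = np.asarray(counts, dtype=np.float64)
    margins = 1.0 / np.power(counts, 0.25)
    return margins * (max_margin / margins.max())

def ldam_loss(logits, label, margins):
    """Cross-entropy after subtracting the class margin from the true-class
    logit, which demands a larger score gap for rare classes."""
    z = logits.astype(np.float64).copy()
    z[label] -= margins[label]
    z -= z.max()                           # numerical stability
    log_probs = z - np.log(np.exp(z).sum())
    return -log_probs[label]

m = ldam_margins([5000, 500, 50, 5])
loss = ldam_loss(np.array([2.0, 1.0, 0.5, 0.1]), label=3, margins=m)
```

Because the margin only shrinks the true-class logit, the loss is always at least as large as plain cross-entropy, with the extra penalty concentrated on rare classes.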

Deep learning on a data diet: Finding important examples early in training

M Paul, S Ganguli… - Advances in neural …, 2021 - proceedings.neurips.cc
Recent success in deep learning has partially been driven by training increasingly
overparametrized networks on ever larger datasets. It is therefore natural to ask: how much …
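Among the scores this paper uses to find important examples early is the EL2N score, the L2 norm of the error between the predicted class probabilities and the one-hot label. A minimal sketch:

```python
import numpy as np

def el2n_score(logits, label, num_classes):
    """EL2N score: ||softmax(logits) - onehot(label)||_2.
    Higher scores flag harder / more informative training examples."""
    z = logits - logits.max()              # numerical stability
    probs = np.exp(z) / np.exp(z).sum()
    onehot = np.eye(num_classes)[label]
    return np.linalg.norm(probs - onehot)

# A confidently correct example scores low; a confidently wrong one, high.
easy = el2n_score(np.array([5.0, 0.0, 0.0]), label=0, num_classes=3)
hard = el2n_score(np.array([0.0, 5.0, 0.0]), label=0, num_classes=3)
```

Pruning the lowest-scoring examples early in training then keeps most of the final accuracy at a fraction of the data.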

Remix: rebalanced mixup

HP Chou, SC Chang, JY Pan, W Wei… - Computer Vision–ECCV …, 2020 - Springer
Deep image classifiers often perform poorly when training data are heavily class-
imbalanced. In this work, we propose a new regularization technique, Remix, that relaxes …
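Remix keeps mixup's convex combination of the inputs but relaxes the label mixing factor in favor of the minority class. A sketch of the relabeling rule, with the thresholds κ and τ treated as illustrative hyperparameters:

```python
def remix_label_weight(lam, n_i, n_j, kappa=3.0, tau=0.5):
    """Label mixing factor lambda_y for a pair (x_i, x_j) whose inputs are
    mixed with factor lam. When one class outnumbers the other by a factor
    of at least kappa and the input mix does not strongly favor the
    minority, the label is assigned entirely to the minority class."""
    if n_i / n_j >= kappa and lam < tau:
        return 0.0          # label goes fully to minority class j
    if n_i / n_j <= 1.0 / kappa and (1.0 - lam) < tau:
        return 1.0          # label goes fully to minority class i
    return lam              # otherwise fall back to standard mixup

# Majority i (5000 samples) mixed with minority j (50): label -> class j.
lam_y = remix_label_weight(lam=0.3, n_i=5000, n_j=50)
```

The input image remains a standard mixup blend; only the label side is rebalanced, which is what distinguishes Remix from plain mixup on imbalanced data.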

Open-sampling: Exploring out-of-distribution data for re-balancing long-tailed datasets

H Wei, L Tao, R Xie, L Feng… - … Conference on Machine …, 2022 - proceedings.mlr.press
Deep neural networks usually perform poorly when the training dataset suffers from extreme
class imbalance. Recent studies found that directly training with out-of-distribution data (i.e., …