P Chu, X Bian, S Liu, H Ling - … Conference, Glasgow, UK, August 23–28 …, 2020 - Springer
Real-world data often follow a long-tailed distribution as the frequency of each class is typically different. For example, a dataset can have a large number of under-represented …
X Chen, Y Zhou, D Wu, C Yang, B Li… - Proceedings of the …, 2023 - openaccess.thecvf.com
Large-scale real-world data usually follow a long-tailed distribution (i.e., a few majority classes occupy plentiful training data, while most minority classes have few samples) …
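The long-tailed setting described above can be illustrated with a minimal sketch. The exponential-decay profile and the `long_tailed_counts` helper below are illustrative assumptions, not taken from any of the cited papers, though this style of construction is common when building imbalanced benchmarks:

```python
# Sketch: simulate a long-tailed class distribution with an
# exponential decay profile. All names and parameters here are
# illustrative, not from the cited works.

def long_tailed_counts(num_classes=10, max_count=1000, imbalance_ratio=100):
    """Return per-class sample counts that decay exponentially:
    class 0 (head) gets max_count samples, and the rarest (tail)
    class gets roughly max_count / imbalance_ratio."""
    counts = []
    for c in range(num_classes):
        frac = c / (num_classes - 1)  # 0 for the head class, 1 for the tail
        counts.append(int(max_count * imbalance_ratio ** (-frac)))
    return counts

counts = long_tailed_counts()
print(counts)  # a few head classes hold most samples; tail classes have few
```

With the defaults above, the head class has 1000 samples and the tail class 10, so the head-to-tail ratio equals the chosen imbalance ratio of 100.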
L Zhang, M Yu, T Chen, Z Shi… - Proceedings of the …, 2020 - openaccess.thecvf.com
The training process is crucial for deploying networks in applications with strict requirements on both accuracy and robustness. However, most existing approaches …
Imbalanced datasets widely exist in practice and are a great challenge for training deep neural models with a good generalization on infrequent classes. In this work, we propose a …
Maximizing the separation between classes constitutes a well-known inductive bias in machine learning and a pillar of many traditional algorithms. By default, deep networks are …
K Cao, C Wei, A Gaidon… - Advances in neural …, 2019 - proceedings.neurips.cc
Deep learning algorithms can fare poorly when the training dataset suffers from heavy class imbalance but the testing criterion requires good generalization on less frequent classes …
M Paul, S Ganguli… - Advances in neural …, 2021 - proceedings.neurips.cc
Recent success in deep learning has partially been driven by training increasingly overparametrized networks on ever larger datasets. It is therefore natural to ask: how much …
Deep image classifiers often perform poorly when training data are heavily class-imbalanced. In this work, we propose a new regularization technique, Remix, that relaxes …
H Wei, L Tao, R Xie, L Feng… - … Conference on Machine …, 2022 - proceedings.mlr.press
Deep neural networks usually perform poorly when the training dataset suffers from extreme class imbalance. Recent studies found that directly training with out-of-distribution data (i.e. …