A too-good-to-be-true prior to reduce shortcut reliance

N Dagaev, BD Roads, X Luo, DN Barry, KR Patil… - Pattern recognition …, 2023 - Elsevier
Despite their impressive performance in object recognition and other tasks under standard
testing conditions, deep networks often fail to generalize to out-of-distribution (OOD) samples …

CODEs: Chamfer out-of-distribution examples against overconfidence issue

K Tang, D Miao, W Peng, J Wu, Y Shi… - Proceedings of the …, 2021 - openaccess.thecvf.com
Overconfident predictions on out-of-distribution (OOD) samples are a thorny issue for deep
neural networks. The key to resolving the OOD overconfidence issue is to build a …

Long-tailed recognition by routing diverse distribution-aware experts

X Wang, L Lian, Z Miao, Z Liu, SX Yu - arXiv preprint arXiv:2010.01809, 2020 - arxiv.org
Natural data are often long-tail distributed over semantic classes. Existing recognition
methods tackle this imbalanced classification by placing more emphasis on the tail data …

Delving into out-of-distribution detection with vision-language representations

Y Ming, Z Cai, J Gu, Y Sun, W Li… - Advances in neural …, 2022 - proceedings.neurips.cc
Recognizing out-of-distribution (OOD) samples is critical for machine learning systems
deployed in the open world. The vast majority of OOD detection methods are driven by a …

MOS: Towards scaling out-of-distribution detection for large semantic space

R Huang, Y Li - Proceedings of the IEEE/CVF Conference …, 2021 - openaccess.thecvf.com
Detecting out-of-distribution (OOD) inputs is a central challenge for safely deploying
machine learning models in the real world. Existing solutions are mainly driven by small …

BatchFormer: Learning to explore sample relationships for robust representation learning

Z Hou, B Yu, D Tao - … of the IEEE/CVF Conference on …, 2022 - openaccess.thecvf.com
Despite the success of deep neural networks, there are still many challenges in deep
representation learning due to data scarcity issues such as data imbalance, unseen …

TAFE-Net: Task-aware feature embeddings for low shot learning

X Wang, F Yu, R Wang, T Darrell… - Proceedings of the …, 2019 - openaccess.thecvf.com
Learning good feature embeddings for images often requires substantial training data. As a
consequence, in settings where training data is limited (e.g., few-shot and zero-shot learning) …

Well-classified examples are underestimated in classification with deep neural networks

G Zhao, W Yang, X Ren, L Li, Y Wu… - Proceedings of the AAAI …, 2022 - ojs.aaai.org
The conventional wisdom behind learning deep classification models is to focus on badly
classified examples and ignore well-classified examples that are far from the decision …

Learning transferable negative prompts for out-of-distribution detection

T Li, G Pang, X Bai, W Miao… - Proceedings of the IEEE …, 2024 - openaccess.thecvf.com
Existing prompt learning methods have shown certain capabilities in Out-of-Distribution
(OOD) detection, but the lack of OOD images in the target dataset in their training can lead to …

Long-tailed visual recognition via self-heterogeneous integration with knowledge excavation

Y Jin, M Li, Y Lu, Y Cheung… - Proceedings of the IEEE …, 2023 - openaccess.thecvf.com
Deep neural networks have made huge progress in the last few decades. However, as
real-world data often exhibit a long-tailed distribution, vanilla deep models tend to be …