Generalizing consistent multi-class classification with rejection to be compatible with arbitrary losses

Y Cao, T Cai, L Feng, L Gu, J Gu, B An… - Advances in neural …, 2022 - proceedings.neurips.cc
Abstract: Classification with rejection (CwR) refrains from making a prediction to avoid
critical misclassification when encountering test samples that are difficult to classify. Though …

Learning to find good models in RANSAC

D Barath, L Cavalli, M Pollefeys - Proceedings of the IEEE …, 2022 - openaccess.thecvf.com
Abstract: We propose the Model Quality Network (MQ-Net for short) for predicting the quality,
e.g. the pose error of essential matrices, of models generated inside RANSAC. It replaces the …

Dual focal loss for calibration

L Tao, M Dong, C Xu - International Conference on Machine …, 2023 - proceedings.mlr.press
The use of deep neural networks in real-world applications requires well-calibrated networks
with confidence scores that accurately reflect the actual probability. However, it has been …

Penalizing the hard example but not too much: A strong baseline for fine-grained visual classification

Y Liang, L Zhu, X Wang, Y Yang - IEEE Transactions on Neural …, 2022 - ieeexplore.ieee.org
Though significant progress has been achieved on fine-grained visual classification (FGVC),
severe overfitting still hinders model generalization. A recent study shows that hard samples …

Towards Calibrated Multi-label Deep Neural Networks

J Cheng, N Vasconcelos - … of the IEEE/CVF Conference on …, 2024 - openaccess.thecvf.com
The problem of calibrating deep neural networks (DNNs) for multi-label learning is
considered. It is well-known that DNNs trained by cross-entropy for single-label or one-hot …

Revisiting confidence estimation: Towards reliable failure prediction

F Zhu, XY Zhang, Z Cheng… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
Reliable confidence estimation is a challenging yet fundamental requirement in many risk-
sensitive applications. However, modern deep neural networks are often overconfident for …

On calibration of graph neural networks for node classification

T Liu, Y Liu, M Hildebrandt, M Joblin… - … Joint Conference on …, 2022 - ieeexplore.ieee.org
Graphs can model real-world, complex systems by representing entities and their
interactions in terms of nodes and edges. To better exploit the graph structure, graph neural …

Sustainable Coffee Leaf Diagnosis: A Deep Knowledgeable Meta-Learning Approach

AA Salamai, WT Al-Nami - Sustainability, 2023 - mdpi.com
Multi-task visual recognition plays a pivotal role in addressing the composite challenges
encountered during the monitoring of crop health, pest infestations, and disease outbreaks …

A short survey on importance weighting for machine learning

M Kimura, H Hino - arXiv preprint arXiv:2403.10175, 2024 - arxiv.org
Importance weighting is a fundamental procedure in statistics and machine learning that
weights the objective function or probability distribution based on the importance of the …

Balanced Confidence Calibration for Graph Neural Networks

H Yang, M Wang, Q Wang, M Lao, Y Zhou - Proceedings of the 30th …, 2024 - dl.acm.org
This paper delves into confidence calibration for predictions made by Graph Neural
Networks (GNNs), which has emerged as a notable challenge in the field. Despite their …