Y Li, H Han, S Shan, X Chen - Proceedings of the IEEE/CVF …, 2023 - openaccess.thecvf.com
Existing studies indicate that deep neural networks (DNNs) can eventually memorize the label noise. We observe that the memorization strength of DNNs towards each instance is …
In label-noise learning, the noise transition matrix, bridging the class posterior for noisy and clean data, has been widely exploited to learn statistically consistent classifiers. The …
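The transition matrix mentioned above, T[i, j] = P(noisy label = j | clean label = i), maps clean class posteriors to noisy ones. A minimal NumPy sketch of this bridge (the matrix values and class count here are illustrative, not taken from any of the cited papers):

```python
import numpy as np

# Illustrative 3-class transition matrix: T[i, j] = P(noisy = j | clean = i).
# Rows sum to 1; e.g. 20% of class-0 labels flip to class 1.
T = np.array([
    [0.8, 0.2, 0.0],
    [0.1, 0.8, 0.1],
    [0.0, 0.2, 0.8],
])

def noisy_posterior(clean_post, T):
    """Map a clean class posterior to the noisy one: p(noisy) = T^T p(clean)."""
    return T.T @ clean_post

clean_post = np.array([0.7, 0.2, 0.1])        # model's clean-posterior estimate
noisy_post = noisy_posterior(clean_post, T)   # posterior over observed noisy labels
# Training with cross-entropy against the noisy label through this mapping
# ("forward correction") is what makes the classifier statistically consistent
# when T is known.
```

Since T is row-stochastic, the mapped vector remains a valid probability distribution.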
Human-annotated labels are often prone to noise, and the presence of such noise will degrade the performance of the resulting deep neural network (DNN) models. Much of the …
The sample selection approach is popular in learning with noisy labels. The state-of-the-art methods train two deep networks simultaneously for sample selection, which aims to employ …
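The two-network selection scheme described in this snippet (in the style of co-teaching methods) can be sketched as follows: each network keeps the small-loss fraction of a mini-batch, treating those samples as likely clean, and hands them to its peer for the update. The per-sample losses and keep ratio below are hypothetical placeholders:

```python
import numpy as np

def select_small_loss(losses, keep_ratio):
    """Return indices of the keep_ratio fraction of samples with smallest loss."""
    k = max(1, int(len(losses) * keep_ratio))
    return np.argsort(losses)[:k]

# Hypothetical per-sample losses from two peer networks on one mini-batch.
loss_net1 = np.array([0.1, 2.3, 0.2, 1.9, 0.3, 0.15])
loss_net2 = np.array([0.2, 2.1, 0.1, 0.4, 1.8, 0.25])

keep = 0.5  # assume roughly half the labels are trusted
idx_for_net2 = select_small_loss(loss_net1, keep)  # net1 picks samples to train net2
idx_for_net1 = select_small_loss(loss_net2, keep)  # and vice versa
```

Cross-selection (each network choosing for the other) is what keeps the two networks from reinforcing their own selection errors.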
D Cheng, T Liu, Y Ning, N Wang… - Proceedings of the …, 2022 - openaccess.thecvf.com
In label-noise learning, estimating the transition matrix has attracted increasing attention, as the matrix plays an important role in building statistically consistent classifiers …

Classical machine learning implicitly assumes that labels of the training data are sampled from a clean distribution, which can be too restrictive for real-world scenarios. However …
X Li, T Liu, B Han, G Niu… - … conference on machine …, 2021 - proceedings.mlr.press
In label-noise learning, the transition matrix plays a key role in building statistically consistent classifiers. Existing consistent estimators for the transition matrix have been …
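One classical consistent estimator referred to here relies on anchor points: an anchor for clean class i is an instance x with P(clean = i | x) ≈ 1, so the model's noisy-class posterior at x directly gives row i of T. A toy NumPy sketch under the idealized assumption of perfect anchors and a perfectly fit noisy posterior:

```python
import numpy as np

# Ground-truth transition matrix used only to simulate the noisy posterior.
true_T = np.array([
    [0.9, 0.1, 0.0],
    [0.0, 0.8, 0.2],
    [0.1, 0.0, 0.9],
])

def noisy_posterior_at(clean_post):
    """Simulated noisy-class posterior: p(noisy | x) = T^T p(clean | x)."""
    return true_T.T @ clean_post

# Perfect anchors: one instance per class with clean posterior = one-hot e_i.
anchors = np.eye(3)

# At anchor e_i the noisy posterior equals row i of T, so stacking the
# posteriors over all anchors recovers the full matrix.
T_hat = np.stack([noisy_posterior_at(a) for a in anchors])
```

In practice the clean posterior is unknown and anchors are only approximate, which is exactly the estimation error these consistent estimators must control.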
Open-vocabulary instance segmentation aims at segmenting novel classes without mask annotations. It is an important step toward reducing laborious human supervision. Most …
X Xia, J Deng, W Bao, Y Du, B Han… - Proceedings of the …, 2023 - openaccess.thecvf.com
Multi-label classification aims to learn classification models from instances associated with multiple labels. It is pivotal to learn and utilize the label dependence among multiple labels …