Authors
Xiyu Yu, Tongliang Liu, Mingming Gong, Dacheng Tao
Publication date
2018
Conference
Proceedings of the European Conference on Computer Vision (ECCV)
Pages
68-83
Description
In this paper, we study the classification problem in which we have access to an easily obtainable surrogate for true labels, namely complementary labels, which specify classes that observations do not belong to. Let Y and Ȳ be the true and complementary labels, respectively. We first model the annotation of complementary labels via transition probabilities P(Ȳ = i | Y = j), i ≠ j ∈ {1, …, c}, where c is the number of classes. Previous methods implicitly assume that P(Ȳ = i | Y = j), for all i ≠ j, are identical, which is not true in practice because humans are biased toward their own experience. For example, as shown in Figure 1, if an annotator is more familiar with monkeys than prairie dogs when providing complementary labels for meerkats, she is more likely to employ "monkey" as a complementary label. We therefore reason that the transition probabilities will be different. In this paper, we propose a framework that contributes three main innovations to learning with biased complementary labels: (1) it estimates transition probabilities with no bias; (2) it provides a general method to modify traditional loss functions and extends standard deep neural network classifiers to learn with biased complementary labels; (3) it theoretically ensures that the classifier learned with complementary labels converges to the optimal one learned with true labels. Comprehensive experiments on several benchmark datasets validate the superiority of our method to current state-of-the-art methods.
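The loss-modification idea summarized above can be illustrated with a short, hedged sketch. The code below assumes a forward-correction style loss in which the ordinary softmax posterior p(Y | x) is mapped to p(Ȳ | x) through an estimated transition matrix Q (with Q[j, i] ≈ P(Ȳ = i | Y = j) and zero diagonal), and the network is trained on the negative log-likelihood of the observed complementary label. The class name BiasedComplementaryLoss and the assumption that Q has already been estimated are illustrative choices, not taken from the paper.

import torch
import torch.nn as nn

class BiasedComplementaryLoss(nn.Module):
    # Hypothetical helper sketching a forward-corrected loss for
    # (possibly biased) complementary labels.
    # Q is a c x c transition matrix with Q[j, i] ~ P(Ybar = i | Y = j)
    # and zeros on the diagonal; it is assumed to be estimated beforehand.
    def __init__(self, Q):
        super().__init__()
        self.register_buffer("Q", Q)

    def forward(self, logits, ybar):
        # p(Y | x): ordinary softmax over the c classes
        p = torch.softmax(logits, dim=1)
        # p(Ybar | x) = Q^T p(Y | x), computed row-wise as p @ Q
        pbar = p @ self.Q
        # negative log-likelihood of the observed complementary label
        nll = -torch.log(pbar.gather(1, ybar.unsqueeze(1)).squeeze(1) + 1e-12)
        return nll.mean()

# Usage with a uniform (unbiased) transition matrix, purely for illustration:
c = 10
Q = (torch.ones(c, c) - torch.eye(c)) / (c - 1)
criterion = BiasedComplementaryLoss(Q)
logits = torch.randn(4, c, requires_grad=True)
ybar = torch.randint(0, c, (4,))
loss = criterion(logits, ybar)
loss.backward()

In the biased setting studied by the paper, Q would be non-uniform across classes; the uniform Q here merely shows the mechanics of the correction.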
Total citations
2018: 7, 2019: 10, 2020: 28, 2021: 39, 2022: 35, 2023: 51, 2024: 34 (204 in total)
Scholar articles
X Yu, T Liu, M Gong, D Tao - Proceedings of the European conference on computer …, 2018