Multi-class classification without multi-class labels

YC Hsu, Z Lv, J Schlosser, P Odom, Z Kira - arXiv preprint arXiv:1901.00544, 2019 - arxiv.org
This work presents a new strategy for multi-class classification that requires no class-specific labels, but instead leverages pairwise similarity between examples, a weaker form of annotation. The proposed method, meta classification learning, optimizes a binary classifier for pairwise similarity prediction and through this process learns a multi-class classifier as a submodule. We formulate this approach, present a probabilistic graphical model for it, and derive a surprisingly simple loss function that can be used to learn neural network-based models. We then demonstrate that this same framework generalizes to the supervised, unsupervised cross-task, and semi-supervised settings. Our method is evaluated against the state of the art in all three learning paradigms and achieves superior or comparable accuracy, providing evidence that learning multi-class classification without multi-class labels is a viable option.
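To make the idea concrete, here is a minimal sketch of how such a pairwise loss could look. It assumes the network emits class posteriors via a softmax and that the probability of two examples being "similar" (sharing a class) is modeled as the inner product of their posterior vectors, with binary cross-entropy applied against the pairwise labels. The function name, shapes, and framework choice (PyTorch) are illustrative assumptions, not taken from the paper.

```python
import torch
import torch.nn.functional as F

def pairwise_similarity_loss(logits_a, logits_b, similar):
    """Hypothetical pairwise-similarity loss sketch.

    logits_a, logits_b: (batch, num_classes) class logits for the
    two examples in each pair.
    similar: (batch,) float tensor, 1.0 if a pair shares a class, else 0.0.
    """
    # Class posteriors for the two examples in each pair.
    p_a = F.softmax(logits_a, dim=1)
    p_b = F.softmax(logits_b, dim=1)
    # Modeled probability that the pair is similar: the inner
    # product of the two posterior vectors (assumption).
    s_hat = (p_a * p_b).sum(dim=1).clamp(1e-7, 1 - 1e-7)
    # Binary cross-entropy against the pairwise similarity labels.
    return F.binary_cross_entropy(s_hat, similar)

# Illustrative usage: 32 pairs, 10 putative classes.
logits_a = torch.randn(32, 10, requires_grad=True)
logits_b = torch.randn(32, 10, requires_grad=True)
similar = torch.randint(0, 2, (32,)).float()
loss = pairwise_similarity_loss(logits_a, logits_b, similar)
loss.backward()  # gradients reach the multi-class head through the pairwise objective
```

Note how the multi-class classifier appears only as a submodule: the loss never touches class labels, yet minimizing it pushes the softmax head toward a consistent class partition of the data.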