PULNS: Positive-unlabeled learning with effective negative sample selector

C Luo, P Zhao, C Chen, B Qiao, C Du… - Proceedings of the …, 2021 - ojs.aaai.org
Positive-unlabeled learning (PU learning) is an important case of binary classification where
the training data only contains positive and unlabeled samples. The current state-of-the-art …
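
The PU setting described here is easy to reproduce on toy data: start from a fully labeled binary dataset, reveal labels for a random subset of the positives, and leave everything else unlabeled. A minimal sketch (illustration only, not the PULNS selector itself; the 30% label frequency is an arbitrary assumption):

```python
import numpy as np

rng = np.random.default_rng(0)

# Fully labeled toy data: y = 1 (positive) or y = 0 (negative).
X = rng.normal(size=(1000, 5))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

# PU setting: only a fraction of the positives carry a label (s = 1);
# every other sample, positive or negative, stays unlabeled (s = 0).
label_frequency = 0.3  # assumed fraction of positives that get labeled
s = np.zeros_like(y)
pos_idx = np.flatnonzero(y == 1)
labeled = rng.choice(pos_idx, size=int(label_frequency * len(pos_idx)), replace=False)
s[labeled] = 1

# A PU learner only ever sees (X, s); the true y is unavailable at training time.
print(f"{s.sum()} labeled positives, {len(s) - s.sum()} unlabeled samples")
```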

PUe: Biased positive-unlabeled learning enhancement by causal inference

X Wang, H Chen, T Guo… - Advances in Neural …, 2024 - proceedings.neurips.cc
Positive-Unlabeled (PU) learning aims to achieve high-accuracy binary classification with
limited labeled positive examples and numerous unlabeled ones. Existing cost-sensitive …
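
Cost-sensitive PU methods of the kind this abstract refers to typically minimize a class-prior-weighted risk built from positive and unlabeled data only. A sketch of the standard non-negative (nnPU-style) estimator, shown as background rather than as the PUe method:

```python
import numpy as np

def sigmoid_loss(z):
    # Smooth surrogate loss in [0, 1]; z is the margin y * f(x).
    return 1.0 / (1.0 + np.exp(z))

def nn_pu_risk(f_pos, f_unl, prior):
    """Non-negative PU risk (nnPU-style baseline). f_pos / f_unl are classifier
    scores on labeled-positive and unlabeled samples; prior is the assumed
    class prior pi = P(y = 1)."""
    risk_pos = prior * np.mean(sigmoid_loss(f_pos))  # pi * positive-class risk
    # Negative-class risk estimated from unlabeled data, with the positive
    # contribution subtracted out; clamped at zero to avoid overfitting.
    risk_neg = np.mean(sigmoid_loss(-f_unl)) - prior * np.mean(sigmoid_loss(-f_pos))
    return risk_pos + max(risk_neg, 0.0)
```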

Modeling user attention in music recommendation

S Dai, N Shao, J Zhu, X Zhang, Z Dong… - 2024 IEEE 40th …, 2024 - ieeexplore.ieee.org
With the popularity of online music services, personalized music recommendation has
garnered much research interest. Recommendation models are typically trained on datasets …

Birds of an odd feather: guaranteed out-of-distribution (OOD) novel category detection

Y Wald, S Saria - Uncertainty in Artificial Intelligence, 2023 - proceedings.mlr.press
In this work, we solve the problem of novel category detection under distribution shift. This
problem is critical to ensuring the safety and efficacy of machine learning models …

Recovering the propensity score from biased positive unlabeled data

W Gerych, T Hartvigsen, L Buquicchio, E Agu… - Proceedings of the …, 2022 - ojs.aaai.org
Positive-Unlabeled (PU) learning methods train a classifier to distinguish between the
positive and negative classes given only positive and unlabeled data. While traditional PU …
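
Once a propensity score e(x) = P(s = 1 | y = 1, x) is available, the usual way to use it is inverse-propensity weighting of the classification loss under the selected-at-random (SAR) assumption. A generic sketch of that reweighting (not the recovery procedure proposed in the paper):

```python
import numpy as np

def sar_weights(s, e):
    """Inverse-propensity weights for PU data under SAR, where e[i] estimates
    P(s=1 | y=1, x_i). Each labeled example counts as a positive with weight
    1/e and as a negative with weight 1 - 1/e (non-positive, which cancels the
    bias in expectation); unlabeled examples count as plain negatives."""
    w_pos = np.where(s == 1, 1.0 / e, 0.0)
    w_neg = np.where(s == 1, 1.0 - 1.0 / e, 1.0)
    return w_pos, w_neg

# Usage: plug the weights into any weighted loss, e.g.
#   loss = (w_pos * loss_as_positive(f_x) + w_neg * loss_as_negative(f_x)).mean()
```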

Regression with sensor data containing incomplete observations

T Katsuki, T Osogami - International Conference on Machine …, 2023 - proceedings.mlr.press
This paper addresses a regression problem in which output label values are the results of
sensing the magnitude of a phenomenon. A low value of such labels can mean either that …
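
When a low reading is ambiguous in this way, one generic modeling option is a left-censored (Tobit-style) likelihood that treats values at the sensor floor as "true magnitude at or below the floor". This is an illustration under that assumption, not the estimator developed in the paper (the floor at 0 and the linear-Gaussian model are arbitrary choices):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def tobit_nll(params, X, y, lower=0.0):
    """Negative log-likelihood of a left-censored linear-Gaussian model."""
    w, log_sigma = params[:-1], params[-1]
    sigma = np.exp(log_sigma)
    mu = X @ w
    censored = y <= lower
    ll_obs = norm.logpdf(y[~censored], mu[~censored], sigma)    # exact readings
    ll_cen = norm.logcdf((lower - mu[censored]) / sigma)        # floored readings
    return -(ll_obs.sum() + ll_cen.sum())

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 2))
y_true = X @ np.array([1.5, -0.7]) + rng.normal(scale=0.5, size=300)
y = np.maximum(y_true, 0.0)  # readings below the floor collapse to 0

res = minimize(tobit_nll, x0=np.zeros(3), args=(X, y))
print(res.x[:2])  # recovered regression weights
```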

Positive distribution pollution: rethinking positive unlabeled learning from a unified perspective

Q Liang, M Zhu, Y Wang, X Wang, W Zhao… - Proceedings of the …, 2023 - ojs.aaai.org
Positive Unlabeled (PU) learning, which has a wide range of applications, is becoming
increasingly prevalent. However, it suffers from problems such as data imbalance, selection …

Bootstrap Latent Prototypes for graph positive-unlabeled learning

C Liang, Y Tian, D Zhao, M Li, S Pan, H Zhang, J Wei - Information Fusion, 2024 - Elsevier
Graph positive-unlabeled (GPU) learning aims to learn binary classifiers from only positive
and unlabeled (PU) nodes. The state-of-the-art methods rely on provided class prior …
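
The class prior such methods depend on can itself be estimated from PU data; a common baseline is the Elkan-Noto estimate of the label frequency c = P(s = 1 | y = 1) under SCAR, from which pi = P(s = 1) / c. A sketch (a generic estimator, not part of the graph method above):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def estimate_class_prior(X, s):
    """Fit a 'non-traditional' classifier g(x) ~ P(s=1|x) on the PU labels;
    under SCAR its mean output on labeled positives estimates c, and the
    class prior follows as P(s=1) / c. Ideally c is computed on a held-out
    split rather than on the training data used here."""
    g = LogisticRegression(max_iter=1000).fit(X, s)
    c = g.predict_proba(X[s == 1])[:, 1].mean()
    return s.mean() / c
```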

Modeling PU learning using probabilistic logic programming

V Verreet, L De Raedt, J Bekker - Machine Learning, 2024 - Springer
The goal of learning from positive and unlabeled (PU) examples is to learn a classifier that
predicts the posterior class probability. The challenge is that the available labels in the data …
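
For reference, the classical SCAR route to that posterior is Elkan and Noto's correction: a classifier fit on the observed labels estimates P(s = 1 | x), and dividing by the label frequency c recovers P(y = 1 | x). A minimal sketch (a generic baseline, not the probabilistic-logic-programming model of the paper):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def pu_posterior(X_train, s_train, X_test, c):
    """Posterior recovery under SCAR: P(y=1|x) = P(s=1|x) / c, clipped to [0, 1]."""
    g = LogisticRegression(max_iter=1000).fit(X_train, s_train)
    return np.clip(g.predict_proba(X_test)[:, 1] / c, 0.0, 1.0)
```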

Robust recurrent classifier chains for multi-label learning with missing labels

W Gerych, T Hartvigsen, L Buquicchio, E Agu… - Proceedings of the 31st …, 2022 - dl.acm.org
Recurrent Classifier Chains (RCCs) are a leading approach for multi-label classification as
they directly model the interdependencies between classes. Unfortunately, existing RCCs …
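
For contrast with the recurrent variant, a plain classifier chain already captures inter-label dependencies by feeding each label's prediction to the classifiers later in the chain; scikit-learn ships this directly. A sketch on toy data (the ordinary chain, not the RCC studied in the paper):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.multioutput import ClassifierChain

# Toy multi-label data: Y has one binary column per label.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))
Y = (X[:, :3] + rng.normal(scale=0.5, size=(500, 3)) > 0).astype(int)

# Label k's classifier sees X plus the predictions for labels 0..k-1,
# so dependencies between labels are modeled explicitly.
chain = ClassifierChain(LogisticRegression(max_iter=1000), order=[0, 1, 2], random_state=0)
chain.fit(X, Y)
print(chain.predict(X[:5]))
```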