[Book] Lifelong machine learning

Z Chen, B Liu - 2022 - books.google.com
Lifelong Machine Learning, Second Edition is an introduction to an advanced machine
learning paradigm that continuously learns by accumulating past knowledge that it then …

DOC: Deep open classification of text documents

L Shu, H Xu, B Liu - arXiv preprint arXiv:1709.08716, 2017 - arxiv.org
Traditional supervised learning makes the closed-world assumption that the classes
appearing in the test data must have appeared in training. This also applies to text learning …

Covariate shift: A review and analysis on classifiers

NG Nair, P Satpathy, J Christopher - 2019 Global Conference …, 2019 - ieeexplore.ieee.org
Training and testing are the two phases of a supervised machine learning model. When
these models are trained, validated and tested, it is usually assumed that the test and train …

[PDF] Breaking the closed world assumption in text classification

G Fei, B Liu - Proceedings of the 2016 Conference of the North …, 2016 - aclanthology.org
Existing research on multiclass text classification mostly makes the closed world
assumption, which focuses on designing accurate classifiers under the assumption that all …

Towards a unified analysis of kernel-based methods under covariate shift

X Feng, X He, C Wang, C Wang… - Advances in Neural …, 2024 - proceedings.neurips.cc
Covariate shift occurs prevalently in practice, where the input distributions of the source and
target data are substantially different. Despite its practical importance in various learning …

A survey on open set recognition

A Mahdavi, M Carvalho - 2021 IEEE Fourth International …, 2021 - ieeexplore.ieee.org
Open Set Recognition (OSR) deals with unknown situations that were not seen by the model
during training. In this paper, we provide a survey of existing works about …

Learning cumulatively to become more knowledgeable

G Fei, S Wang, B Liu - Proceedings of the 22nd ACM SIGKDD …, 2016 - dl.acm.org
In classic supervised learning, a learning algorithm takes fixed training data of several
classes to build a classifier. In this paper, we propose to study a new problem, i.e., building a …

Classification from positive, unlabeled and biased negative data

YG Hsieh, G Niu, M Sugiyama - International conference on …, 2019 - proceedings.mlr.press
In binary classification, there are situations where negative (N) data are too diverse to be
fully labeled and we often resort to positive-unlabeled (PU) learning in these scenarios …

PULNS: Positive-unlabeled learning with effective negative sample selector

C Luo, P Zhao, C Chen, B Qiao, C Du… - Proceedings of the …, 2021 - ojs.aaai.org
Positive-unlabeled learning (PU learning) is an important case of binary classification where
the training data only contains positive and unlabeled samples. The current state-of-the-art …

OpenWGL: Open-world graph learning

M Wu, S Pan, X Zhu - 2020 IEEE international conference on …, 2020 - ieeexplore.ieee.org
In traditional graph learning tasks, such as node classification, learning is carried out in a
closed-world setting where the number of classes and their training samples are provided to …