Authors
Kaizhu Huang, Haiqin Yang, Irwin King, Michael R Lyu, Laiwan Chan
Publication date
2004
Journal
Journal of Machine Learning Research
Volume
5
Issue
Oct
Pages
1253-1286
Abstract
We construct a distribution-free Bayes optimal classifier called the Minimum Error Minimax Probability Machine (MEMPM) in a worst-case setting, i.e., under all possible choices of class-conditional densities with a given mean and covariance matrix. By assuming no specific distribution for the data, our model is thus distinguished from traditional Bayes optimal approaches, where an assumption on the data distribution is a must. This model is extended from the Minimax Probability Machine (MPM), a recently proposed novel classifier, and is demonstrated to be the general case of MPM. Moreover, it includes another special case, named the Biased Minimax Probability Machine, which is appropriate for handling biased classification. One appealing feature of MEMPM is that it contains an explicit performance indicator, i.e., a lower bound on the worst-case accuracy, which is shown to be tighter than that of MPM. We provide conditions under which the worst-case Bayes optimal classifier converges to the Bayes optimal classifier. We demonstrate how to apply a more general statistical framework to estimate model input parameters robustly. We also show how to extend our model to nonlinear classification by exploiting kernelization techniques. A series of experiments on both synthetic data sets and real-world benchmark data sets validates our proposition and demonstrates the effectiveness of our model.
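The worst-case accuracy bound mentioned in the abstract comes from the multivariate Chebyshev (Marshall-Olkin) inequality: over all distributions sharing a given mean and covariance, the infimum of the probability that a point falls on the correct side of a hyperplane has a closed form. The sketch below (an illustrative reconstruction, not the paper's optimization code; the hyperplane and class statistics are made-up toy values) evaluates that bound for each class and the weighted MEMPM-style objective θα + (1−θ)β for a *fixed* hyperplane; MEMPM itself maximizes this quantity over all hyperplanes.

```python
import numpy as np

def worst_case_accuracy(w, b, mean, cov, sign=+1):
    """Multivariate Chebyshev (Marshall-Olkin) bound:
    inf over all distributions with the given mean/covariance of
    P(sign * (w.x - b) >= 0) equals d^2 / (1 + d^2),
    where d = max(0, sign * (w.mean - b)) / sqrt(w.Cov.w)."""
    margin = sign * (w @ mean - b)
    d = max(0.0, margin) / np.sqrt(w @ cov @ w)
    return d * d / (1.0 + d * d)

# Toy two-class problem (hypothetical numbers): class x vs class y
mu_x, Sigma_x = np.array([2.0, 0.0]), np.eye(2)
mu_y, Sigma_y = np.array([-2.0, 0.0]), np.eye(2)

w, b = np.array([1.0, 0.0]), 0.0   # a fixed candidate hyperplane w.x = b

alpha = worst_case_accuracy(w, b, mu_x, Sigma_x, sign=+1)  # lower bound for class x
beta  = worst_case_accuracy(w, b, mu_y, Sigma_y, sign=-1)  # lower bound for class y

theta = 0.5                        # prior probability of class x
mempm_objective = theta * alpha + (1 - theta) * beta
# Here d = 2 for each class, so alpha = beta = 4/5 = 0.8.
```

Setting θ = 0.5 recovers the unweighted case; MPM instead constrains α = β, which is why MEMPM is the more general model.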