Authors
Xiaoming Liu, Jianwei Yin, Jinxiang Dong, Memon Abdul Ghafoor
Publication date
2005
Conference paper
Advances in Web-Age Information Management: 6th International Conference, WAIM 2005, Hangzhou, China, October 11–13, 2005. Proceedings 6
Pages
162-171
Publisher
Springer Berlin Heidelberg
Description
Boosting is a supervised learning method that has been applied successfully to many different domains and has proven to be one of the best performers in text classification tasks so far. FloatBoost learning uses a backtrack mechanism after each iteration of AdaBoost learning to minimize the error rate directly, rather than minimizing an exponential function of the margin as in the traditional AdaBoost algorithm. This paper presents an improved FloatBoost algorithm for boosting Naïve Bayes text classification, called DifBoost, which combines the Divide and Conquer principle with the FloatBoost algorithm. Integrating FloatBoost with the Divide and Conquer principle, DifBoost divides the input space into a few sub-spaces during the training process, and the final classifier is formed as a weighted combination of basic classifiers, where each basic classifier is affected differently by the different sub-spaces …
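The abstract describes boosting Naïve Bayes weak learners with a FloatBoost-style backtracking step. Below is a minimal, hypothetical Python sketch of that general idea, assuming binary labels, scikit-learn's MultinomialNB as the weak learner, and a simple "remove a past learner if the ensemble's training error drops" backtracking criterion; it is not the paper's DifBoost algorithm (in particular, it omits the Divide and Conquer partitioning of the input space), and the function name floatboost_nb is invented for illustration.

```python
# Hypothetical sketch: AdaBoost over Naive Bayes weak learners with a
# FloatBoost-style backtracking step. Details are assumptions, not the
# algorithm from the paper.
import numpy as np
from sklearn.naive_bayes import MultinomialNB


def floatboost_nb(X, y, rounds=10):
    """Boost Naive Bayes classifiers on binary labels y in {0, 1}; after each
    round, backtrack by removing any earlier learner whose removal lowers the
    ensemble's training error."""
    n = X.shape[0]
    w = np.full(n, 1.0 / n)            # sample weights
    ensemble = []                      # list of (alpha, fitted learner)

    def ensemble_error(members):
        # Training error of the weighted-vote ensemble.
        if not members:
            return 1.0
        scores = sum(a * np.where(clf.predict(X) == 1, 1.0, -1.0)
                     for a, clf in members)
        pred = np.where(scores >= 0, 1, 0)
        return float(np.mean(pred != y))

    for _ in range(rounds):
        clf = MultinomialNB().fit(X, y, sample_weight=w)
        pred = clf.predict(X)
        err = np.sum(w[pred != y]) / np.sum(w)
        if err == 0 or err >= 0.5:
            break
        alpha = 0.5 * np.log((1 - err) / err)
        ensemble.append((alpha, clf))

        # AdaBoost-style reweighting: emphasize misclassified samples.
        sign = np.where(pred == y, -1.0, 1.0)
        w *= np.exp(alpha * sign)
        w /= w.sum()

        # Backtracking: greedily drop past learners while doing so reduces
        # the ensemble's training error (assumed criterion).
        improved = True
        while improved and len(ensemble) > 1:
            improved = False
            base = ensemble_error(ensemble)
            for i in range(len(ensemble)):
                trial = ensemble[:i] + ensemble[i + 1:]
                if ensemble_error(trial) < base:
                    ensemble = trial
                    improved = True
                    break
    return ensemble
```

In this sketch the backtracking loop targets the training error rate directly, mirroring the abstract's point that FloatBoost minimizes the error rate rather than the exponential margin loss used by plain AdaBoost.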
Total citations
Cited by 12 (per-year citation chart, 2009–2016, omitted)
Scholar articles
X Liu, J Yin, J Dong, MA Ghafoor - Advances in Web-Age Information Management: 6th …, 2005