Ensemble learning

M Sewell - RN, 2008 - academia.edu
This note presents a chronological review of the literature on ensemble learning which has
accumulated over the past twenty years. The idea of ensemble learning is to employ multiple …
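
The combining step the note refers to can be made concrete with a minimal sketch (not taken from the note itself): train several base learners on bootstrap resamples and merge their predictions by majority vote. The use of scikit-learn decision trees, the iris data, and 25 ensemble members are illustrative assumptions.

import numpy as np
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
rng = np.random.default_rng(0)

# Train an ensemble of depth-limited trees, each on a bootstrap resample of the data.
ensemble = []
for _ in range(25):
    idx = rng.integers(0, len(X), size=len(X))          # bootstrap resample
    ensemble.append(DecisionTreeClassifier(max_depth=3).fit(X[idx], y[idx]))

# Combine the members' predictions by majority vote.
votes = np.stack([m.predict(X) for m in ensemble])      # shape: (n_members, n_samples)
majority = np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, votes)
print("training accuracy of the vote:", (majority == y).mean())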

Evaluation of classifiers for an uneven class distribution problem

S Daskalaki, I Kopanas, N Avouris - Applied Artificial Intelligence, 2006 - Taylor & Francis
Classification problems with uneven class distributions present several difficulties during the
training as well as during the evaluation process of classifiers. A classification problem with …
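
One of the evaluation difficulties the abstract alludes to can be shown in a few lines: with a 95:5 class split (made-up numbers), overall accuracy rewards a classifier that ignores the minority class entirely, which is why class-sensitive measures are needed.

import numpy as np

# A 95:5 class split: a classifier that always predicts the majority class
# looks accurate but never finds the minority class.
y_true = np.array([0] * 95 + [1] * 5)
y_pred = np.zeros_like(y_true)                 # degenerate "always majority" classifier

accuracy = (y_pred == y_true).mean()
minority_recall = (y_pred[y_true == 1] == 1).mean()
print(accuracy, minority_recall)               # 0.95 vs 0.0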

Dynamic classifier ensemble model for customer classification with imbalanced class distribution

J Xiao, L Xie, C He, X Jiang - Expert Systems with Applications, 2012 - Elsevier
Customer classification is widely used in customer relationship management including
churn prediction, credit scoring, cross-selling and so on. In customer classification, an …

Is independence good for combining classifiers?

LI Kuncheva, CJ Whitaker, CA Shipp… - … Conference on Pattern …, 2000 - ieeexplore.ieee.org
Independence between individual classifiers is typically viewed as an asset in classifier
fusion. We study the limits on the majority vote accuracy when combining dependent …
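
A rough simulation of the quantity the paper studies, with illustrative numbers (three classifiers, individual accuracy 0.7): when the members' errors are independent the majority vote beats any single member, while strongly dependent errors pull the vote back toward the individual accuracy.

import numpy as np

rng = np.random.default_rng(1)
n, p = 100_000, 0.7                              # samples and per-classifier accuracy

# Independent classifiers: each one errs on its own random subset of the samples.
indep = rng.random((3, n)) < p                   # True = correct decision
vote_indep = indep.sum(axis=0) >= 2

# Strongly dependent classifiers: two of them err on exactly the same samples.
base = rng.random(n) < p
dep = np.stack([base, base, rng.random(n) < p])
vote_dep = dep.sum(axis=0) >= 2

print("independent majority vote:", vote_indep.mean())   # above p when errors are independent
print("dependent majority vote:  ", vote_dep.mean())     # close to p when errors coincide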

Soft combination of neural classifiers: A comparative study

A Verikas, A Lipnickas, K Malmqvist… - Pattern Recognition …, 1999 - Elsevier
This paper presents four schemes for soft fusion of the outputs of multiple classifiers. In the
first three approaches, the weights assigned to the classifiers or groups of them are data …
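
The four schemes themselves are not reproduced here, but the basic operation they build on, a weighted average of the classifiers' soft (posterior-probability) outputs, can be sketched as follows; the probabilities and weights are made-up values.

import numpy as np

# Posterior-probability outputs of three classifiers for four samples over three classes.
probs = np.array([
    [[0.6, 0.3, 0.1], [0.2, 0.5, 0.3], [0.1, 0.2, 0.7], [0.4, 0.4, 0.2]],
    [[0.5, 0.4, 0.1], [0.3, 0.4, 0.3], [0.2, 0.1, 0.7], [0.3, 0.5, 0.2]],
    [[0.7, 0.2, 0.1], [0.1, 0.6, 0.3], [0.2, 0.2, 0.6], [0.5, 0.3, 0.2]],
])
weights = np.array([0.5, 0.2, 0.3])        # e.g. derived from validation accuracy; sums to 1

fused = np.tensordot(weights, probs, axes=1)   # weighted average of the soft outputs
print(fused.argmax(axis=1))                    # fused class decisions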

Robust speech recognition using articulatory information

K Kirchhoff - PhD Thesis, University of Bielefeld, Bielefeld, Germany, 1999 - Citeseer
Whereas most state-of-the-art speech recognition systems use spectral or cepstral
representations of the speech signal, there have also been some promising attempts at …

Clustering-and-selection model for classifier combination

LI Kuncheva - KES'2000. Fourth International Conference on …, 2000 - ieeexplore.ieee.org
We devise a simple clustering-and-selection algorithm based on a probabilistic
interpretation of classifier selection. First, the data set is clustered into K clusters, and then …
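
A simplified sketch of the two steps named in the abstract, using scikit-learn and a synthetic dataset as stand-ins: cluster the training data into K clusters, select the best-performing pool member within each cluster, and route each test point to the classifier selected for its nearest cluster. The probabilistic interpretation used in the paper is not reproduced.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=600, n_features=10, random_state=0)
X_tr, y_tr, X_te, y_te = X[:400], y[:400], X[400:], y[400:]

# Pool of base classifiers trained on the full training set.
pool = [LogisticRegression(max_iter=1000).fit(X_tr, y_tr),
        DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_tr, y_tr),
        KNeighborsClassifier(5).fit(X_tr, y_tr)]

# Step 1: cluster the training data into K clusters.
K = 4
km = KMeans(n_clusters=K, n_init=10, random_state=0).fit(X_tr)

# Step 2: in each cluster, select the pool member with the best local accuracy.
best = {}
for k in range(K):
    mask = km.labels_ == k
    accs = [(clf.predict(X_tr[mask]) == y_tr[mask]).mean() for clf in pool]
    best[k] = pool[int(np.argmax(accs))]

# Prediction: route each test point to its nearest cluster's selected classifier.
pred = np.array([best[c].predict(x.reshape(1, -1))[0]
                 for x, c in zip(X_te, km.predict(X_te))])
print("test accuracy:", (pred == y_te).mean())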

New algorithms for efficient high-dimensional nonparametric classification.

T Liu, AW Moore, A Gray, C Cardie - Journal of Machine Learning Research, 2006 - jmlr.org
This paper is about non-approximate acceleration of high-dimensional nonparametric
operations such as k nearest neighbor classifiers. We attempt to exploit the fact that even if …
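
The paper's own acceleration algorithms are not reproduced here, but the operation being accelerated, an exact k-nearest-neighbour query, and the general idea of non-approximate speed-up through a spatial index can be illustrated with a ball tree; the dataset sizes below are arbitrary.

import numpy as np
from sklearn.neighbors import BallTree

rng = np.random.default_rng(0)
X = rng.normal(size=(5_000, 30))         # reference set
q = rng.normal(size=(5, 30))             # query points

# Brute force: exact k nearest neighbours by scanning every reference point.
d = np.linalg.norm(X[None, :, :] - q[:, None, :], axis=2)
brute = np.argsort(d, axis=1)[:, :5]

# Ball-tree index: the same exact answer, obtained by pruning most of the scan.
dist, indexed = BallTree(X).query(q, k=5)

print(np.array_equal(np.sort(brute, axis=1), np.sort(indexed, axis=1)))   # True: results agree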

Random sampling for subspace face recognition

X Wang, X Tang - International Journal of Computer Vision, 2006 - Springer
Subspace face recognition often suffers from two problems: (1) the training sample set is
small compared with the high-dimensional feature vector; (2) the performance is sensitive to …
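
The random-sampling idea behind the paper can be sketched in its generic form; the handwritten-digits data and LDA base classifiers below are stand-ins for the face images and subspace methods used in the paper. Each classifier is trained on a random subset of the features, and the members are fused by majority vote.

import numpy as np
from sklearn.datasets import load_digits
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

rng = np.random.default_rng(0)
members = []
for _ in range(15):
    feats = rng.choice(X.shape[1], size=32, replace=False)    # random feature subspace
    clf = LinearDiscriminantAnalysis().fit(X_tr[:, feats], y_tr)
    members.append((feats, clf))

# Combine the subspace classifiers by majority vote.
votes = np.stack([clf.predict(X_te[:, feats]) for feats, clf in members])
pred = np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, votes)
print("test accuracy:", (pred == y_te).mean())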

Texture classification using dominant neighborhood structure

FM Khellah - IEEE Transactions on Image Processing, 2011 - ieeexplore.ieee.org
This paper proposes a new approach to extract global image features for the purpose of
texture classification. The proposed texture features are obtained by generating an …