A novel wrapper method for feature selection and its applications

G Chen, J Chen - Neurocomputing, 2015 - Elsevier
Abstract
This paper introduces a wrapper method, namely cosine similarity measure support vector machines (CSMSVM), to eliminate irrelevant or redundant features during classifier construction by introducing the cosine distance into support vector machines (SVM). Traditional feature selection approaches typically extract features and learn SVM parameters independently, or operate in the attribute space, which may discard information relevant to the classification process or increase the classification error when a kernel SVM is introduced. The proposed CSMSVM framework, in contrast, jointly performs feature selection and SVM parameter learning, removing low-relevance features by optimizing the shape of an anisotropic RBF kernel in feature space. Moreover, a Bayesian interpretation of the methodology places it on a solid theoretical foundation, and the proposed iterative algorithm for optimizing the feature weights maximizes the maximum a posteriori (MAP) estimate. In experiments comparing the novel method with well-known feature selection techniques, CSMSVM outperformed the other methodologies, improving pattern recognition accuracy with fewer features.
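The abstract alone does not specify the optimization details, but the general idea it describes (an SVM whose anisotropic RBF kernel carries per-feature weights that are tuned iteratively, after which low-weight features are pruned) can be sketched as follows. This is a minimal illustration under assumed choices, not the authors' CSMSVM algorithm: the weight-update heuristic (relevance measured by the cross-validation score drop when a feature is muted) and the pruning threshold are placeholders for the paper's MAP-based iteration.

```python
# Sketch (assumed, not the paper's algorithm): wrapper-style feature selection
# with a feature-weighted (anisotropic) RBF kernel SVM.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC


def anisotropic_rbf(weights, gamma=1.0):
    """RBF kernel k(x, z) = exp(-gamma * sum_j w_j^2 (x_j - z_j)^2)."""
    def kernel(X, Z):
        Xw, Zw = X * weights, Z * weights
        sq = (
            np.sum(Xw ** 2, axis=1)[:, None]
            - 2.0 * Xw @ Zw.T
            + np.sum(Zw ** 2, axis=1)[None, :]
        )
        return np.exp(-gamma * sq)
    return kernel


def wrapper_feature_weights(X, y, n_iter=3, prune_tol=1e-2):
    """Alternate SVM evaluation and per-feature weight updates (heuristic)."""
    n_features = X.shape[1]
    w = np.ones(n_features)
    for _ in range(n_iter):
        base = cross_val_score(SVC(kernel=anisotropic_rbf(w)), X, y, cv=3).mean()
        relevance = np.zeros(n_features)
        for j in range(n_features):
            w_j = w.copy()
            w_j[j] = 0.0  # mute feature j and measure the score drop
            drop = base - cross_val_score(
                SVC(kernel=anisotropic_rbf(w_j)), X, y, cv=3
            ).mean()
            relevance[j] = max(drop, 0.0)
        if relevance.max() == 0:
            break
        w = relevance / relevance.max()     # rescale weights into [0, 1]
    selected = np.where(w > prune_tol)[0]   # discard low-weight features
    return w, selected


X, y = load_breast_cancer(return_X_y=True)
X = StandardScaler().fit_transform(X)
weights, selected = wrapper_feature_weights(X, y)
print("kept", len(selected), "of", X.shape[1], "features")
```

The sketch keeps feature selection and classifier training coupled, which is the defining property of a wrapper method: feature weights are judged by the behavior of the kernel SVM itself rather than by a filter criterion computed in the attribute space.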