Sparse learning for support vector classification

K. Huang, D. Zheng, J. Sun, Y. Hotta, K. Fujimoto, S. Naoi. Pattern Recognition Letters, 2010. Elsevier.
This paper presents a sparse learning algorithm for Support Vector Classification (SVC), called Sparse Support Vector Classification (SSVC), which produces sparse solutions by automatically setting irrelevant parameters exactly to zero. SSVC adopts an L0-norm regularization term and is trained by an iteratively reweighted learning algorithm. We show that the proposed approach admits a hierarchical-Bayes interpretation. Moreover, the model has close connections to several other sparse models. More specifically, one variant of the proposed method is equivalent to the zero-norm classifier of Weston et al. (2003); it is also an extended and more flexible framework parallel to the Sparse Probit Classifier of Figueiredo (2003). Theoretical justification and experimental evaluation on two synthetic datasets and seven benchmark datasets show that SSVC offers performance competitive with SVC while needing significantly fewer Support Vectors.
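The abstract describes the core mechanism only at a high level: an L0-norm penalty minimized by iterative reweighting, which drives irrelevant coefficients to exactly zero. The sketch below is not the paper's algorithm but a minimal, commonly used illustration of the same idea: each round solves a reweighted ridge problem in which a weight's penalty grows as that weight shrinks, so coefficients on irrelevant features collapse to (numerical) zero. All names, the least-squares surrogate loss, and the hyperparameters (`lam`, `eps`, the pruning threshold) are assumptions for illustration.

```python
import numpy as np

def irls_sparse_classifier(X, y, lam=1.0, eps=1e-6, n_iter=20):
    """Iteratively reweighted least-squares linear classifier.

    Approximates an L0 penalty: each coordinate w_i is penalized by
    lam / (w_i^2 + eps), so a weight that shrinks is penalized ever more
    strongly and is driven toward exactly zero, while weights on relevant
    features settle at a stable nonzero value.
    NOTE: illustrative sketch, not the SSVC algorithm from the paper.
    """
    n, d = X.shape
    w = np.ones(d)                                  # uniform starting point
    for _ in range(n_iter):
        # Reweighted ridge step: penalty matrix depends on the previous w.
        D = np.diag(lam / (w ** 2 + eps))
        w = np.linalg.solve(X.T @ X + D, X.T @ y)
    w[np.abs(w) < 1e-4] = 0.0                       # prune numerically-zero weights
    return w

# Synthetic check: 10 features, only the first 2 carry signal.
rng = np.random.default_rng(0)
n, d = 200, 10
X = rng.standard_normal((n, d))
true_w = np.zeros(d)
true_w[:2] = [2.0, -3.0]
y = np.sign(X @ true_w + 0.1 * rng.standard_normal(n))  # labels in {-1, +1}

w = irls_sparse_classifier(X, y, lam=1.0)
acc = np.mean(np.sign(X @ w) == y)
```

On this toy problem the reweighting typically retains only the two informative coordinates and zeroes out the rest, which mirrors the sparsity-versus-accuracy trade-off the abstract claims for SSVC against standard SVC.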