A robust least squares support vector machine for regression and classification with noise

X Yang, L Tan, L He - Neurocomputing, 2014 - Elsevier
Abstract
Least squares support vector machines (LS-SVMs) are sensitive to outliers and noise in the training dataset. Weighted least squares support vector machines (WLS-SVMs) can partly overcome this shortcoming by assigning different weights to different training samples. However, choosing those weights is difficult, and the choice strongly influences the robustness of WLS-SVMs. To avoid setting weights, this paper presents a novel robust LS-SVM (RLS-SVM) based on the truncated least squares loss function, for both regression and classification with noise. Using its equivalent model, we analyze theoretically why RLS-SVM is more robust than LS-SVMs and WLS-SVMs. To solve the proposed RLS-SVM, we develop an iterative algorithm based on the concave–convex procedure (CCCP) and the Newton algorithm. Statistical tests on experiments over fourteen benchmark regression datasets and ten benchmark classification datasets show that, compared with LS-SVMs, WLS-SVMs and the iteratively reweighted LS-SVM (IRLS-SVM), the proposed RLS-SVM significantly reduces the effect of noise in the training dataset and provides superior robustness.
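The key ideas in the abstract — a truncated squared loss that caps the penalty on large residuals, and a CCCP split of that non-convex loss into a difference of convex functions — can be sketched as follows. This is an illustrative sketch only, not the paper's implementation; the truncation threshold s and the exact decomposition min(r², s²) = r² − max(r² − s², 0) are standard choices assumed here for demonstration.

```python
import numpy as np

def truncated_sq_loss(r, s=1.0):
    # Truncated least squares loss: min(r^2, s^2).
    # Residuals beyond the (assumed) threshold s contribute a constant
    # penalty, so outliers cannot dominate the fit.
    return np.minimum(r ** 2, s ** 2)

def cccp_split(r, s=1.0):
    # CCCP view: write the non-convex loss as a difference of two
    # convex functions,
    #   min(r^2, s^2) = r^2 - max(r^2 - s^2, 0).
    # CCCP then iterates by linearizing the concave part (-max(...))
    # at the current solution, leaving a convex subproblem.
    convex = r ** 2
    concave_part = np.maximum(r ** 2 - s ** 2, 0.0)
    return convex - concave_part

r = np.array([0.3, 0.9, 2.5, -4.0])
# The decomposition reproduces the truncated loss exactly.
print(np.allclose(cccp_split(r), truncated_sq_loss(r)))
```

In a CCCP iteration each convex subproblem is a weighted least squares problem of the same form as LS-SVM, which is why the resulting solver can reuse Newton-type updates.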