Article ID | Journal ID | Year | Original article | Full text |
---|---|---|---|---|
407312 | 678137 | 2012 | 11-page PDF (English) | Free download |

Least squares support vector regression (LSSVR) is an efficient method for function estimation. However, its solution is sensitive to large noise and outliers, since it minimizes the sum of squared errors (SSE) over the training samples. To address this problem, this paper proposes a novel regression model, termed recursive robust LSSVR (R2LSSVR), which yields robust estimates for data contaminated by outliers. The idea is to build a regression model in the kernel space based on the maximum correntropy criterion and a regularization technique. An iterative algorithm derived from half-quadratic optimization is then developed to solve R2LSSVR, with theoretically guaranteed convergence. This derivation also reveals that R2LSSVR is closely related to the original LSSVR, since it essentially solves an adaptively weighted LSSVR at each iteration. Furthermore, a hyperparameter selection method based on particle swarm optimization (PSO) is presented so that the multiple hyperparameters of R2LSSVR can be estimated effectively for better performance. The feasibility of the method is examined on simulated and benchmark datasets, and the experimental results demonstrate its robustness.
Journal: Neurocomputing - Volume 97, 15 November 2012, Pages 63–73
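The abstract describes solving R2LSSVR by half-quadratic optimization, which reduces to iteratively solving an adaptively weighted LSSVR. The sketch below illustrates that idea: each iteration solves the weighted LSSVR KKT linear system, then recomputes per-sample weights from a correntropy-induced (Gaussian) function of the residuals, so gross outliers receive near-zero weight. This is a minimal illustration of the general technique, not the authors' implementation; the function names, the RBF kernel choice, and all hyperparameter values are assumptions for the example.

```python
import numpy as np

def rbf_kernel(X1, X2, width=1.0):
    # Gaussian (RBF) kernel matrix; `width` is an assumed kernel hyperparameter
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * width ** 2))

def robust_lssvr(X, y, gamma=100.0, width=0.5, sigma=0.5, n_iter=20, tol=1e-6):
    """Iteratively reweighted LSSVR under a correntropy-induced loss
    (half-quadratic style): each iteration solves a weighted LSSVR
    linear system, then sets w_i = exp(-e_i^2 / (2 sigma^2))."""
    n = len(y)
    K = rbf_kernel(X, X, width)
    w = np.ones(n)                       # start from uniform weights (plain LSSVR)
    alpha, b = np.zeros(n), 0.0
    for _ in range(n_iter):
        # Weighted LSSVR KKT system:
        # [ 0   1^T                      ] [ b     ]   [ 0 ]
        # [ 1   K + diag(1/(gamma * w))  ] [ alpha ] = [ y ]
        A = np.zeros((n + 1, n + 1))
        A[0, 1:] = 1.0
        A[1:, 0] = 1.0
        A[1:, 1:] = K + np.diag(1.0 / (gamma * w))
        sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
        b_new, alpha_new = sol[0], sol[1:]
        e = y - (K @ alpha_new + b_new)           # training residuals
        w = np.exp(-e ** 2 / (2 * sigma ** 2))    # correntropy-induced weights
        w = np.maximum(w, 1e-12)                  # keep the system non-singular
        if np.max(np.abs(alpha_new - alpha)) < tol and abs(b_new - b) < tol:
            alpha, b = alpha_new, b_new
            break
        alpha, b = alpha_new, b_new
    return alpha, b

def predict(X_train, alpha, b, X_test, width=0.5):
    return rbf_kernel(X_test, X_train, width) @ alpha + b
```

A gross outlier produces a large residual, so its weight collapses toward zero and later iterations effectively ignore it, whereas plain LSSVR (a single solve with uniform weights) would let it distort the whole fit.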