Article ID: 403871
Journal: Neural Networks
Published Year: 2015
Pages: 11
File Type: PDF
Abstract

ν-Support Vector Regression (ν-SVR) is an effective regression learning algorithm that has the advantage of using a parameter ν to control the number of support vectors and to adjust the width of the tube automatically. However, compared to ν-Support Vector Classification (ν-SVC) (Schölkopf et al., 2000), ν-SVR introduces an additional linear term into its objective function. Thus, directly applying the accurate on-line ν-SVC algorithm (AONSVM) to ν-SVR will not produce an effective initial solution, and this is the main challenge in designing an incremental ν-SVR learning algorithm. To overcome this challenge, we propose a special procedure called initial adjustments in this paper. This procedure adjusts the weights of ν-SVC based on the Karush–Kuhn–Tucker (KKT) conditions to prepare an initial solution for the incremental learning. Combining the initial adjustments with the two steps of AONSVM produces an exact and effective incremental ν-SVR learning algorithm (INSVR). Theoretical analysis proves the existence of the three key inverse matrices, which are the cornerstones of the three steps of INSVR (including the initial adjustments), respectively. Experiments on benchmark datasets demonstrate that INSVR avoids infeasible updating paths as far as possible and successfully converges to the optimal solution. The results also show that INSVR is faster than batch ν-SVR algorithms with both cold and warm starts.
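To make the "additional linear term" concrete: in the dual formulations of Schölkopf et al. (2000), the ν-SVC objective is purely quadratic in the multipliers, while the ν-SVR dual carries a linear term in the targets y_i. A sketch of the two duals, with notation assumed from that reference rather than taken from this paper (k is the kernel function, ℓ the number of training points):

\max_{\alpha}\; -\tfrac{1}{2}\sum_{i,j}\alpha_i\alpha_j y_i y_j\, k(x_i,x_j)
\quad\text{s.t.}\quad 0 \le \alpha_i \le \tfrac{1}{\ell},\;\; \sum_i \alpha_i y_i = 0,\;\; \sum_i \alpha_i \ge \nu
\qquad (\nu\text{-SVC})

\max_{\alpha,\alpha^*}\; \sum_i (\alpha_i^{*}-\alpha_i)\, y_i \;-\; \tfrac{1}{2}\sum_{i,j}(\alpha_i^{*}-\alpha_i)(\alpha_j^{*}-\alpha_j)\, k(x_i,x_j)
\quad\text{s.t.}\quad \sum_i(\alpha_i-\alpha_i^{*}) = 0,\;\; 0 \le \alpha_i,\alpha_i^{*} \le \tfrac{C}{\ell},\;\; \sum_i(\alpha_i+\alpha_i^{*}) \le C\nu
\qquad (\nu\text{-SVR})

The linear term \sum_i (\alpha_i^{*}-\alpha_i) y_i is absent from the ν-SVC dual, which is why an AONSVM solution cannot serve directly as an initial solution for ν-SVR and the initial adjustments are needed.

For readers who want to see the role of the parameter ν in practice, the following is a minimal batch ν-SVR baseline using scikit-learn's NuSVR, of the kind INSVR is compared against in the experiments. This is not the paper's INSVR algorithm, which performs exact incremental updates and is not part of standard libraries; the data and hyperparameter values are illustrative assumptions.

import numpy as np
from sklearn.svm import NuSVR

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(200)

# nu upper-bounds the fraction of margin errors and lower-bounds the
# fraction of support vectors; the tube width is adjusted automatically.
model = NuSVR(nu=0.5, C=1.0, kernel="rbf", gamma=0.5)
model.fit(X, y)

print("support vectors:", model.support_.size, "of", X.shape[0])
print("fraction:", model.support_.size / X.shape[0])  # roughly >= nu

A batch "warm start" comparison, as in the paper's experiments, would refit this model on X plus newly arrived points, whereas an incremental algorithm like INSVR updates the existing KKT-consistent solution point by point.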

Related Topics
Physical Sciences and Engineering > Computer Science > Artificial Intelligence