Article ID: 408616
Journal: Neurocomputing
Published Year: 2007
Pages: 7
File Type: PDF
Abstract

We propose a novel and fast algorithm to train support vector machines (SVMs) in primal space. The algorithm solves an approximate optimization of SVMs that is unconstrained, continuous, and twice differentiable, using the Newton optimization technique. Further, we devise a special pre-extracting procedure that speeds up convergence by supplying a high-quality initial solution. Theoretical analysis shows that the proposed algorithm produces an ε-approximate solution to standard SVMs while maintaining low computational complexity. Experimental results on benchmark data sets demonstrate that our algorithm is much faster than dual-based methods such as SVMlight while achieving similar test accuracy.
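The general idea of primal SVM training with Newton's method can be sketched as follows. This is an illustrative implementation using the squared hinge loss as the smooth, twice-differentiable surrogate; it is a stand-in for the paper's own approximation (whose exact form is not given in the abstract), and the function name and parameters are assumptions.

```python
import numpy as np

def primal_svm_newton(X, y, lam=1e-2, n_iter=20, tol=1e-6):
    """Train a linear SVM in the primal by Newton's method.

    Uses the squared hinge loss as a continuous, twice-differentiable
    surrogate objective -- an illustrative stand-in for the paper's
    smooth approximation, not its exact formulation.
    """
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iter):
        margins = y * (X @ w)
        sv = margins < 1                      # points violating the margin
        if not sv.any():
            break
        # gradient of lam/2 * ||w||^2 + sum_i max(0, 1 - m_i)^2
        g = lam * w - 2 * X[sv].T @ (y[sv] * (1 - margins[sv]))
        # Hessian: lam * I + 2 * X_sv^T X_sv (objective is piecewise quadratic,
        # so a full Newton step minimizes the currently active quadratic piece)
        H = lam * np.eye(d) + 2 * X[sv].T @ X[sv]
        step = np.linalg.solve(H, g)
        w -= step
        if np.linalg.norm(step) < tol:
            break
    return w
```

In the same spirit as the paper's pre-extracting procedure, a high-quality initial solution could be supplied by replacing the zero initialization of `w` with a solution trained on a small subsample of the data, reducing the number of Newton iterations needed on the full set.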

Related Topics
Physical Sciences and Engineering · Computer Science · Artificial Intelligence