| Article ID | Journal | Published Year | Pages | File Type |
|---|---|---|---|---|
| 391529 | Information Sciences | 2015 | 16 Pages | |
This paper presents a new approach to the optimization problem that arises when the L2-SVM is considered in its primal form. In particular, we propose five variants of a Barzilai–Borwein (BB) update step for the classic Stochastic Gradient Descent (SGD) algorithm. The evaluation is designed to assess the effectiveness of the proposed methods in large-scale scenarios in terms of execution time, convergence, and sensitivity to the choice of initial parameters. The results are compared with those of well-known linear SVM algorithms and indicate that the convergence of the proposed methods is very similar to that reported in other studies. Moreover, our approach is much less sensitive to the choice of initial parameters, which allows for a substantial reduction of pre-processing.
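To make the idea concrete, the following is a minimal sketch of a BB step length combined with gradient descent on the primal L2-SVM (squared hinge loss). The paper proposes five BB variants for SGD; the sketch below shows only the classic BB1 step on full gradients, and the function names, the regularization parameter `lam`, the step safeguard, and the toy data are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def l2svm_obj(w, X, y, lam):
    """Primal L2-SVM objective: (lam/2)||w||^2 + mean(max(0, 1 - y*Xw)^2)."""
    margins = np.maximum(1.0 - y * (X @ w), 0.0)
    return 0.5 * lam * (w @ w) + np.mean(margins ** 2)

def l2svm_grad(w, X, y, lam):
    """Gradient of the objective above (squared hinge is differentiable)."""
    active = np.maximum(1.0 - y * (X @ w), 0.0)   # nonzero where margin < 1
    return lam * w - 2.0 * (X.T @ (active * y)) / len(y)

def bb_gradient_descent(X, y, lam=0.01, n_iter=50, eta0=0.1):
    """Gradient descent with the classic BB1 step eta = (s's)/(s'd),
    where s = w_k - w_{k-1} and d = g_k - g_{k-1} (hypothetical defaults)."""
    w = np.zeros(X.shape[1])
    g = l2svm_grad(w, X, y, lam)
    w_new = w - eta0 * g                          # first step uses a fixed size
    for _ in range(n_iter):
        g_new = l2svm_grad(w_new, X, y, lam)
        s, d = w_new - w, g_new - g
        curv = s @ d
        # Fall back to eta0 when curvature is not positive; clip for safety.
        eta = np.clip((s @ s) / curv, 1e-4, 10.0) if curv > 1e-12 else eta0
        w, g = w_new, g_new
        w_new = w - eta * g
    return w_new

# Toy linearly separable data for demonstration only.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = np.sign(X[:, 0] + 0.5 * X[:, 1])
w = bb_gradient_descent(X, y)
acc = np.mean(np.sign(X @ w) == y)
```

The BB step adapts the step length from successive parameter and gradient differences, which is why such methods can be less sensitive to the initial step-size choice than plain SGD with a hand-tuned learning-rate schedule.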