Article ID: 6863236
Journal: Neural Networks
Published Year: 2016
Pages: 26
File Type: PDF
Abstract
The Nonparallel Support Vector Machine (NPSVM), which is more flexible and generalizes better than the typical SVM, is widely used for classification. Although solvers and toolboxes such as SMO and LIBSVM can be applied to NPSVM, it remains hard to scale to millions of samples. In this paper, we propose a divide-and-combine method for large-scale nonparallel support vector machines (DCNPSVM). In the division step, DCNPSVM partitions the samples into smaller subsets so that the corresponding subproblems can be solved independently. We show, both theoretically and experimentally, that the objective value, solutions, and support vectors obtained by DCNPSVM are close to those of the whole NPSVM problem. In the combination step, the sub-solutions are combined into an initial iterate from which the whole problem is solved by global coordinate descent, which then converges quickly. To balance accuracy and efficiency, we adopt a multi-level structure that outperforms state-of-the-art methods. Moreover, DCNPSVM can handle imbalanced problems efficiently by tuning its parameters. Experimental results on many large data sets demonstrate the effectiveness of our method in terms of memory usage, classification accuracy, and training time.
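To make the divide-and-combine idea concrete, below is a minimal sketch of the two steps the abstract describes: partition the data, solve each subproblem independently, then use the combined sub-solutions as a warm start for a global solve. It is not the authors' DCNPSVM implementation; a standard linear SVM with hinge loss stands in for the NPSVM subproblem, and the k-means partitioning, subgradient solver, and all parameter names (train_svm, divide_and_combine, k, C, lr, epochs) are illustrative assumptions.

```python
# A minimal divide-and-combine sketch, assuming a linear SVM (hinge loss)
# as a stand-in for the NPSVM subproblem solver. The partitioning scheme,
# solver, and parameters are illustrative, not the paper's implementation.
import numpy as np
from sklearn.cluster import KMeans

def train_svm(X, y, w0, C=1.0, lr=0.01, epochs=50):
    """Subgradient descent on the L2-regularized hinge loss, started at w0."""
    w = w0.copy()
    for _ in range(epochs):
        margins = y * (X @ w)
        mask = margins < 1                        # samples violating the margin
        grad = w - C * (y[mask, None] * X[mask]).sum(axis=0)
        w -= lr * grad
    return w

def divide_and_combine(X, y, k=4, C=1.0):
    # Division step: cluster the data and solve each subproblem independently.
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    sub_w = [train_svm(X[labels == i], y[labels == i],
                       w0=np.zeros(X.shape[1]), C=C) for i in range(k)]
    # Combination step: average the sub-solutions into an initial iterate,
    # then run a few cheap passes of the global solver from that warm start.
    w_init = np.mean(sub_w, axis=0)
    return train_svm(X, y, w0=w_init, C=C, epochs=10)

# Toy usage: two Gaussian blobs with labels in {-1, +1}.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 1, (200, 2)), rng.normal(2, 1, (200, 2))])
y = np.hstack([-np.ones(200), np.ones(200)])
w = divide_and_combine(X, y)
print("training accuracy:", np.mean(np.sign(X @ w) == y))
```

The warm start is the point of the combination step: because each sub-solution already fits its own partition, their combination lies near the global optimum, so the final global pass needs far fewer iterations than solving from scratch.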
Related Topics
Physical Sciences and Engineering > Computer Science > Artificial Intelligence
Authors
, , ,