Article ID Journal Published Year Pages File Type
430356 Journal of Computational Science 2015 8 Pages PDF
Abstract

•We propose a new algorithm, LPTWSVM, for binary classification.
•LPTWSVM improves the recently proposed ITSVM and implements structural risk minimization.
•Since the primal problems are linear programs, we need neither to compute large inverse matrices nor to use any optimization trick.
•A kernel function can be introduced directly in the nonlinear case.
•We extend the new algorithm to multi-class classification and achieve effective classification.

This paper proposes a new algorithm, termed LPTWSVM, for the binary classification problem; it seeks two nonparallel hyperplanes and is an improved version of TWSVM. We improve the recently proposed ITSVM and develop a Generalized ITSVM. Choosing a linear function in the objective function of the Generalized ITSVM leads to the primal problems of LPTWSVM. Compared with TWSVM, a 1-norm regularization term is introduced into the objective function to implement structural risk minimization, and the quadratic programming problems become linear programming problems, which can be solved quickly and easily. Consequently, we need neither to compute large inverse matrices nor to use any optimization trick in solving our linear programs, and the dual problems are unnecessary. A kernel function can be introduced directly in the nonlinear case, which overcomes a serious drawback of TWSVM. We also extend LPTWSVM to the multi-class classification problem and obtain a new model, MLPTWSVM. Following the idea of MBSVM, MLPTWSVM constructs M hyperplanes such that the m-th hyperplane is as far as possible from the m-th class and as close as possible to the remaining classes. Numerical experiments verify that the new algorithms are very effective.
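To make the twin-hyperplane, linear-programming idea concrete, the following is a minimal sketch of how such a classifier can be posed as two linear programs and solved with `scipy.optimize.linprog`. It is not the paper's exact formulation: the 1-norm fit term, the penalty parameters `c1`/`c2`, and all variable names are illustrative assumptions. Each hyperplane is fit so that its own class lies close to it (1-norm residual), the other class is pushed past a unit margin with soft slack, and a 1-norm penalty on the weights stands in for the regularization term described above.

```python
import numpy as np
from scipy.optimize import linprog

def fit_plane(own, other, sign, c1=1.0, c2=1.0):
    """Solve one LP for a hyperplane w.x + b = 0 (illustrative formulation).

    Minimize  sum(u) + c1*sum(s) + c2*sum(xi)  over z = [w, b, u, s, xi]
      -u <= own @ w + b <= u        (own class close to the plane)
      sign*(other @ w + b) <= -1 + xi   (other class past the margin)
      -s <= w <= s                  (1-norm regularization on w)
      u, s, xi >= 0
    All constraints are linear, so no matrix inversion is needed.
    """
    m1, d = own.shape
    m2 = other.shape[0]
    # Objective coefficients over z = [w (d), b (1), u (m1), s (d), xi (m2)].
    c = np.concatenate([np.zeros(d + 1), np.ones(m1),
                        c1 * np.ones(d), c2 * np.ones(m2)])
    Z = lambda r, k: np.zeros((r, k))
    Iu, Is, Ixi = np.eye(m1), np.eye(d), np.eye(m2)
    one1, one2 = np.ones((m1, 1)), np.ones((m2, 1))
    A_ub = np.vstack([
        np.hstack([ own,  one1, -Iu, Z(m1, d), Z(m1, m2)]),        #  (Aw+b) - u <= 0
        np.hstack([-own, -one1, -Iu, Z(m1, d), Z(m1, m2)]),        # -(Aw+b) - u <= 0
        np.hstack([sign * other, sign * one2,
                   Z(m2, m1), Z(m2, d), -Ixi]),                    # sign*(Bw+b) - xi <= -1
        np.hstack([ Is, Z(d, 1), Z(d, m1), -Is, Z(d, m2)]),        #  w - s <= 0
        np.hstack([-Is, Z(d, 1), Z(d, m1), -Is, Z(d, m2)]),        # -w - s <= 0
    ])
    b_ub = np.concatenate([np.zeros(2 * m1), -np.ones(m2), np.zeros(2 * d)])
    bounds = [(None, None)] * (d + 1) + [(0, None)] * (m1 + d + m2)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    w, b = res.x[:d], res.x[d]
    return w, b

def fit_lptwsvm(X_pos, X_neg, c1=1.0, c2=1.0):
    """Fit both nonparallel hyperplanes, one LP per class."""
    plane_pos = fit_plane(X_pos, X_neg, sign=+1, c1=c1, c2=c2)
    plane_neg = fit_plane(X_neg, X_pos, sign=-1, c1=c1, c2=c2)
    return plane_pos, plane_neg

def predict(planes, X):
    """Assign each point to the class of its nearest hyperplane."""
    (w1, b1), (w2, b2) = planes
    d1 = np.abs(X @ w1 + b1) / (np.linalg.norm(w1) + 1e-12)
    d2 = np.abs(X @ w2 + b2) / (np.linalg.norm(w2) + 1e-12)
    return np.where(d1 <= d2, 1, -1)
```

For the multi-class extension sketched in the abstract, the decision rule is reversed in the MBSVM spirit: with M hyperplanes, a point is assigned to the class whose hyperplane is *farthest* from it, since each hyperplane is built to be far from its own class.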

Related Topics
Physical Sciences and Engineering Computer Science Computational Theory and Mathematics