Article ID: 408076 | Journal: Neurocomputing | Published Year: 2011 | Pages: 8 | File Type: PDF
Abstract

In recent years, nonparallel-plane classifiers such as the Multisurface Proximal Support Vector Machine via Generalized Eigenvalues (GEPSVM) and the Least Squares Twin Support Vector Machine (LSTSVM) have attracted much attention. However, no modification of these classifiers has been proposed that automatically selects input features, which motivates the search for new classifiers. In this paper, we develop a new nonparallel-plane classifier designed to automatically select the relevant features. We first introduce a Tikhonov regularization (TR) term, commonly used to regularize least squares problems, into the LSTSVM learning framework, and then convert this formulation into a linear programming (LP) problem. By minimizing an exterior penalty (EP) problem of the dual of this LP with a fast generalized Newton algorithm, our method yields very sparse solutions, so that the resulting classifier depends on only a small number of input features. In other words, the approach is capable of suppressing input features, which makes the classifier cheaper to store and faster to evaluate in the classification phase. Finally, experiments on both toy and real-world problems demonstrate the effectiveness of our method.
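To illustrate the central idea of the abstract, that a very sparse weight vector acts as an implicit feature selector, here is a minimal sketch using L1 regularization solved by ISTA (proximal gradient). This is not the paper's generalized-Newton exterior-penalty LP algorithm; it is an analogous, simpler sparsity mechanism, and the data, regularization strength, and iteration count below are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of the L1 norm: shrink each entry toward zero."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def sparse_linear_fit(X, y, lam=0.1, n_iter=500):
    """Minimize (1/2n)||Xw - y||^2 + lam*||w||_1 by ISTA.

    Entries of w driven exactly to zero correspond to suppressed
    input features, mirroring the feature-selection effect that the
    paper obtains from its sparse LP solution.
    """
    n, d = X.shape
    # Step size = 1 / Lipschitz constant of the smooth part's gradient
    L = np.linalg.eigvalsh(X.T @ X / n).max()
    step = 1.0 / L
    w = np.zeros(d)
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y) / n
        w = soft_threshold(w - step * grad, step * lam)
    return w

# Toy problem (assumed data): 10 features, only the first 2 relevant.
rng = np.random.default_rng(0)
n, d = 200, 10
X = rng.standard_normal((n, d))
y = np.sign(X[:, 0] - X[:, 1])   # labels depend on features 0 and 1 only

w = sparse_linear_fit(X, y)
selected = np.flatnonzero(np.abs(w) > 1e-8)
print("selected features:", selected)  # irrelevant features are (mostly) zeroed out
```

A classifier built on such a `w` needs to store and evaluate only the few nonzero coordinates, which is exactly the storage and classification-speed benefit the abstract claims for its sparse solutions.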

Related Topics
Physical Sciences and Engineering › Computer Science › Artificial Intelligence