| Article ID | Journal | Published Year | Pages | File Type |
|---|---|---|---|---|
| 404909 | Neural Networks | 2006 | 15 | |
We propose a novel algorithm, Terminated Ramp–Support Vector Machines (TR–SVM), for classification and feature ranking within the family of Support Vector Machines. The main improvement is that the kernel is determined automatically by the training examples: it is built as a function of simple classifiers (generalized terminated ramp functions) obtained by separating oppositely labeled pairs of training points. The algorithm has a meaningful geometrical interpretation and is derived in the framework of Tikhonov regularization theory. Its only free parameter is the regularization parameter, which controls the trade-off between empirical error and solution complexity. Exploiting the equivalence between the proposed algorithm and two-layer networks, a theoretical bound on the generalization error is also derived, together with the Vapnik–Chervonenkis dimension. Performance is evaluated on a number of synthetic and real data sets.
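To make the construction concrete, the following Python sketch illustrates the general idea under stated assumptions: a clipped (terminated) ramp unit is built for each oppositely labeled training pair, a data-driven kernel is formed from the resulting feature map, and the coefficients are fit with a single Tikhonov regularization parameter. The particular ramp form, the helper names (`terminated_ramp`, `tr_kernel`, `fit_tikhonov`), and the regularized least-squares fit are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def terminated_ramp(x, x_pos, x_neg):
    """Illustrative 'terminated ramp' unit for one oppositely labeled pair:
    a linear response along the direction from x_neg to x_pos, scaled so the
    pair maps to -1 and +1, then clipped (terminated) to [-1, 1].
    The exact functional form in the paper may differ."""
    w = x_pos - x_neg
    mid = 0.5 * (x_pos + x_neg)
    t = 2.0 * np.dot(x - mid, w) / np.dot(w, w)  # +1 at x_pos, -1 at x_neg
    return np.clip(t, -1.0, 1.0)

def tr_kernel(X1, X2, X_train, y_train):
    """Data-driven kernel: inner product of terminated-ramp features built
    from all oppositely labeled training pairs (hypothetical construction)."""
    pairs = [(i, j) for i in range(len(y_train)) for j in range(len(y_train))
             if y_train[i] > 0 > y_train[j]]
    def features(X):
        return np.array([[terminated_ramp(x, X_train[i], X_train[j])
                          for (i, j) in pairs] for x in X])
    return features(X1) @ features(X2).T

def fit_tikhonov(K, y, lam):
    """Tikhonov-regularized fit in the kernel-induced space:
    solve (K + lam * I) c = y for the coefficient vector c."""
    return np.linalg.solve(K + lam * np.eye(len(y)), y)

# Toy usage on synthetic data: the single free parameter is lam.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 2))
y = np.sign(X[:, 0])
K = tr_kernel(X, X, X, y)
c = fit_tikhonov(K, y, lam=0.1)
pred = np.sign(K @ c)
```

In this sketch the kernel is entirely induced by the training set, so model selection reduces to choosing the regularization strength `lam`, mirroring the abstract's claim of a single free parameter.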