Article ID: 409157 · Journal: Neurocomputing · Published Year: 2008 · Pages: 11 · File Type: PDF
Abstract

In this contribution, we introduce a new online approximate maximal-margin learning algorithm based on an extension of the perceptron algorithm. This extension, which we call the fixed margin perceptron (FMP), finds the solution of a linearly separable learning problem for a given fixed margin. It is shown that this algorithm converges in (R² − γf²)/(γ* − γf)² updates, where γf < γ* is the fixed margin, γ* is the optimal margin, and R is the radius of the ball that circumscribes the data. The incremental margin algorithm (IMA) approximates the large-margin solution by running FMP successively with increasing margin values. This incremental approach always keeps a good solution at hand; it is also easy to implement and avoids quadratic programming. IMA was tested on several different data sets and yields results similar to those found by an SVM.
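The core update rule described above can be sketched as follows. This is a minimal illustrative implementation, not the authors' code: it assumes the simplest perceptron-style correction (w ← w + η·y·x) applied whenever a point's geometric margin falls below the fixed target γf; all function and variable names are hypothetical.

```python
import numpy as np

def fixed_margin_perceptron(X, y, gamma_f, eta=1.0, max_updates=10000):
    """Illustrative fixed-margin perceptron sketch: update on any point
    whose geometric margin y*(w.x)/||w|| is below the fixed margin gamma_f.
    Returns the weight vector and the number of updates performed."""
    w = np.zeros(X.shape[1])
    updates = 0
    while updates < max_updates:
        converged = True
        for xi, yi in zip(X, y):
            norm = np.linalg.norm(w)
            # With w = 0 the margin is undefined; force an update.
            margin = yi * np.dot(w, xi) / norm if norm > 0 else -1.0
            if margin < gamma_f:
                w = w + eta * yi * xi  # perceptron-style correction
                updates += 1
                converged = False
        if converged:
            break
    return w, updates

# Hypothetical usage on a small linearly separable set:
X = np.array([[2.0, 1.0], [1.0, 2.0], [-2.0, -1.0], [-1.0, -2.0]])
y = np.array([1, 1, -1, -1])
w, n = fixed_margin_perceptron(X, y, gamma_f=0.1)
```

On separable data with γf below the optimum margin γ*, the abstract's bound says the number of updates cannot exceed (R² − γf²)/(γ* − γf)²; IMA would then rerun this routine with progressively larger values of γf.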

Related Topics
Physical Sciences and Engineering Computer Science Artificial Intelligence