Article Code | Journal Code | Publication Year | English Article | Full-Text Version |
---|---|---|---|---|
409157 | 679057 | 2008 | 11-page PDF | Free download |
In this contribution, we introduce a new on-line approximate maximal margin learning algorithm based on an extension of the perceptron algorithm. This extension, which we call the fixed margin perceptron (FMP), finds the solution of a linearly separable learning problem for a given fixed margin. It is shown that this algorithm converges in (R^2 − γ_f^2)/(γ* − γ_f)^2 updates, where γ_f < γ* is the fixed margin, γ* is the optimum margin, and R is the radius of the ball that circumscribes the data. The incremental margin algorithm (IMA) approximates the large margin solution by successively running FMP with increasing margin values. This incremental approach always guarantees a good solution at hand. It is also easy to implement and avoids quadratic programming methods. IMA was tested on several different data sets and yields results similar to those found by an SVM.
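The FMP update and the IMA margin schedule described above can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the geometric-margin violation test, the warm start between FMP calls, and the epoch cap are assumptions made for the sketch.

```python
import numpy as np

def fmp(X, y, gamma_f, w0=None, max_epochs=1000):
    """Fixed Margin Perceptron (sketch): make a perceptron-style update
    whenever an example's geometric margin falls below the fixed target
    gamma_f. On linearly separable data this converges when gamma_f is
    below the optimum margin gamma*; the epoch cap guards the loop when
    that assumption fails."""
    w = np.zeros(X.shape[1]) if w0 is None else w0.astype(float).copy()
    for _ in range(max_epochs):
        updated = False
        for xi, yi in zip(X, y):
            norm = np.linalg.norm(w)
            # treat w == 0 as a violation; otherwise test the geometric margin
            if norm == 0.0 or yi * (w @ xi) / norm < gamma_f:
                w = w + yi * xi          # classical perceptron correction
                updated = True
        if not updated:                  # every margin >= gamma_f: converged
            return w
    return w

def ima(X, y, margin_schedule):
    """Incremental Margin Algorithm (sketch): run FMP with an increasing
    schedule of margin values, warm-starting each call from the previous
    solution, so a usable classifier is available at every stage."""
    w = None
    for gamma_f in margin_schedule:
        w = fmp(X, y, gamma_f, w0=w)
    return w
```

In practice the schedule would approach (but stay below) an estimate of the optimum margin γ*; the incremental structure means stopping early still leaves a valid fixed-margin solution.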
Journal: Neurocomputing - Volume 71, Issues 7–9, March 2008, Pages 1550–1560