Article ID: 409360
Journal: Neurocomputing
Published Year: 2007
Pages: 6 Pages
File Type: PDF
Abstract

Support vector machines (SVMs) are known to lead to a quadratic programming problem, which incurs a large computational cost. To reduce this cost, this paper considers, from a geometrical point of view, two incremental (iterative) SVMs with homogeneous hyperplanes. One method is shown to produce the same solution as an SVM trained in batch mode, with linear complexity on average, by exploiting the fact that only the effective examples are necessary and sufficient for the solution. The other method, which stores the set of support vectors instead of the effective examples, is quantitatively shown to have lower performance, although it is easier to implement.

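The abstract does not spell out the update rule, so the following is only a minimal Python sketch of the general idea it describes: an incremental linear SVM with a homogeneous hyperplane (no bias term, obtained here via scikit-learn's LinearSVC with fit_intercept=False) that retains only examples violating the current margin and re-solves the batch problem on that retained set. The class name IncrementalHomogeneousSVM, the margin test, and the toy data are illustrative assumptions, not the paper's actual algorithm.

import numpy as np
from sklearn.svm import LinearSVC

class IncrementalHomogeneousSVM:
    """Grow a retained set; re-solve the batch problem only on margin violations."""

    def __init__(self, C=1.0):
        # fit_intercept=False keeps the hyperplane homogeneous (w . x = 0, no bias).
        self.svm = LinearSVC(C=C, fit_intercept=False)
        self.retained_X = []   # retained candidate examples
        self.retained_y = []
        self.fitted = False

    def partial_fit(self, x, y):
        # Process one labelled example (y in {-1, +1}).
        if self.fitted:
            margin = y * self.svm.decision_function(x.reshape(1, -1))[0]
            if margin >= 1.0:
                # Correctly classified outside the margin: the example cannot
                # change the current solution, so it is discarded.
                return
        self.retained_X.append(x)
        self.retained_y.append(y)
        if len(set(self.retained_y)) == 2:   # need both classes to train
            self.svm.fit(np.vstack(self.retained_X), np.array(self.retained_y))
            self.fitted = True

# Usage on a toy 2-D stream: the two classes are shifted to opposite sides of
# the origin, so a homogeneous hyperplane can separate them.
rng = np.random.default_rng(0)
y = np.where(np.arange(200) % 2 == 0, 1.0, -1.0)
X = rng.normal(size=(200, 2)) + y[:, None] * 2.0
model = IncrementalHomogeneousSVM(C=1.0)
for xi, yi in zip(X, y):
    model.partial_fit(xi, yi)
print("retained examples:", len(model.retained_X))

In this sketch, most streamed examples fall outside the margin and are discarded, so the retained set stays small; this mirrors, in spirit, the abstract's point that only a small subset of examples (the effective examples) determines the solution.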
Related Topics
Physical Sciences and Engineering > Computer Science > Artificial Intelligence
Authors