Article ID: 535700
Journal: Pattern Recognition Letters
Published Year: 2013
Pages: 7 Pages
File Type: PDF
Abstract

• SVMs often contain many SVs, which reduce runtime speeds of decision functions.
• The method removes redundant SVs in one iteration, greatly improving pruning speed.
• The existence and uniqueness of the fast pruning coefficients are shown.
• The nexus of primal and dual optimizations is illustrated geometrically.
• The method can be applied to other kernel-based machines without modifications.

Support vector machines (SVMs) often contain many support vectors (SVs), which slow the evaluation of their decision functions. To simplify the decision function and make the SVM more compact, efforts have been made to remove SVs from trained SVMs. By carefully designing some of the pruning coefficients and solving for the rest, this paper presents a simple method for quickly removing superfluous SVs. The method removes those SVs in a single iteration, substantially improving on the pruning speed of existing methods, which remove SVs one at a time. The existence and uniqueness of the fast pruning coefficients are proved, and the connection between the primal and dual optimizations is illustrated geometrically. The fast pruning method can also be applied to other kernel-based machines without modification. The computational complexity is discussed. Illustrative examples are given first, and experiments on larger data sets demonstrate the effectiveness of the fast simplification method.
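The general idea of pruning SVs and recomputing the remaining coefficients in one step can be sketched as follows. This is a minimal, generic reduced-set projection (recomputing coefficients by minimizing the RKHS distance of the pruned expansion to the original one), not a reproduction of the paper's specific coefficient design; the function names, the RBF kernel choice, and the ridge term are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of X and the rows of Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def prune_svs(sv, alpha, keep, gamma=1.0):
    """One-step pruning sketch: recompute coefficients beta for the kept
    SVs so that the pruned kernel expansion is the RKHS projection of the
    full one, i.e. beta = K_kk^{-1} K_ka alpha.  (Generic reduced-set
    projection; the paper's fast pruning coefficients may differ.)"""
    keep = np.asarray(keep)
    K_kk = rbf_kernel(sv[keep], sv[keep], gamma)   # kept-vs-kept Gram matrix
    K_ka = rbf_kernel(sv[keep], sv, gamma)         # kept-vs-all Gram matrix
    # Small ridge term for numerical stability of the solve.
    beta = np.linalg.solve(K_kk + 1e-10 * np.eye(len(keep)), K_ka @ alpha)
    return beta

def decision_value(x, centers, coefs, gamma=1.0):
    # Kernel expansion f(x) = sum_i coefs[i] * k(centers[i], x).
    return rbf_kernel(x[None, :], centers, gamma)[0] @ coefs

# Toy example: prune 2 of 6 SVs in a single solve.
rng = np.random.default_rng(0)
sv = rng.normal(size=(6, 2))
alpha = rng.normal(size=6)
keep = [0, 2, 3, 5]
beta = prune_svs(sv, alpha, keep)
x = rng.normal(size=2)
full = decision_value(x, sv, alpha)
pruned = decision_value(x, sv[np.asarray(keep)], beta)
```

All removed SVs are absorbed at once by a single linear solve, which is what makes a one-iteration scheme much faster than pruning SVs one by one; when no SVs are removed, the projection returns the original coefficients unchanged.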

Related Topics
Physical Sciences and Engineering Computer Science Computer Vision and Pattern Recognition
Authors
, , , , , , , ,