Article ID: 6938639
Journal: Pattern Recognition
Published Year: 2018
Pages: 29
File Type: PDF
Abstract
Orthogonal polynomial kernels have recently been introduced to enhance support vector machine classifiers by reducing the number of support vectors. Previous works have studied these kernels as isolated cases and discussed only particular aspects. In this paper, a novel formulation of orthogonal polynomial kernels that includes and improves on previous proposals (Legendre, Chebyshev and Hermite) is presented. Two undesired effects that must be avoided when using orthogonal polynomial kernels are identified and resolved: the Annihilation and the Explosion effects. The proposed formulation is studied by introducing a new family of orthogonal polynomial kernels based on Gegenbauer polynomials and comparing it against other kernels. Experimental results reveal that the Gegenbauer family matches the RBF kernel in accuracy while requiring fewer support vectors, and outperforms other classical and orthogonal polynomial kernels.
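To make the idea concrete, the following is a minimal sketch of a Gegenbauer-based polynomial kernel plugged into a support vector classifier. It is not the paper's exact formulation (the paper adds weighting terms to counter the Annihilation and Explosion effects, which are not reproduced here); it simply builds a positive semi-definite kernel as a product over input dimensions of truncated Gegenbauer feature-map inner products, using `scipy.special.eval_gegenbauer` and scikit-learn's support for callable kernels. The `degree` and `alpha` parameters are illustrative choices; scaling inputs into [-1, 1] keeps the polynomial values bounded (with alpha = 0.5 the Gegenbauer polynomials reduce to the Legendre polynomials).

```python
import numpy as np
from scipy.special import eval_gegenbauer
from sklearn.svm import SVC
from sklearn.datasets import make_moons
from sklearn.preprocessing import MinMaxScaler
from sklearn.model_selection import train_test_split

def make_gegenbauer_kernel(degree=3, alpha=0.5):
    """Return a kernel callable for sklearn's SVC.

    K(x, z) = prod_d sum_{n=0}^{degree} C_n^alpha(x_d) * C_n^alpha(z_d)

    Each per-dimension sum is an explicit feature-map inner product, so
    each factor is a PSD kernel; the elementwise (Schur) product of PSD
    Gram matrices is again PSD.  Inputs are assumed scaled to [-1, 1]
    per feature.  This is an illustrative construction, not the paper's
    exact weighted formulation.
    """
    def kernel(X, Z):
        K = np.ones((X.shape[0], Z.shape[0]))
        for d in range(X.shape[1]):
            S = np.zeros((X.shape[0], Z.shape[0]))
            for n in range(degree + 1):
                cx = eval_gegenbauer(n, alpha, X[:, d])
                cz = eval_gegenbauer(n, alpha, Z[:, d])
                S += np.outer(cx, cz)
            K *= S
        return K
    return kernel

# Toy usage: scale features into [-1, 1] before applying the kernel.
X, y = make_moons(n_samples=200, noise=0.2, random_state=0)
X = MinMaxScaler(feature_range=(-1, 1)).fit_transform(X)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = SVC(kernel=make_gegenbauer_kernel(degree=3, alpha=0.5))
clf.fit(X_tr, y_tr)
```

A callable kernel lets SVC compute the Gram matrix between any two sample sets, so the same function serves both training (X vs. X) and prediction (test vs. support vectors); the number of support vectors is available afterwards via `clf.n_support_`.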
Related Topics
Physical Sciences and Engineering Computer Science Computer Vision and Pattern Recognition