| Article ID | Journal | Published Year | Pages |
|---|---|---|---|
| 531820 | Pattern Recognition | 2016 | 15 |
- A new kernel function is proposed based on Hermite orthogonal polynomials.
- New combined kernels, formed by pairing the Hermite kernel with common kernels, are proposed.
- The proposed methods are compared with three common kernels on real and synthetic data sets.
- The Hermite kernel requires the least classification time.
- The proposed methods achieve lower error rates and the greatest reduction in the number of support vectors.
The Support Vector Machine (SVM) is a popular method for classifying many types of data, but its main drawback is that classification speed drops considerably as problem size grows. In this paper, a new kernel function for SVM is proposed, derived from Hermite orthogonal polynomials. This kernel improves classification accuracy, reduces the number of support vectors, and increases classification speed. Overall kernel performance is evaluated on real-world data sets from the UCI repository using ten-fold cross-validation. Combinations of the Hermite kernel with common kernels are also proposed. Experimental results reveal that the Hermite–Chebyshev kernel, obtained by combining the Hermite and Chebyshev kernels, performs best in terms of the number of support vectors, while the Hermite–Gaussian kernel, obtained by combining the Hermite and Gaussian kernels, ranks first in error rate among all kernels tested. Furthermore, compared with the common kernels, the proposed Hermite kernel has the fewest support vectors and the best error rate, and it requires the lowest classification time among all kernels.
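To make the construction concrete, the sketch below implements one plausible form of a Hermite polynomial kernel for use with scikit-learn's `SVC`, together with one way of combining it with a Gaussian (RBF) kernel. This is an illustration under stated assumptions, not the paper's exact formulation: the per-dimension sum-and-product structure, the polynomial order `order`, the RBF width `gamma`, and the product rule for combining kernels are all assumptions here.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.preprocessing import MinMaxScaler
from sklearn.svm import SVC

def hermite_kernel(X, Z, order=4):
    """Gram matrix K[i, j] = prod_d sum_{n=0}^{order} H_n(x_d) * H_n(z_d),
    one plausible Hermite-polynomial kernel (an assumption, not necessarily
    the paper's definition). H_n are physicists' Hermite polynomials,
    evaluated with the recurrence H_{n+1}(t) = 2 t H_n(t) - 2 n H_{n-1}(t)."""
    def hermite_values(T):
        # Stack of H_0(T) ... H_order(T); result shape: (order+1, *T.shape)
        H = [np.ones_like(T), 2.0 * T]
        for n in range(1, order):
            H.append(2.0 * T * H[n] - 2.0 * n * H[n - 1])
        return np.stack(H[: order + 1])

    HX = hermite_values(X)  # (order+1, n_samples_X, n_features)
    HZ = hermite_values(Z)  # (order+1, n_samples_Z, n_features)
    # Sum over polynomial order for each (i, j, feature) triple ...
    S = np.einsum('nid,njd->ijd', HX, HZ)
    # ... then take the product over features to get the Gram matrix.
    return np.prod(S, axis=2)

def hermite_gaussian_kernel(X, Z, order=4, gamma=1.0):
    """One way to combine kernels: the elementwise product of the Hermite
    and RBF Gram matrices, which is a valid kernel by the Schur product
    theorem. The paper's actual combination rule may differ."""
    sq_dists = (np.sum(X**2, axis=1)[:, None]
                + np.sum(Z**2, axis=1)[None, :]
                - 2.0 * X @ Z.T)
    return hermite_kernel(X, Z, order) * np.exp(-gamma * sq_dists)

if __name__ == "__main__":
    X, y = load_iris(return_X_y=True)
    # Scale features to [-1, 1]: polynomial values grow rapidly outside
    # a bounded range, which would make the Gram matrix ill-conditioned.
    X = MinMaxScaler(feature_range=(-1, 1)).fit_transform(X)
    for name, kern in [("hermite", hermite_kernel),
                       ("hermite-gaussian", hermite_gaussian_kernel)]:
        clf = SVC(kernel=kern)  # SVC accepts a callable returning the Gram matrix
        scores = cross_val_score(clf, X, y, cv=10)
        print(f"{name}: mean 10-fold accuracy = {scores.mean():.3f}")
```

The input scaling step matters for any polynomial kernel of this form: without it, high-order Hermite values dominate the Gram matrix and the SVM optimization becomes numerically unstable.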