Article Code | Journal Code | Publication Year | English Article | Full-Text Version |
---|---|---|---|---|
6938639 | 1449963 | 2018 | 29-page PDF | Free download |
English Title of the ISI Article
A novel formulation of orthogonal polynomial kernel functions for SVM classifiers: The Gegenbauer family
Related Topics
Engineering and Basic Sciences
Computer Engineering
Computer Vision and Pattern Recognition
First-Page Preview
![First page of the article: A novel formulation of orthogonal polynomial kernel functions for SVM classifiers: The Gegenbauer family](/preview/png/6938639.png)
English Abstract
Orthogonal polynomial kernels have recently been introduced to enhance support vector machine classifiers by reducing their number of support vectors. Previous works have studied these kernels as isolated cases and discussed only particular aspects. In this paper, a novel formulation of orthogonal polynomial kernels that includes and improves previous proposals (Legendre, Chebyshev and Hermite) is presented. Two undesired effects that must be avoided in order to use orthogonal polynomial kernels are identified and resolved: the Annihilation and the Explosion effects. The proposed formulation is studied by introducing a new family of orthogonal polynomial kernels based on Gegenbauer polynomials and comparing it against other kernels. Experimental results reveal that the Gegenbauer family competes with the RBF kernel in accuracy while requiring fewer support vectors, and outperforms other classical and orthogonal kernels.
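The abstract does not give the paper's exact formulation, so the following is only a rough sketch of the general idea of an orthogonal-polynomial SVM kernel: build a per-feature Gram term as a truncated sum of products of Gegenbauer polynomials and multiply across features. The function name, the degree, and the Gegenbauer parameter α are illustrative assumptions, not the paper's actual kernel.

```python
import numpy as np
from scipy.special import eval_gegenbauer
from sklearn.datasets import make_classification
from sklearn.preprocessing import MinMaxScaler
from sklearn.svm import SVC

def gegenbauer_kernel(X, Z, degree=3, alpha=0.5):
    """Gram matrix of a generic Gegenbauer polynomial kernel (a sketch,
    not the paper's formulation):
        K(x, z) = prod_i  sum_{n=0}^{degree} C_n^alpha(x_i) * C_n^alpha(z_i)
    Inputs are assumed to be scaled to [-1, 1], the natural domain of
    Gegenbauer polynomials."""
    K = np.ones((X.shape[0], Z.shape[0]))
    for i in range(X.shape[1]):
        acc = np.zeros((X.shape[0], Z.shape[0]))
        for n in range(degree + 1):
            px = eval_gegenbauer(n, alpha, X[:, i])
            pz = eval_gegenbauer(n, alpha, Z[:, i])
            acc += np.outer(px, pz)  # rank-1 PSD term per polynomial degree
        K *= acc  # product over features keeps the kernel PSD
    return K

# Toy usage: scikit-learn's SVC accepts a callable kernel that returns
# the Gram matrix between two sample sets.
X, y = make_classification(n_samples=200, n_features=4, random_state=0)
X = MinMaxScaler(feature_range=(-1, 1)).fit_transform(X)
clf = SVC(kernel=gegenbauer_kernel).fit(X, y)
print(clf.score(X, y), len(clf.support_))
```

Note that the multiplicative form across features is exactly where the "Explosion effect" mentioned in the abstract can arise (kernel values growing with dimensionality and degree), which is one of the issues the paper's formulation is said to resolve.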
Publisher
Database: Elsevier - ScienceDirect
Journal: Pattern Recognition - Volume 84, December 2018, Pages 211-225
Authors
Luis Carlos Padierna, Martín Carpio, Alfonso Rojas-Domínguez, Héctor Puga, Héctor Fraire