Article Code | Journal Code | Publication Year | English Article | Full Text |
---|---|---|---|---|
534721 | 870283 | 2012 | 9-page PDF | Free download |
The main advantage of kernel methods stems from the implicit transformation of patterns into a high-dimensional feature space; hence the choice of a kernel function and the proper setting of its parameters are of crucial importance. Learning a kernel from data requires evaluation measures to assess the quality of the kernel. In this paper, current state-of-the-art kernel evaluation measures are examined and their application to kernel optimization is verified, revealing the limitations of these methods. As a result, alternative evaluation measures are proposed that strive to overcome these disadvantages. Experimental results demonstrate that an optimization process leveraging the introduced measures yields kernels whose corresponding classifiers achieve a significantly lower error rate.
► State-of-the-art kernel evaluation measures exhibit severe limitations.
► Three alternative kernel evaluation measures are presented.
► Alternative evaluation measures overcome disadvantages of standard measures.
► An SVM with a kernel optimized using the alternative measures achieves higher accuracy.
► The proposed methodology enables effective kernel tuning without building a classifier.
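The highlights above concern measures that score a kernel's quality without training a classifier. As a minimal illustration of this idea, the sketch below computes kernel-target alignment (Cristianini et al.), a widely used baseline measure of this kind — it is shown here as a representative example only and is not necessarily one of the measures examined or proposed in the paper; the RBF kernel, the candidate `gamma` grid, and the toy data are all assumptions for the demo.

```python
import numpy as np

def rbf_kernel(X, gamma):
    # Gram matrix of the Gaussian (RBF) kernel k(x, z) = exp(-gamma * ||x - z||^2)
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def kernel_target_alignment(K, y):
    # Alignment between the Gram matrix K and the "ideal" kernel y y^T:
    # <K, yy^T>_F / (||K||_F * ||yy^T||_F)  -- in [-1, 1], higher is better.
    Y = np.outer(y, y)
    return np.sum(K * Y) / (np.linalg.norm(K) * np.linalg.norm(Y))

# Tune the RBF width on toy two-class data by maximizing alignment,
# i.e. select a kernel parameter without ever building a classifier.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, 0.3, (20, 2)), rng.normal(1.0, 0.3, (20, 2))])
y = np.array([-1] * 20 + [1] * 20)
best_gamma = max((0.01, 0.1, 1.0, 10.0),
                 key=lambda g: kernel_target_alignment(rbf_kernel(X, g), y))
```

In the same spirit as the paper's methodology, the parameter search above ranks kernels purely by an evaluation measure on the training data, so no SVM needs to be trained inside the selection loop.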
Journal: Pattern Recognition Letters - Volume 33, Issue 9, 1 July 2012, Pages 1108–1116