Article ID: 534721
Journal: Pattern Recognition Letters
Published Year: 2012
Pages: 9
File Type: PDF
Abstract

The main advantage of kernel methods stems from the implicit transformation of patterns into a high-dimensional feature space, so the choice of a kernel function and the proper setting of its parameters are of crucial importance. Learning a kernel from the data requires evaluation measures to assess the quality of the kernel. In this paper, current state-of-the-art kernel evaluation measures are examined and their application to kernel optimization is verified, revealing the limitations of these methods. As a result, alternative evaluation measures are proposed that strive to overcome these disadvantages. Experimental results demonstrate that an optimization process leveraging the introduced measures yields kernels whose corresponding classifiers achieve a significantly lower error rate.

► State-of-the-art kernel evaluation measures exhibit severe limitations.
► Three alternative kernel evaluation measures are presented.
► The alternative evaluation measures overcome the disadvantages of the standard measures.
► An SVM with a kernel optimized using the alternative measures achieves higher accuracy.
► The proposed methodology enables effective kernel tuning without building a classifier.
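
The article's own measures are not reproduced here, but a minimal sketch of the classical kernel-target alignment measure illustrates the general idea the highlights describe: scoring a kernel matrix directly against the labels and tuning kernel parameters without training a classifier. The RBF kernel, the gamma grid, and the toy data below are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel

def kernel_target_alignment(K, y):
    """Cosine similarity between the kernel matrix K and the ideal
    target matrix y y^T, with labels y in {-1, +1} (kernel-target
    alignment; used here as a generic kernel evaluation measure)."""
    Y = np.outer(y, y)
    return np.sum(K * Y) / (np.linalg.norm(K, "fro") * np.linalg.norm(Y, "fro"))

# Illustrative tuning loop: choose the RBF width that maximizes the
# evaluation measure, without building any classifier. Data and the
# gamma grid are hypothetical.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = np.sign(X[:, 0] + 0.5 * rng.normal(size=100))

best_gamma = max((1e-2, 1e-1, 1.0, 10.0),
                 key=lambda g: kernel_target_alignment(rbf_kernel(X, gamma=g), y))
print("selected gamma:", best_gamma)
```

The selected kernel parameter would then be passed to the downstream classifier (e.g., an SVM); any gradient-based or grid-based optimizer can replace the simple grid search shown above.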

Related Topics
Physical Sciences and Engineering › Computer Science › Computer Vision and Pattern Recognition
Authors