Article ID: 536521
Journal: Pattern Recognition Letters
Published Year: 2011
Pages: 5 Pages
File Type: PDF
Abstract

We reformulate the Quadratic Programming Feature Selection (QPFS) method in a kernel space to obtain a vector that maximizes the quadratic objective function of QPFS. We demonstrate that the vector obtained by Kernel Quadratic Programming Feature Selection is equivalent to the Kernel Fisher vector; this yields a new interpretation of Kernel Fisher discriminant analysis and provides computational advantages for highly unbalanced datasets.

► Kernelization of the quadratic programming feature selection (QPFS) algorithm.
► Proof of the equivalence with Kernel Fisher discriminant (KFD).
► New solution and interpretation of the KFD direction.
► More efficient computation of KFD vector when the classes are highly unbalanced.
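To make the object of the equivalence claim concrete, the following is a minimal sketch of the standard two-class kernel Fisher discriminant direction (in the Mika et al. dual formulation), not the paper's kernel-QPFS derivation; the helper names `rbf_kernel` and `kfd_alpha`, the RBF kernel choice, the regularization, and the unbalanced toy dataset are all illustrative assumptions. The abstract's claim is that the vector maximizing the kernelized QPFS objective coincides with the `alpha` computed here.

```python
# Sketch only: standard kernel Fisher discriminant (KFD) coefficients,
# illustrating the vector the abstract relates to kernelized QPFS.
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian (RBF) kernel matrix between rows of X and rows of Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kfd_alpha(X, y, gamma=1.0, reg=1e-3):
    """Dual KFD coefficients alpha; projection of x is sum_i alpha_i k(x_i, x)."""
    K = rbf_kernel(X, X, gamma)                  # full kernel matrix, n x n
    n = len(y)
    class_means = []
    N = np.zeros((n, n))                         # within-class scatter (dual form)
    for c in np.unique(y):
        idx = np.where(y == c)[0]
        Kc = K[:, idx]                           # kernel columns of class c, n x n_c
        class_means.append(Kc.mean(axis=1))      # dual representation of class mean
        H = np.eye(len(idx)) - np.full((len(idx), len(idx)), 1.0 / len(idx))
        N += Kc @ H @ Kc.T                       # centered scatter contribution
    M_diff = class_means[1] - class_means[0]     # between-class direction (dual)
    alpha = np.linalg.solve(N + reg * np.eye(n), M_diff)   # regularized solve
    return alpha, K

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Highly unbalanced toy data: 95 points in class 0, 5 points in class 1.
    X = np.vstack([rng.normal(0, 1, (95, 2)), rng.normal(3, 1, (5, 2))])
    y = np.array([0] * 95 + [1] * 5)
    alpha, K = kfd_alpha(X, y)
    scores = K @ alpha                           # projections onto the KFD direction
    print("mean score class 0:", scores[y == 0].mean())
    print("mean score class 1:", scores[y == 1].mean())
```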
