Article ID: 6865910
Journal: Neurocomputing
Published Year: 2015
Pages: 25 Pages
File Type: PDF
Abstract
This paper introduces a wrapper method, the cosine similarity measure support vector machine (CSMSVM), which eliminates irrelevant or redundant features during classifier construction by introducing the cosine distance into support vector machines (SVM). Traditional feature selection approaches typically extract features and learn SVM parameters independently, or operate in the attribute space, which may discard information relevant to the classification process or increase the classification error when a kernel SVM is introduced. The proposed CSMSVM framework, in contrast, jointly performs feature selection and SVM parameter learning, removing low-relevance features by optimizing the shape of an anisotropic RBF kernel in feature space. Moreover, a Bayesian interpretation places the proposed method on a solid theoretical foundation, and the iterative algorithm proposed to optimize the feature weights is shown to maximize the posterior, yielding the maximum a posteriori (MAP) estimate. In experimental comparisons with well-known feature selection techniques, CSMSVM outperformed the other methodologies, achieving higher pattern recognition accuracy with fewer features.
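For illustration only, the following minimal Python sketch (not the authors' implementation) shows the anisotropic-RBF idea underlying CSMSVM: each feature is scaled by a weight before the RBF kernel is evaluated, and the resulting kernel is plugged into a standard SVM through scikit-learn's custom-kernel interface. The fixed weight vector w used here is a hypothetical stand-in for the weights that CSMSVM would learn jointly with the classifier, where weights driven toward zero effectively remove the corresponding features.

# Illustrative sketch: an anisotropic RBF kernel with per-feature weights,
# used with a standard SVM. The weight vector `w` is fixed here; in CSMSVM
# it would be optimized jointly with the SVM parameters.
import numpy as np
from sklearn.svm import SVC
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

def anisotropic_rbf(w, gamma=1.0):
    """Return k(x, y) = exp(-gamma * sum_d w_d^2 * (x_d - y_d)^2) as a Gram-matrix callable."""
    def kernel(X, Y):
        Xw = X * w                      # scale each feature by its weight
        Yw = Y * w
        sq_dists = (np.sum(Xw ** 2, axis=1)[:, None]
                    + np.sum(Yw ** 2, axis=1)[None, :]
                    - 2.0 * Xw @ Yw.T)  # pairwise squared distances in the weighted space
        return np.exp(-gamma * np.maximum(sq_dists, 0.0))
    return kernel

X, y = make_classification(n_samples=200, n_features=10, n_informative=4, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Hypothetical weights: near-zero entries mimic features that have been "removed".
w = np.array([1.0, 1.0, 1.0, 1.0, 0.1, 0.1, 0.0, 0.0, 0.0, 0.0])

clf = SVC(kernel=anisotropic_rbf(w, gamma=0.5))
clf.fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))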
Related Topics
Physical Sciences and Engineering > Computer Science > Artificial Intelligence