Article Code | Journal Code | Publication Year | English Article | Full-Text Version |
---|---|---|---|---|
388515 | 660926 | 2011 | 8-page PDF | Free download |

The Support Vector Machine (SVM) achieves state-of-the-art performance in many real applications. Its superior performance stems largely from maximizing the between-class margin, or loosely speaking, from making full use of the discriminative information carried by between-class samples. In this paper, we exploit not only such sample-level discriminative information but also the discrimination of individual features, and develop a feature discrimination incorporated SVM (FDSVM). Instead of minimizing the l2-norm of the feature weight vector, which imposes an equal penalty on all weight components in SVM learning, FDSVM penalizes each weight by an amount that decreases with the corresponding feature's discrimination measure, so that features with better discrimination are given greater importance. Experiments on both toy and real UCI datasets demonstrate that FDSVM often achieves better performance with comparable efficiency.
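The following is a minimal sketch of the idea described in the abstract, not the authors' exact formulation: it assumes a Fisher-score-like discrimination measure and uses the fact that, for a linear kernel, a per-feature weight penalty that decreases with discrimination is equivalent to rescaling each feature by the square root of its discrimination score before training a standard SVM. The dataset, penalty normalization, and scikit-learn solver are illustrative choices only.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC
from sklearn.metrics import accuracy_score

def fisher_scores(X, y):
    """Per-feature Fisher scores: between-class scatter over within-class scatter."""
    classes = np.unique(y)
    overall_mean = X.mean(axis=0)
    between = np.zeros(X.shape[1])
    within = np.zeros(X.shape[1])
    for c in classes:
        Xc = X[y == c]
        between += len(Xc) * (Xc.mean(axis=0) - overall_mean) ** 2
        within += ((Xc - Xc.mean(axis=0)) ** 2).sum(axis=0)
    return between / (within + 1e-12)

# Load and standardize a binary-classification dataset (illustrative choice).
X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
scaler = StandardScaler().fit(X_tr)
X_tr, X_te = scaler.transform(X_tr), scaler.transform(X_te)

# Discrimination measure on training data; penalizing weight j by 1/score_j is,
# for a linear kernel, equivalent to rescaling feature j by sqrt(score_j).
scores = fisher_scores(X_tr, y_tr)
scale = np.sqrt(scores / scores.mean())  # normalize so the average penalty matches plain SVM

baseline = LinearSVC(C=1.0, max_iter=10000).fit(X_tr, y_tr)
weighted = LinearSVC(C=1.0, max_iter=10000).fit(X_tr * scale, y_tr)

print("plain linear SVM        :", accuracy_score(y_te, baseline.predict(X_te)))
print("discrimination-weighted :", accuracy_score(y_te, weighted.predict(X_te * scale)))
```

The rescaling trick only reproduces the unequal-penalty objective for linear models; the paper's FDSVM may instead solve the modified quadratic program directly and may use a different discrimination measure.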
Journal: Expert Systems with Applications - Volume 38, Issue 10, 15 September 2011, Pages 12506–12513