Article ID | Journal ID | Year | English article | Full text |
---|---|---|---|---|
534776 | 870288 | 2012 | 12-page PDF | Free download |
Support vector machine (SVM) was initially designed for binary classification. To extend SVM to the multi-class scenario, a number of classification models have been proposed, such as the one by Crammer and Singer (2001). However, the number of variables in Crammer and Singer's dual problem is the product of the number of samples (l) and the number of classes (k), which results in high computational complexity. This paper presents a simplified multi-class SVM (SimMSVM) that reduces the size of the resulting dual problem from l × k to l by introducing a relaxed classification error bound. The experimental results demonstrate that the proposed SimMSVM approach can greatly speed up the training process while maintaining competitive classification accuracy.
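To make the size reduction concrete, the following minimal sketch (not from the paper; the function names and example values are illustrative assumptions) counts the dual variables in each formulation:

```python
# Illustration only: dual-problem sizes for the Crammer-Singer
# multi-class SVM versus the SimMSVM reduction described above.
# l = number of training samples, k = number of classes.

def crammer_singer_dual_size(l: int, k: int) -> int:
    # Crammer and Singer's dual has one variable per
    # (sample, class) pair, i.e. l * k variables in total.
    return l * k

def simmsvm_dual_size(l: int, k: int) -> int:
    # SimMSVM's relaxed error bound leaves one dual variable
    # per sample, i.e. l variables, independent of k.
    return l

# Hypothetical problem size: 10,000 samples, 26 classes.
l, k = 10_000, 26
print(crammer_singer_dual_size(l, k))  # 260000
print(simmsvm_dual_size(l, k))         # 10000
```

Since quadratic-programming solvers scale super-linearly in the number of variables, shrinking the dual by a factor of k is what drives the training speed-up reported in the paper.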
► This paper presents a simplified multi-class SVM (SimMSVM) obtained by introducing a relaxed error bound.
► The method reduces the size of the resulting dual problem from l × k to l.
► The experiments demonstrate that the proposed method speeds up the training process.
► Meanwhile, it maintains competitive classification accuracy.
Journal: Pattern Recognition Letters - Volume 33, Issue 1, 1 January 2012, Pages 71–82