Article ID Journal Published Year Pages File Type
534776 Pattern Recognition Letters 2012 12 Pages PDF
Abstract

Support vector machine (SVM) was initially designed for binary classification. To extend SVM to the multi-class scenario, a number of classification models have been proposed, such as the one by Crammer and Singer (2001). However, the number of variables in Crammer and Singer's dual problem is the product of the number of samples (l) and the number of classes (k), which leads to high computational complexity. This paper presents a simplified multi-class SVM (SimMSVM) that reduces the size of the resulting dual problem from l × k to l by introducing a relaxed classification error bound. Experimental results demonstrate that the proposed SimMSVM approach can greatly speed up the training process while maintaining competitive classification accuracy.
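The size reduction claimed in the abstract can be sketched as simple arithmetic. The function names below are hypothetical (not from the paper); they only illustrate how the dual-variable count scales in each formulation.

```python
def dual_size_crammer_singer(l, k):
    # Crammer-Singer dual: one variable per (sample, class) pair.
    return l * k

def dual_size_simmsvm(l, k):
    # SimMSVM's relaxed error bound yields one variable per sample,
    # independent of the number of classes.
    return l

# Hypothetical problem: l = 10000 samples, k = 10 classes.
print(dual_size_crammer_singer(10000, 10))  # 100000
print(dual_size_simmsvm(10000, 10))         # 10000
```

For a fixed sample count, the Crammer-Singer dual grows linearly with the number of classes, while the SimMSVM dual stays constant, which is the source of the reported training speed-up.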

► This paper presents a simplified multi-class SVM (SimMSVM) based on a relaxed error bound. ► The method reduces the size of the resulting dual problem from l × k to l. ► Experiments demonstrate that the proposed method speeds up the training process. ► Meanwhile, it maintains competitive classification accuracy.

Related Topics
Physical Sciences and Engineering Computer Science Computer Vision and Pattern Recognition