Article ID: 468869
Journal: Computers & Mathematics with Applications
Published Year: 2011
Pages: 5
File Type: PDF
Abstract

In this paper, a generalization of support vector machines is explored in which the input vectors of each class are measured with a different ℓp norm. It is proved that the optimization problem for binary classification under the maximal margin principle with ℓp and ℓq norms depends only on the ℓp norm when 1 ≤ p ≤ q. Furthermore, the selection of a different bias in the classifier function is a consequence of the ℓq norm in this approach. Some of the most commonly used SVM approaches are also discussed as particular cases.
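For reference, the classical hard-margin SVM primal that this setting generalizes is sketched below, together with one common ℓp-norm variant of the maximal-margin problem. Both are illustrative only; the paper's class-dependent ℓp/ℓq formulation is not reproduced here.

% Classical hard-margin SVM primal (the l2 case):
\begin{equation*}
  \min_{w,\,b}\ \tfrac{1}{2}\lVert w \rVert_2^{2}
  \quad \text{s.t.} \quad
  y_i \bigl( w^{\top} x_i + b \bigr) \ge 1, \qquad i = 1, \dots, n.
\end{equation*}

% Illustrative lp-norm variant of the maximal-margin problem (an assumption for
% context, not the paper's exact class-dependent formulation):
\begin{equation*}
  \min_{w,\,b}\ \lVert w \rVert_p
  \quad \text{s.t.} \quad
  y_i \bigl( w^{\top} x_i + b \bigr) \ge 1, \qquad i = 1, \dots, n, \quad p \ge 1.
\end{equation*}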

Related Topics
Physical Sciences and Engineering > Computer Science > Computer Science (General)