Article ID: 408889
Journal: Neurocomputing
Published Year: 2008
Pages: 13 Pages
File Type: PDF
Abstract

Because the loss function is central to statistical learning, this paper proposes adding heavier penalties to the heterogeneous examples of a dataset to obtain a stricter convex loss function for optimization. The concept was realized by changing the class labels of support vector machines (SVM) into larger real values. Using these magnified real-valued class labels to convey the additional penalties, an elementary stage-wise classifier was developed to achieve high training accuracy. This article presents the underlying theory and the corresponding properties of the stage-wise classifier for further applications. Two types of re-weighting rules were devised to connect consecutive stages and produce the heavier penalties. Compared with a qualified underlying prototype, the empirical results showed that the classification complexity of the proposed classifier increased as its accuracy improved under the various additional penalties. Although the stricter penalties may cause undesirable over-fitting, the flexible re-weighting strategy remains beneficial for some applications.
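
The abstract describes a stage-wise scheme in which misclassified ("heterogeneous") examples receive heavier penalties at the next stage. The following is a minimal sketch of that general idea only, not the paper's algorithm: it conveys the extra penalty through per-example sample weights rather than magnified real-valued class labels, and the doubling rule for misclassified points is a hypothetical choice made purely for illustration.

# Sketch of a stage-wise re-weighting loop (illustrative only).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=10, random_state=0)

weights = np.ones(len(y))              # start with uniform penalties
for stage in range(3):                 # a few consecutive stages
    clf = SVC(kernel="rbf", C=1.0)
    clf.fit(X, y, sample_weight=weights)
    wrong = clf.predict(X) != y
    weights[wrong] *= 2.0              # hypothetical rule: double the penalty on misclassified examples
    print(f"stage {stage}: training accuracy = {1 - wrong.mean():.3f}")

As in the abstract, pushing the penalties on hard examples higher tends to raise training accuracy and classifier complexity together, which is why the over-fitting caveat applies.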

Related Topics
Physical Sciences and Engineering Computer Science Artificial Intelligence
Authors