Article ID: 9653359
Journal: Neurocomputing
Published Year: 2005
Pages: 15
File Type: PDF
Abstract
The design of structures and algorithms for non-MAP multiclass decision problems is discussed in this paper. We propose a parametric family of loss functions that provides accurate estimates of the posterior class probabilities near the decision regions, and we discuss learning algorithms based on the stochastic gradient minimization of these loss functions. We show that these algorithms behave like sample selectors: samples near the decision regions are the most relevant during learning. It is also shown that these loss functions can be seen as an alternative to support vector machine (SVM) classifiers for low-dimensional feature spaces. Experimental results on some real data sets are provided to show the effectiveness of this approach compared with the classical cross entropy, which is based on global posterior probability estimation.
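The abstract does not give the authors' parametric loss family, so the sketch below uses a focal-style weight, (1 - p_y)^gamma, as an illustrative stand-in that likewise concentrates stochastic-gradient updates on samples close to the decision boundary. All names (gamma, lr, n_classes) are placeholders, and this is a minimal sketch of the general idea, not the paper's algorithm.

```python
import numpy as np

def softmax(z):
    z = z - z.max()               # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def sgd_boundary_weighted(X, y, n_classes, gamma=2.0, lr=0.1, epochs=20, seed=0):
    """Per-sample (stochastic) gradient descent on a linear softmax model.

    Each update is scaled by (1 - p_y)**gamma, so confident samples far from
    the decision boundary contribute little, mimicking the 'sample selector'
    behaviour described in the abstract. The weight is treated as a constant
    (no gradient flows through it) to keep the update rule simple.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W = np.zeros((n_classes, d))
    b = np.zeros(n_classes)
    for _ in range(epochs):
        for i in rng.permutation(n):
            p = softmax(W @ X[i] + b)         # posterior class probability estimates
            w_i = (1.0 - p[y[i]]) ** gamma    # large only near the decision boundary
            g = p.copy()
            g[y[i]] -= 1.0                    # gradient of cross-entropy w.r.t. logits
            W -= lr * w_i * np.outer(g, X[i])
            b -= lr * w_i * g
    return W, b
```

With gamma = 0 the weight is constant and the update reduces to ordinary stochastic-gradient cross-entropy training; larger gamma values shift the effective training set toward samples near the decision regions.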
Keywords
Related Topics
Physical Sciences and Engineering Computer Science Artificial Intelligence
Authors