Article ID: 533469
Journal: Pattern Recognition
Published Year: 2012
Pages: 12 Pages
File Type: PDF
Abstract

Motivated by the potential field of static electricity, a binary potential function classifier views each training sample as an electrical charge, positive or negative according to its class label. The resulting potential field divides the feature space into two decision regions based on the polarity of the potential. In this paper, we revisit potential function classifiers in their original form and reveal their connections with other well-known results in the literature. We derive a bound on the generalization performance of multiclass potential function classifiers based on the observed margin distribution of the training data. A new model selection criterion using a normalized margin distribution is then proposed to learn “good” potential function classifiers in practice.
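Below is a minimal sketch, not the authors' exact formulation, of a binary potential function classifier in the spirit described above: each training sample acts as a unit charge with the sign of its label, and a query point is assigned the class given by the polarity of the superposed potential. The inverse-quadratic potential function and the parameter alpha are assumptions chosen for illustration.

```python
import numpy as np

def potential(x, X_train, y_train, alpha=1.0):
    """Superposed potential at point x induced by the charged training samples."""
    sq_dists = np.sum((X_train - x) ** 2, axis=1)   # squared distances ||x - x_i||^2
    kernel = 1.0 / (1.0 + alpha * sq_dists)         # decaying influence of each charge (assumed form)
    return np.dot(y_train, kernel)                  # signed superposition over all samples

def classify(x, X_train, y_train, alpha=1.0):
    """Label the query point by the polarity of the potential field."""
    return 1 if potential(x, X_train, y_train, alpha) >= 0 else -1

# Usage: two toy clusters with labels in {-1, +1}
X = np.array([[0.0, 0.0], [0.2, 0.1], [2.0, 2.0], [2.1, 1.9]])
y = np.array([-1, -1, +1, +1])
print(classify(np.array([0.1, 0.1]), X, y))   # expected: -1
print(classify(np.array([2.0, 2.1]), X, y))   # expected: +1
```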

► Reveal connections of potential function rules (PFRs) with Bayes decision theory. ► Derive a generalization bound on the performance of PFRs using the margin distribution. ► Propose a model selection criterion for PFRs using a normalized margin distribution.
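As a rough illustration of the margin-distribution idea in the last two highlights, the sketch below computes a normalized margin y_i · V(x_i) for each training sample under the same assumed potential as above; the normalization by the largest potential magnitude is an illustrative choice, not necessarily the criterion proposed in the paper.

```python
import numpy as np

def potential(x, X_train, y_train, alpha=1.0):
    """Same illustrative inverse-quadratic potential as in the earlier sketch."""
    return np.dot(y_train, 1.0 / (1.0 + alpha * np.sum((X_train - x) ** 2, axis=1)))

def normalized_margins(X_train, y_train, alpha=1.0):
    """Normalized margins y_i * V(x_i) / max_j |V(x_j)| over the training set."""
    V = np.array([potential(x, X_train, y_train, alpha) for x in X_train])
    return (y_train * V) / np.max(np.abs(V))   # scale-free margins in [-1, 1]

# A model-selection loop could compare candidate alphas by how far this
# distribution is shifted toward large positive values.
X = np.array([[0.0, 0.0], [0.2, 0.1], [2.0, 2.0], [2.1, 1.9]])
y = np.array([-1, -1, +1, +1])
print(normalized_margins(X, y))
```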

Related Topics
Physical Sciences and Engineering > Computer Science > Computer Vision and Pattern Recognition