Article ID: 694550
Journal: Acta Automatica Sinica
Published Year: 2009
Pages: 7 Pages
File Type: PDF
Abstract

Asymmetry is inherent in object detection tasks, where rare positive targets must be distinguished from an enormous number of negative patterns. That is, to achieve a higher detection rate, the cost of missing a target should exceed that of a false positive. Cost-sensitive learning is well suited to such problems. However, most cost-sensitive extensions of AdaBoost are realized by heuristically modifying the weights and confidence parameters of discrete AdaBoost. It remains unclear whether a unified framework exists that interprets these methods in the same way as AdaBoost, clarifies their relationships, and further yields superior real-valued cost-sensitive boosting algorithms. In this paper, starting from three different upper bounds on the asymmetric training error, we not only give a detailed discussion of the various discrete asymmetric AdaBoost algorithms and their relationships, but also derive real-valued asymmetric boosting algorithms in the form of additive logistic regression with analytical solutions, denoted Asym-Real AdaBoost and Asym-Gentle AdaBoost. Experiments on both face detection and pedestrian detection demonstrate that the proposed approaches are efficient and outperform previous AdaBoost methods and their discrete asymmetric extensions.
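To make the idea of cost-sensitive boosting concrete, the following is a minimal sketch of a heuristic discrete asymmetric AdaBoost of the kind the abstract refers to, using decision stumps as weak learners. The cost parameters `c_pos`/`c_neg`, the cost-weighted initialization, and all function names are illustrative assumptions, not the paper's exact Asym-Real/Asym-Gentle formulations.

```python
import numpy as np

def asymmetric_adaboost(X, y, c_pos=2.0, c_neg=1.0, n_rounds=20):
    """Discrete AdaBoost with asymmetric misclassification costs (sketch).

    Positives (y=+1) carry cost c_pos, negatives (y=-1) cost c_neg, so
    missing a rare target is penalized more than a false positive.
    Weak learners are axis-aligned decision stumps.
    """
    n, d = X.shape
    # Cost-weighted initial distribution: a common heuristic way to
    # inject asymmetry into standard discrete AdaBoost.
    w = np.where(y > 0, c_pos, c_neg).astype(float)
    w /= w.sum()
    stumps = []  # list of (feature, threshold, polarity, alpha)
    for _ in range(n_rounds):
        best, best_err = None, np.inf
        # Exhaustively pick the stump with lowest weighted error.
        for j in range(d):
            for thr in np.unique(X[:, j]):
                for pol in (1, -1):
                    pred = pol * np.where(X[:, j] <= thr, 1, -1)
                    err = w[pred != y].sum()
                    if err < best_err:
                        best_err, best = err, (j, thr, pol)
        eps = np.clip(best_err, 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - eps) / eps)  # standard AdaBoost weight
        j, thr, pol = best
        pred = pol * np.where(X[:, j] <= thr, 1, -1)
        # Re-weight: misclassified examples gain weight; the asymmetric
        # initialization keeps positives heavier throughout training.
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        stumps.append((j, thr, pol, alpha))
    return stumps

def predict(stumps, X):
    """Sign of the weighted vote of all learned stumps."""
    score = np.zeros(len(X))
    for j, thr, pol, alpha in stumps:
        score += alpha * pol * np.where(X[:, j] <= thr, 1, -1)
    return np.where(score >= 0, 1, -1)
```

With a large `c_pos`/`c_neg` ratio, the ensemble trades extra false positives for a higher detection rate on the rare positive class, which is the asymmetry the paper formalizes via upper bounds on the asymmetric training error.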

Related Topics
Physical Sciences and Engineering › Engineering › Control and Systems Engineering