Article ID: 406816
Journal: Neurocomputing
Published Year: 2013
Pages: 14 Pages
File Type: PDF
Abstract

AdaBoost is well known to select, at each round, the weak classifier with the least sample-weighted error rate, which may be suboptimal for minimizing the naïve error rate. In this paper, a novel variant of AdaBoost named OtBoost is proposed to learn optimally thresholded node classifiers for a cascade face detector. In OtBoost, a two-stage weak classifier selection approach based on the adaptive boosting framework is applied to minimize both the sample-weighted error rate and the optimal-thresholded multi-set class-weighted error rate. In addition, a new sample set, called the selection set, is introduced to prevent overfitting on the training set. Several upright frontal cascade face detectors are trained; the experiments show that OtBoost strong classifiers converge much faster than AdaBoost ones, at the cost of slightly worse generalization. Some OtBoost-based cascade face detectors achieve acceptable performance on the CMU+MIT upright frontal face test set.
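The two ideas the abstract contrasts can be illustrated with a generic sketch (this is not the paper's actual algorithm; the stump learner, the cost weights, and the held-out "selection set" re-thresholding below are illustrative assumptions): stage one picks a weak classifier by the standard AdaBoost sample-weighted error, and stage two re-tunes its threshold on a separate selection set under a class-weighted cost, which for a cascade node typically penalizes false negatives more than false positives.

```python
import numpy as np

def best_stump(X, y, w):
    """Stage 1 (standard AdaBoost step): exhaustively pick the decision
    stump (feature, threshold, polarity) with least sample-weighted error."""
    n, d = X.shape
    best = (None, None, None, np.inf)  # (feature, threshold, polarity, error)
    for j in range(d):
        for t in np.unique(X[:, j]):
            for s in (1, -1):
                pred = np.where(s * (X[:, j] - t) > 0, 1, -1)
                err = np.sum(w[pred != y])
                if err < best[3]:
                    best = (j, t, s, err)
    return best

def retune_threshold(X_sel, y_sel, j, s, cost_fp=1.0, cost_fn=5.0):
    """Stage 2 (illustrative): re-pick the chosen feature's threshold on a
    held-out selection set, minimizing a class-weighted error in which
    false negatives (missed faces) cost more than false positives."""
    best_t, best_cost = None, np.inf
    for t in np.unique(X_sel[:, j]):
        pred = np.where(s * (X_sel[:, j] - t) > 0, 1, -1)
        fp = np.sum((pred == 1) & (y_sel == -1))
        fn = np.sum((pred == -1) & (y_sel == 1))
        cost = cost_fp * fp + cost_fn * fn
        if cost < best_cost:
            best_t, best_cost = t, cost
    return best_t, best_cost

# Tiny toy data (hypothetical): one feature, classes separable at x = 2.
X_tr = np.array([[0.], [1.], [2.], [3.], [4.], [5.]])
y_tr = np.array([-1, -1, -1, 1, 1, 1])
w = np.full(6, 1 / 6)

j, t, s, err = best_stump(X_tr, y_tr, w)

# A disjoint "selection set" used only to re-tune the threshold.
X_sel = np.array([[1.5], [2.5], [3.5]])
y_sel = np.array([-1, 1, 1])
t2, cost = retune_threshold(X_sel, y_sel, j, s)
```

On this toy data the stage-1 stump already separates the training samples, and stage 2 simply relocates the threshold to a value that is also cost-optimal on the selection set; on realistic, non-separable data the two criteria would generally disagree, which is the gap the abstract's two-stage selection targets.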

Related Topics
Physical Sciences and Engineering Computer Science Artificial Intelligence
Authors