Article ID Journal Published Year Pages File Type
534466 Pattern Recognition Letters 2010 11 Pages PDF
Abstract

Despite its great success, two key problems remain unresolved for AdaBoost algorithms: how to select the most discriminative weak learners and how to combine them optimally. In this paper, a new AdaBoost algorithm is proposed to address both problems. First, we select the most discriminative weak learners by minimizing a novel distance-related criterion, namely the error-degree-weighted training error metric (ETEM) together with a generalization capability metric (GCM), rather than the training error rate alone. Second, starting from empirically set combination coefficients, we tune these coefficients with a kernel-based perceptron so that the weak learners are combined optimally. Experiments on synthetic and real scene data sets show that our algorithm outperforms conventional AdaBoost.
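The two-stage idea in the abstract (a richer selection criterion for weak learners, then perceptron-based refinement of their combination weights) can be illustrated with a minimal sketch. The abstract does not define ETEM or GCM, so the criterion below uses the ordinary weighted error plus a hypothetical margin-based penalty purely as stand-ins; the stump representation, the penalty weight, and `tune_alphas` are illustrative assumptions, not the authors' method.

```python
import numpy as np

def stump_predict(X, feat, thresh, polarity):
    """Decision stump: +1/-1 from one feature compared against a threshold."""
    return polarity * np.where(X[:, feat] <= thresh, 1.0, -1.0)

def select_stump(X, y, w):
    """Pick the stump minimizing a combined criterion.

    Stand-in for the paper's ETEM + GCM: weighted error plus a small
    hypothetical penalty that prefers thresholds lying far from the data
    (a crude proxy for generalization capability).
    """
    n, d = X.shape
    best, best_score = None, np.inf
    for feat in range(d):
        for thresh in np.unique(X[:, feat]):
            for polarity in (1.0, -1.0):
                pred = stump_predict(X, feat, thresh, polarity)
                err = np.sum(w[pred != y])
                # hypothetical "generalization" term (not from the paper)
                gcm = 1.0 / (1.0 + np.min(np.abs(X[:, feat] - thresh)))
                score = err + 0.01 * gcm
                if score < best_score:
                    best_score, best = score, (feat, thresh, polarity, err)
    return best

def adaboost(X, y, rounds=10):
    """Standard AdaBoost loop, but with the combined selection criterion."""
    n = len(y)
    w = np.full(n, 1.0 / n)
    stumps, alphas = [], []
    for _ in range(rounds):
        feat, thresh, polarity, err = select_stump(X, y, w)
        err = max(err, 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        pred = stump_predict(X, feat, thresh, polarity)
        w *= np.exp(-alpha * y * pred)   # reweight: upweight mistakes
        w /= w.sum()
        stumps.append((feat, thresh, polarity))
        alphas.append(alpha)
    return stumps, np.array(alphas)

def tune_alphas(X, y, stumps, alphas, epochs=20, lr=0.1):
    """Perceptron-style refinement of the combination coefficients.

    Treats each weak learner's output as a feature and runs plain
    (linear, not kernel) perceptron updates starting from the boosted
    alphas -- a simplification of the paper's kernel-based perceptron.
    """
    H = np.column_stack([stump_predict(X, f, t, p) for f, t, p in stumps])
    a = alphas.copy()
    for _ in range(epochs):
        for i in range(len(y)):
            if y[i] * (H[i] @ a) <= 0:   # update only on mistakes
                a += lr * y[i] * H[i]
    return a
```

The final classifier is `sign(H @ a)`, where `H` stacks the weak learners' outputs; the refinement stage can only be expected to help when the empirically boosted coefficients leave some training points misclassified.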

Related Topics
Physical Sciences and Engineering Computer Science Computer Vision and Pattern Recognition
Authors