Article ID: 409532
Journal: Neurocomputing
Published Year: 2006
Pages: 7
File Type: PDF
Abstract

Real AdaBoost is a well-known boosting method with good performance for building machine ensembles for classification. Its emphasis function can be decomposed into two factors that pay separate attention to sample errors and to sample proximity to the classification border. Building on this decomposition, a generalized emphasis function that combines both components by means of a selectable parameter, λ, is presented. Experiments show that simple methods of selecting λ frequently offer better performance and smaller ensembles.
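
A minimal sketch of the emphasis described in the abstract, assuming it takes the form exp(λ(f − y)² − (1 − λ)f²) over real-valued ensemble outputs f and labels y ∈ {−1, +1}; the function name, the exact functional form, and the toy values are illustrative assumptions, not the paper's published formulation.

```python
import numpy as np

def generalized_emphasis(f, y, lam):
    """Hypothetical lambda-weighted emphasis: an error term (f - y)^2 and a
    boundary-proximity term -f^2, mixed by a selectable parameter lam in [0, 1].

    f   : real-valued ensemble outputs f(x_i)
    y   : labels in {-1, +1}
    lam : mixing parameter; lam = 0.5 reproduces the usual Real AdaBoost
          weighting up to a constant, since for y in {-1, +1}
          exp(-y*f) is proportional to exp((f - y)^2 / 2) * exp(-f^2 / 2).
    """
    w = np.exp(lam * (f - y) ** 2 - (1.0 - lam) * f ** 2)
    return w / w.sum()  # normalize to a distribution over the samples

# Toy usage: one confidently correct sample, one near the border, one wrong.
f = np.array([1.2, -0.1, 0.8])
y = np.array([1, 1, -1])
for lam in (0.3, 0.5, 0.7):
    print(lam, generalized_emphasis(f, y, lam))
```

Larger λ shifts weight toward erroneous samples, while smaller λ concentrates it on samples close to the classification border, which is the trade-off the selectable parameter is meant to expose.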

Related Topics
Physical Sciences and Engineering > Computer Science > Artificial Intelligence
Authors