Article code | Journal code | Publication year | English article | Full-text version |
---|---|---|---|---|
534579 | 870267 | 2013 | 11-page PDF | Free download |
This paper presents a coarse-to-fine learning algorithm for multiclass problems. The algorithm is applied to ensemble-based learning by using boosting to construct cascades of classifiers. The goal is to address the training and detection runtime complexities found in an increasing number of classification domains. This research applies a separate-and-conquer strategy with respect to class labels in order to achieve efficiency in both the training and detection phases under limited computational resources, without compromising accuracy. The paper demonstrates how popular non-cascaded algorithms such as AdaBoost.M2, AdaBoost.OC and AdaBoost.ECC can be converted into robust cascaded classifiers. Additionally, a new multiclass weak learner is proposed that is custom-designed for cascaded training. Experiments conducted on 18 publicly available datasets showed that the cascaded algorithms achieved considerable speed-ups over the original AdaBoost.M2, AdaBoost.OC and AdaBoost.ECC in both training and detection runtimes. The cascaded classifiers did not exhibit significant compromises in generalization ability and in fact showed evidence of improved accuracy on datasets with biased class distributions.
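To make the separate-and-conquer idea concrete, the following is a minimal, self-contained sketch (not the paper's algorithm) of a coarse-to-fine cascade over class labels. Each stage here is a hypothetical nearest-centroid classifier that either commits to one "early-exit" label or defers to the next, finer stage; after training a stage, its class is removed from the training pool, mirroring the separate-and-conquer strategy. In the paper the per-stage classifiers are boosted ensembles, which this toy example replaces for brevity.

```python
# Coarse-to-fine cascade over class labels: an illustrative sketch.
# Stage classifiers, class names, and data below are invented for the
# example; the paper uses boosted (AdaBoost-style) stages instead.

def centroid(points):
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(len(points[0])))

def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

class CascadeStage:
    """One stage: decides its own label or defers to later stages."""
    def __init__(self, exit_label):
        self.exit_label = exit_label
    def fit(self, X, y):
        pos = [x for x, lab in zip(X, y) if lab == self.exit_label]
        rest = [x for x, lab in zip(X, y) if lab != self.exit_label]
        self.c_pos, self.c_rest = centroid(pos), centroid(rest)
        return self
    def predict_one(self, x):
        # Return the exit label, or None to pass the sample onward.
        if dist2(x, self.c_pos) < dist2(x, self.c_rest):
            return self.exit_label
        return None

class Cascade:
    """Trains stages in a fixed coarse-to-fine label order."""
    def __init__(self, label_order):
        self.stages = [CascadeStage(lab) for lab in label_order[:-1]]
        self.fallback = label_order[-1]  # last label needs no stage
    def fit(self, X, y):
        Xr, yr = list(X), list(y)
        for stage in self.stages:
            stage.fit(Xr, yr)
            # separate-and-conquer: drop the class this stage handles
            keep = [i for i, lab in enumerate(yr) if lab != stage.exit_label]
            Xr, yr = [Xr[i] for i in keep], [yr[i] for i in keep]
        return self
    def predict_one(self, x):
        for stage in self.stages:
            lab = stage.predict_one(x)
            if lab is not None:  # early exit: cheap detection runtime
                return lab
        return self.fallback

# Toy 3-class 2-D data; class "a" is decided first (coarsest stage).
X = [(0, 0), (0, 1), (5, 5), (5, 6), (9, 0), (9, 1)]
y = ["a", "a", "b", "b", "c", "c"]
model = Cascade(["a", "b", "c"]).fit(X, y)
print([model.predict_one(x) for x in X])  # → ['a', 'a', 'b', 'b', 'c', 'c']
```

Samples of the coarse class exit at the first stage without ever touching later classifiers, which is where the detection-time speed-up of a cascade comes from; ordering stages by class frequency would maximize that effect on biased class distributions.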
► We study coarse-to-fine ensemble-based approaches for multiclass classification.
► We show how existing multiclass algorithms can be extended to train cascaded classifiers.
► An additional multiclass weak-learner is proposed for our multiclass cascading algorithm.
► Results demonstrate that our cascaded algorithm accelerates training and detection runtimes.
► These accelerations are achieved while classifier accuracy is preserved.
Journal: Pattern Recognition Letters - Volume 34, Issue 8, 1 June 2013, Pages 884–894