Article ID: 528259
Journal: Information Fusion
Published Year: 2013
Pages: 8
File Type: PDF
Abstract

In this paper, we present an experimental comparison of different strategies for combining decision trees built by means of imprecise probabilities and uncertainty measures. It has been shown that combining or fusing the information obtained from several classifiers can improve the final classification. We use the previously developed schemes known as Bagging and Boosting, along with a new one based on varying the root node according to the information rank of each feature with respect to the class variable. To this end, we apply two different approaches to handle missing data and continuous variables. Through a set of performance tests on the methods analyzed here, we show that, with the appropriate approach, the Boosting scheme constitutes an excellent way to combine this type of decision tree. Notably, it provides good results even when compared with a standard Random Forest classifier, a successful and widely used procedure in the literature.
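The abstract compares several ensemble schemes over decision trees. The sketch below is only an illustration under stated assumptions, not the authors' implementation: it uses standard scikit-learn decision trees and a built-in dataset in place of the paper's trees built from imprecise probabilities and uncertainty measures, simply to show how Bagging, Boosting, and a Random Forest baseline can be evaluated side by side.

# Minimal sketch (assumption: ordinary scikit-learn trees stand in for the
# paper's imprecise-probability decision trees; the dataset is a built-in example).
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

# Base learner shared by the Bagging and Boosting ensembles.
base_tree = DecisionTreeClassifier(max_depth=3, random_state=0)

ensembles = {
    "Bagging": BaggingClassifier(base_tree, n_estimators=100, random_state=0),
    "Boosting": AdaBoostClassifier(base_tree, n_estimators=100, random_state=0),
    "Random Forest": RandomForestClassifier(n_estimators=100, random_state=0),
}

# 10-fold cross-validated accuracy for each combination scheme.
for name, clf in ensembles.items():
    scores = cross_val_score(clf, X, y, cv=10)
    print(f"{name}: mean accuracy {scores.mean():.3f} (+/- {scores.std():.3f})")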

Related Topics
Physical Sciences and Engineering › Computer Science › Computer Vision and Pattern Recognition