Article ID: 383091
Journal: Expert Systems with Applications
Published Year: 2014
Pages: 12
File Type: PDF
Abstract

• A decision tree method based on imprecise probabilities is analyzed.
• A new model is presented in which all input variables are processed with imprecision.
• Experiments are carried out on data sets with different levels of general noise.
• The new method obtains smaller trees and better results than the original method.
• The new method outperforms the classic ones on data sets with general noise.

An analysis of a procedure for building decision trees based on imprecise probabilities and uncertainty measures, called CDT, is presented. We compare this procedure with the classic ones based on Shannon's entropy for precise probabilities. We find that the handling of imprecision is a key factor in improving the method's performance, as has been shown for class noise problems in classification. We present a new procedure for building decision trees that extends the imprecision handling of the CDT procedure to all input variables. We show, via an experimental study on data sets with general noise (noise in all input variables), that this new procedure builds smaller trees and gives better results than the original CDT and the classic decision trees.
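The abstract does not give the split criterion explicitly, so the following is only a minimal sketch of the kind of measure typically used in credal decision trees: class counts are turned into a credal set via the Imprecise Dirichlet Model with parameter s, and the split score is an imprecise information gain based on the maximum (upper) entropy over that set. The function names, the s = 1 default, and the toy data below are illustrative assumptions, not the authors' code; setting s = 0 recovers the classic Shannon-entropy gain used by the precise-probability trees mentioned above.

```python
import numpy as np


def upper_entropy(counts, s=1.0):
    """Maximum Shannon entropy over the IDM credal set built from class counts.

    Under the Imprecise Dirichlet Model each class probability lies in
    [n_i / (N + s), (n_i + s) / (N + s)].  Because entropy is Schur-concave,
    the entropy-maximising distribution levels up the smallest counts
    ("water-filling") until the extra mass s is exhausted.
    """
    adjusted = np.asarray(counts, dtype=float).copy()
    mass = float(s)
    while mass > 1e-12:
        m = adjusted.min()
        idx = np.flatnonzero(np.isclose(adjusted, m))
        higher = adjusted[adjusted > m + 1e-12]
        next_level = higher.min() if higher.size else np.inf
        # Raise all current minima together, up to the next count level
        # or until the remaining mass runs out.
        step = min(next_level - m, mass / len(idx))
        adjusted[idx] += step
        mass -= step * len(idx)
    p = adjusted / adjusted.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))


def imprecise_info_gain(y, x, s=1.0):
    """IIG(C, X) = H*(C) - sum_v P(X=v) H*(C | X=v), with H* the upper entropy."""
    y, x = np.asarray(y), np.asarray(x)
    classes = np.unique(y)
    total = upper_entropy([np.sum(y == c) for c in classes], s)
    cond = 0.0
    for v in np.unique(x):
        mask = (x == v)
        cond += mask.mean() * upper_entropy([np.sum(y[mask] == c) for c in classes], s)
    return total - cond


# Toy usage with illustrative data: class labels y and one discrete feature x.
y = np.array([0, 0, 0, 1, 1, 1, 1, 1])
x = np.array([0, 0, 0, 0, 1, 1, 1, 1])
print(imprecise_info_gain(y, x, s=1.0))  # imprecise criterion
print(imprecise_info_gain(y, x, s=0.0))  # reduces to the classic info gain
```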

Related Topics
Physical Sciences and Engineering › Computer Science › Artificial Intelligence