Article ID: 532142
Journal: Information Fusion
Published Year: 2012
Pages: 11
File Type: PDF
Abstract

This paper proposes a method for constructing ensembles of decision trees, Random Feature Weights (RFW). The method is similar to Random Forest: both introduce randomness into the construction of the decision trees. However, whereas Random Forest considers only a random subset of attributes at each node, RFW considers all of them. Its source of randomness is a weight associated with each attribute. All the nodes in a tree use the same set of random weights, but this set differs from tree to tree. Hence, the importance given to the attributes is different in each tree, and that differentiates their construction. The method is compared with Bagging, Random Forest, Random Subspaces, AdaBoost and MultiBoost, obtaining favourable results for the proposed method, especially on noisy data sets. RFW can also be combined with these methods; generally, the combination of RFW with another method produces better results than the combined methods alone. Kappa-error diagrams and Kappa-error movement diagrams are used to analyse the relationship between the accuracy of the base classifiers and their diversity.
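The abstract describes the core idea: each tree draws one random weight per attribute, every node of that tree evaluates all attributes but multiplies each attribute's split merit by its weight, and a different weight vector is drawn for each tree. The following is a minimal sketch of that idea, assuming Gini impurity as the split criterion and weights drawn uniformly from [0, 1); the paper's exact weight distribution and criterion may differ, and all names (rfw_ensemble, build_tree, ...) are illustrative.

```python
import numpy as np

def gini(y):
    """Gini impurity of a label vector."""
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def best_split(X, y, weights):
    """Best (feature, threshold) pair under weighted impurity reduction.

    Every feature is considered (unlike Random Forest), but each feature's
    gain is multiplied by its per-tree random weight, so the weights bias
    which attribute wins the split."""
    n, _ = X.shape
    parent = gini(y)
    best_j, best_t, best_score = None, None, 0.0
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j])[:-1]:          # candidate thresholds
            left = X[:, j] <= t
            gain = parent - (left.sum() / n) * gini(y[left]) \
                          - ((~left).sum() / n) * gini(y[~left])
            score = weights[j] * gain               # RFW: weighted merit
            if score > best_score:
                best_j, best_t, best_score = j, t, score
    return best_j, best_t, best_score

def build_tree(X, y, weights, depth=0, max_depth=5):
    """Grow a tree; all nodes share the same weight vector."""
    vals, counts = np.unique(y, return_counts=True)
    if depth == max_depth or len(vals) == 1:
        return vals[np.argmax(counts)]              # leaf: majority class
    j, t, score = best_split(X, y, weights)
    if j is None or score <= 0.0:
        return vals[np.argmax(counts)]
    left = X[:, j] <= t
    return (j, t,
            build_tree(X[left], y[left], weights, depth + 1, max_depth),
            build_tree(X[~left], y[~left], weights, depth + 1, max_depth))

def predict_tree(node, x):
    while isinstance(node, tuple):
        j, t, l, r = node
        node = l if x[j] <= t else r
    return node

def rfw_ensemble(X, y, n_trees=10, seed=0):
    """Each tree draws its own weight vector, different from the other trees."""
    rng = np.random.default_rng(seed)
    trees = []
    for _ in range(n_trees):
        w = rng.uniform(size=X.shape[1])            # one weight per attribute
        trees.append(build_tree(X, y, w))
    return trees

def predict(trees, x):
    """Majority vote over the ensemble."""
    votes = [predict_tree(t, x) for t in trees]
    vals, counts = np.unique(votes, return_counts=True)
    return vals[np.argmax(counts)]
```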

Research highlights
► Random weights are assigned to attributes and used in the construction of the tree.
► Different weights for different trees, but the same weights for all nodes of a tree.
► Better results than Bagging, Random Forest, Random Subspaces and Boosting.
► Robust to class noise.
► It can be combined with other ensemble methods; generally the results improve.

Related Topics
Physical Sciences and Engineering Computer Science Computer Vision and Pattern Recognition