Article ID: 531807
Journal: Pattern Recognition
Published Year: 2016
Pages: 7 Pages
File Type: PDF
Abstract

• Propose a semi-supervised split criterion for decision trees.
• Combine the semi-supervised decision tree, used for subspace partitioning, with other classifiers.
• Experiments on several datasets show that the proposed method outperforms existing ones.

Among data mining techniques, the decision tree is one of the most widely used methods for building classification models in practice because of its simplicity and ease of interpretation. However, the method has some drawbacks, including instability, the nonsmooth nature of its decision boundary, and the possibility of overfitting. To overcome these problems, several works have combined a decision tree with other classifiers, such as logistic regression, support vector machines, and neural networks, in hybrid models that exploit the relative advantages of each component and offset the weaknesses of the individual models. Some hybrid models have used decision trees to quickly and efficiently partition the input space, and many studies have demonstrated the effectiveness of such hybrid methods. However, there is room for further improvement by considering the topological properties of a dataset, because typical decision trees split nodes based only on the target variable. The proposed semi-supervised decision tree splits internal nodes by utilizing both labels and the structural characteristics of the data for subspace partitioning, improving the accuracy of the classifiers applied to terminal nodes in the hybrid models. Experimental results confirm the superiority of the proposed algorithm and reveal its detailed characteristics.
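The core idea of the abstract, splitting a node with a criterion that mixes label purity and the structure of all (labeled and unlabeled) points, can be illustrated with a minimal sketch. This is not the paper's algorithm: the 1-D feature, the mixing weight `alpha`, and the use of within-subset variance as the "structural" term are all assumptions made here for illustration.

```python
import numpy as np

def gini(y):
    """Gini impurity of a label array; 0.0 for an empty array."""
    if len(y) == 0:
        return 0.0
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - float(np.sum(p ** 2))

def split_score(x, y, labeled, thr, alpha=0.5):
    """Semi-supervised score for splitting a 1-D feature x at thr.

    Mixes a supervised term (weighted Gini over labeled points only)
    with an unsupervised term (weighted within-subset variance over ALL
    points), the latter standing in for the data's structural
    characteristics. Both the mixing weight alpha and the variance term
    are illustrative choices, not the paper's criterion.
    """
    left, right = x <= thr, x > thr
    yl, yr = y[labeled & left], y[labeled & right]
    n_lab = max(1, int(labeled.sum()))
    sup = (len(yl) * gini(yl) + len(yr) * gini(yr)) / n_lab
    var = lambda v: float(v.var()) if len(v) > 1 else 0.0
    uns = (left.sum() * var(x[left]) + right.sum() * var(x[right])) / len(x)
    return alpha * sup + (1 - alpha) * uns

def best_split(x, y, labeled, alpha=0.5):
    """Return the midpoint threshold minimizing the combined score."""
    xs = np.sort(x)
    thrs = (xs[:-1] + xs[1:]) / 2.0
    scores = [split_score(x, y, labeled, t, alpha) for t in thrs]
    return float(thrs[int(np.argmin(scores))])

# Two well-separated clusters; only some points carry labels.
x = np.array([0.0, 0.1, 0.2, 5.0, 5.1, 5.2])
y = np.array([0, 0, 0, 1, 1, 1])
labeled = np.array([True, False, True, True, False, True])
thr = best_split(x, y, labeled)  # lands between the two clusters
```

In the hybrid setting the abstract describes, a split like this would define the subspaces, and a separate classifier (e.g., logistic regression or an SVM) would then be fitted to the data in each terminal node.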
