| Article ID | Journal | Published Year | Pages | File Type |
|---|---|---|---|---|
| 11008009 | Neurocomputing | 2018 | 33 | |
Abstract
Label distribution learning (LDL) has proven effective in many machine learning applications. Previous LDL methods have focused on learning a non-linear conditional probability mass function by maximizing entropy or minimizing the Kullback-Leibler (K-L) divergence. To make full use of the structural information among different classes, we adopt structured random forest (StructRF) regression, a technique previously applied to semantic image labeling and edge detection. StructRF is a general LDL model that treats the distribution as an integral whole: at each split node of a random forest, all label distributions are mapped to a discrete space, so that standard information gain measures can be evaluated. At test time, the predicted distribution is then obtained directly, without computing the probability of each class individually. StructRF is shown to be fast to train and to achieve higher accuracies and lower standard deviations across different measures. In addition, we propose an adaptive variable step method that speeds up training and significantly reduces the number of information gain computations; it is applicable to most decision-tree-based models.
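To make the split mechanism described above concrete, the following is a minimal sketch of one StructRF-style split: label distributions are mapped to discrete cluster ids (here with a simple k-means-like procedure, which is an assumption; the paper's exact mapping may differ), and a standard information gain is then evaluated on those discrete labels. All function names and the toy data are illustrative, not from the paper.

```python
import numpy as np

def discretize_distributions(label_dists, n_clusters=2, n_iters=10, seed=0):
    """Map label distributions (one row per sample) to discrete cluster ids.

    A stand-in for StructRF's mapping of structured outputs to a discrete
    space at a split node; k-means is an assumed choice for illustration.
    """
    rng = np.random.default_rng(seed)
    centers = label_dists[rng.choice(len(label_dists), n_clusters, replace=False)]
    for _ in range(n_iters):
        # Assign each distribution to its nearest center (Euclidean distance).
        d = np.linalg.norm(label_dists[:, None, :] - centers[None, :, :], axis=2)
        assign = d.argmin(axis=1)
        # Recompute centers from current assignments.
        for k in range(n_clusters):
            if np.any(assign == k):
                centers[k] = label_dists[assign == k].mean(axis=0)
    return assign

def entropy(discrete_labels):
    """Shannon entropy of a vector of discrete class ids."""
    _, counts = np.unique(discrete_labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def information_gain(discrete_labels, left_mask):
    """Standard information gain of splitting the samples into left/right."""
    n = len(discrete_labels)
    left, right = discrete_labels[left_mask], discrete_labels[~left_mask]
    return entropy(discrete_labels) - (
        len(left) / n * entropy(left) + len(right) / n * entropy(right)
    )

# Toy usage: six samples with 3-class label distributions, split on a
# single feature against a threshold of 0.5.
X = np.array([[0.1], [0.2], [0.3], [0.7], [0.8], [0.9]])
Y = np.array([[0.8, 0.1, 0.1], [0.7, 0.2, 0.1], [0.6, 0.3, 0.1],
              [0.1, 0.2, 0.7], [0.1, 0.1, 0.8], [0.2, 0.1, 0.7]])
discrete = discretize_distributions(Y, n_clusters=2)
gain = information_gain(discrete, left_mask=(X[:, 0] <= 0.5))
print(f"information gain of this split: {gain:.3f}")
```

Because the distributions are reduced to discrete ids before the gain is computed, the whole distribution is treated as a single structured label rather than as per-class probabilities, which is the point the abstract makes.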
Related Topics
Physical Sciences and Engineering
Computer Science
Artificial Intelligence
Authors
Mengting Chen, Xinggang Wang, Bin Feng, Wenyu Liu