Article ID | Journal | Published Year | Pages | File Type
---|---|---|---|---
6863640 | Neurocomputing | 2018 | 16 |
Abstract
Ensemble regression methods outperform single regression methods because they combine several individual regressors to improve accuracy and stability. In this paper, we propose a novel kernel ensemble regression method that minimizes a total least squares loss in multiple Reproducing Kernel Hilbert Spaces (RKHSs). Base kernel regressors are co-optimized and weighted to form an ensemble regressor, so the problem of choosing suitable kernel types and parameters for the base kernel regressors is solved within the ensemble regression framework. Experimental results on several datasets, including artificial datasets and UCI regression and classification datasets, show that the proposed approach achieves the lowest regression loss among comparison methods such as ridge regression, support vector regression (SVR), gradient boosting, decision tree regression, and random forest.
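As a rough illustration of the general idea described in the abstract (a weighted combination of base kernel regressors built in different RKHSs), the sketch below fits several scikit-learn KernelRidge models with different kernels and parameters and weights them by held-out squared loss. The kernel choices, the weighting rule, and the helper names are illustrative assumptions only; the paper's method co-optimizes the base regressors and their weights jointly under a total least squares loss, which this sketch does not reproduce.

```python
# Minimal sketch of a kernel ensemble regressor (illustrative, not the paper's algorithm).
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import train_test_split

def fit_kernel_ensemble(X, y, base_kernels=(("rbf", 0.1), ("rbf", 1.0), ("poly", None))):
    """Fit base kernel ridge regressors and weight them by validation loss.

    `base_kernels` is an assumed list of (kernel type, gamma) pairs; the paper
    instead learns the combination by minimizing a total least squares loss.
    """
    X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)
    models, losses = [], []
    for kernel, gamma in base_kernels:
        model = KernelRidge(kernel=kernel, gamma=gamma, alpha=1.0)
        model.fit(X_tr, y_tr)
        val_pred = model.predict(X_val)
        losses.append(np.mean((val_pred - y_val) ** 2))
        models.append(model)
    # Weight each base regressor inversely to its validation squared loss.
    inv = 1.0 / (np.array(losses) + 1e-12)
    weights = inv / inv.sum()
    return models, weights

def predict_ensemble(models, weights, X):
    """Weighted average of the base regressors' predictions."""
    preds = np.stack([m.predict(X) for m in models], axis=0)
    return weights @ preds

# Usage on a toy 1-D regression problem.
X = np.random.rand(200, 1) * 6.0
y = np.sin(X).ravel() + 0.1 * np.random.randn(200)
models, weights = fit_kernel_ensemble(X, y)
y_hat = predict_ensemble(models, weights, X)
```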
Related Topics
Physical Sciences and Engineering
Computer Science
Artificial Intelligence
Authors
Xiang-Jun Shen, Yong Dong, Jian-Ping Gou, Yong-Zhao Zhan, Jianping Fan