Article ID | Journal | Published Year | Pages
---|---|---|---
4948457 | Neurocomputing | 2016 | 13 Pages
Abstract
Forecasting by regression is an important method for predicting continuous values. Generally, to increase predictive accuracy and reliability, as many factors or features as possible are considered and added to the regression model; however, this degrades efficiency, accuracy, and interpretability. Moreover, some existing methods based on support vector regression (SVR) require solving a convex quadratic programming problem with high computational complexity. In this paper, we propose a novel two-phase multi-kernel SVR using a linear programming method (MK-LP-SVR) for feature sparsification and forecasting, so as to solve the aforementioned problems. The multi-kernel learning method is mainly used to carry out feature sparsification and to identify the important features by computing their contribution to forecasting, while the whole model can be used to predict output values for given inputs. Based on a simulated data set, 6 small data sets, and 6 large data sets, the experimental results and comparisons with SVR, linear programming SVR (LP-SVR), least squares SVR (LS-SVR), and multiple kernel learning SVR (MKL-SVR) show that the proposed model considerably improves predictive accuracy and interpretability for regression forecasting on independent test sets.
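The abstract's key computational point is that an L1-norm (linear programming) formulation of epsilon-SVR avoids the convex quadratic program of standard SVR. As a rough illustration of that general LP-SVR idea (not the authors' two-phase MK-LP-SVR, and with an RBF kernel, `C`, `eps`, and `gamma` chosen here purely for demonstration), a minimal sketch using SciPy's LP solver might look like:

```python
import numpy as np
from scipy.optimize import linprog

def rbf_kernel(A, B, gamma=1.0):
    # Gram matrix of the Gaussian (RBF) kernel between row sets A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def lp_svr_fit(X, y, C=10.0, eps=0.1, gamma=1.0):
    """Fit an L1-norm epsilon-SVR as a plain linear program.

    Minimizes sum(|alpha|) + C * sum(slacks) subject to the usual
    epsilon-tube constraints; |alpha| is linearized by splitting
    alpha into non-negative parts alpha+ and alpha-.
    """
    n = len(y)
    K = rbf_kernel(X, X, gamma)
    # Variable layout: [alpha+ (n), alpha- (n), b, xi (n), xi* (n)]
    c = np.concatenate([np.ones(2 * n), [0.0], C * np.ones(2 * n)])
    I = np.eye(n)
    Z = np.zeros((n, n))
    ones = np.ones((n, 1))
    # y - K(a+ - a-) - b <= eps + xi   and   K(a+ - a-) + b - y <= eps + xi*
    A_ub = np.block([[-K,  K, -ones, -I, Z],
                     [ K, -K,  ones, Z, -I]])
    b_ub = np.concatenate([eps - y, eps + y])
    bounds = [(0, None)] * (2 * n) + [(None, None)] + [(0, None)] * (2 * n)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    alpha = res.x[:n] - res.x[n:2 * n]
    b = res.x[2 * n]
    return alpha, b

def lp_svr_predict(X_train, alpha, b, X_new, gamma=1.0):
    # Kernel expansion: f(x) = sum_j alpha_j K(x, x_j) + b
    return rbf_kernel(X_new, X_train, gamma) @ alpha + b
```

Because the L1 penalty on `alpha` tends to drive many coefficients to exactly zero, the fitted model is sparse in the kernel expansion; the paper's multi-kernel extension applies the same sparsifying effect across per-feature kernels to rank feature contributions.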
Related Topics
Physical Sciences and Engineering
Computer Science
Artificial Intelligence
Authors
Zhiwang Zhang, Guangxia Gao, Yingjie Tian, Jue Yue