Article ID: 6940189
Journal: Pattern Recognition Letters
Published Year: 2018
Pages: 10
File Type: PDF
Abstract
The multi-output least-squares support vector regression machine (MLS-SVR) was proposed by Xu et al. [29] to handle multi-output regression problems. However, the prohibitive cost of model selection severely hinders the application of MLS-SVR. In this paper, an efficient gradient-based model selection algorithm for MLS-SVR is proposed. First, a new training algorithm for MLS-SVR is developed, which allows the solution vector for each output to be obtained independently by dealing with matrices of much lower order. Based on this training algorithm, a new leave-one-out error estimate is derived through virtual leave-one-out cross-validation. The model selection criterion is based on this leave-one-out error estimate, and its derivatives with respect to the hyper-parameters are also derived analytically. Both the model selection criterion and its partial derivatives can be obtained immediately once a training process has ended. Finally, the hyper-parameters corresponding to the lowest value of the model selection criterion are obtained through a gradient descent method. The effectiveness and generalization performance of the proposed algorithm are validated through experiments on several multi-output datasets. The experimental results show that the proposed algorithm saves computational time dramatically without losing accuracy.
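To illustrate the general idea described in the abstract, the sketch below shows a closed-form (virtual) leave-one-out error for a least-squares kernel regressor, minimised over hyper-parameters by gradient descent. This is not the authors' MLS-SVR formulation: as a stand-in it uses plain kernel ridge regression applied to each output, an assumed RBF kernel, and finite-difference gradients rather than the analytical derivatives derived in the paper; all function names and parameter values are illustrative.

```python
# Minimal sketch (assumption: kernel ridge regression per output stands in
# for MLS-SVR; gradients are numerical, not the paper's analytical ones).
import numpy as np

def rbf_kernel(X1, X2, gamma):
    """Gaussian RBF kernel matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2)."""
    d2 = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2 * X1 @ X2.T
    return np.exp(-gamma * d2)

def loo_mse(X, Y, log_params):
    """Virtual leave-one-out MSE of kernel ridge regression, averaged over
    all outputs (columns of Y).  Uses the standard closed form
    r_i = (y_i - f_i) / (1 - H_ii) with hat matrix H = K (K + lam*I)^{-1},
    so no model is ever retrained."""
    lam, gamma = np.exp(log_params)            # positivity via log-parametrisation
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    Kinv = np.linalg.solve(K + lam * np.eye(n), np.eye(n))
    H = K @ Kinv                               # fitted values = H @ Y
    resid = Y - H @ Y
    loo_resid = resid / (1.0 - np.diag(H))[:, None]
    return np.mean(loo_resid**2)

def select_hyperparameters(X, Y, log_params=np.zeros(2),
                           lr=0.1, n_iter=200, eps=1e-4):
    """Gradient descent on the LOO criterion in log-hyper-parameter space,
    with central finite-difference gradients."""
    p = log_params.astype(float).copy()
    for _ in range(n_iter):
        grad = np.zeros_like(p)
        for j in range(len(p)):
            e = np.zeros_like(p); e[j] = eps
            grad[j] = (loo_mse(X, Y, p + e) - loo_mse(X, Y, p - e)) / (2 * eps)
        p -= lr * grad
    return np.exp(p), loo_mse(X, Y, p)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(80, 2))
    # Two related outputs, mimicking a multi-output regression task.
    Y = np.column_stack([
        np.sin(X[:, 0]) + 0.1 * rng.standard_normal(80),
        np.sin(X[:, 0]) * np.cos(X[:, 1]) + 0.1 * rng.standard_normal(80)])
    (lam, gamma), err = select_hyperparameters(X, Y)
    print(f"selected lambda={lam:.4g}, gamma={gamma:.4g}, LOO MSE={err:.4g}")
```

The key point the sketch shares with the paper is that the leave-one-out estimate is obtained from quantities already available after a single training solve, so hyper-parameter search never requires retraining the model on held-out folds.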
Related Topics
Physical Sciences and Engineering; Computer Science; Computer Vision and Pattern Recognition
Authors