Article ID | Journal | Published Year | Pages | File Type
---|---|---|---|---
6874192 | Information Processing Letters | 2018 | 8 |
Abstract
In least squares support vector regression (LSSVR), Vapnik's original SVR formulation is modified by using a cost function that corresponds to a form of ridge regression rather than the ε-insensitive loss function. As a result, nonlinear function estimation is performed by solving a linear system of equations instead of a time-consuming quadratic programming problem. When gradients or Hessians of the samples can be obtained cheaply, they should be exploited in the construction of metamodels. In this paper, the gradient/Hessian-enhanced LSSVR (G/HELSSVR) is developed by incorporating gradient and Hessian information into the traditional LSSVR. The performance of this method is tested by fitting analytical functions. The experimental results illustrate that the proposed G/HELSSVR model has a significant advantage over the traditional LSSVR and the gradient-enhanced LSSVR (GELSSVR).
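The linear-system formulation mentioned in the abstract can be illustrated with a minimal sketch of standard LSSVR (the textbook formulation with an RBF kernel; this is not code from the paper, and the function names, kernel choice, and parameter values below are illustrative assumptions):

```python
import numpy as np

# Standard LSSVR replaces SVR's quadratic program with one linear system:
#   [ 0      1^T        ] [ b     ]   [ 0 ]
#   [ 1   K + I/gamma   ] [ alpha ] = [ y ]
# where K is the kernel (Gram) matrix over the training samples.
# (Kernel, gamma, and sigma here are illustrative choices, not the paper's.)

def rbf_kernel(X1, X2, sigma=1.0):
    """Gaussian RBF kernel matrix between the rows of X1 and X2."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvr_fit(X, y, gamma=100.0, sigma=1.0):
    """Solve the LSSVR linear system for (alpha, b)."""
    n = X.shape[0]
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf_kernel(X, X, sigma) + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)  # one linear solve, no QP
    return sol[1:], sol[0]         # alpha, b

def lssvr_predict(X_train, alpha, b, X_new, sigma=1.0):
    """Evaluate f(x) = sum_i alpha_i * K(x, x_i) + b."""
    return rbf_kernel(X_new, X_train, sigma) @ alpha + b

# Fit a 1-D analytical function, in the spirit of the paper's tests
X = np.linspace(0.0, 2.0 * np.pi, 30).reshape(-1, 1)
y = np.sin(X).ravel()
alpha, b = lssvr_fit(X, y)
y_hat = lssvr_predict(X, alpha, b, X)
```

The G/HELSSVR of the paper extends this system with additional rows and columns enforcing agreement with sampled gradients and Hessians, so the problem remains a single (larger) linear solve.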
Related Topics
Physical Sciences and Engineering
Computer Science
Computational Theory and Mathematics
Authors
Ting Jiang, XiaoJian Zhou