| Article ID | Journal | Published Year | Pages | File Type |
|---|---|---|---|---|
| 9653361 | Neurocomputing | 2005 | 20 Pages | |
Abstract
In this paper, the problem of simultaneously approximating a function and its derivatives is formulated within the Support Vector Machine (SVM) framework. First, the problem is solved for a one-dimensional input space using the ε-insensitive loss function and introducing additional constraints on the approximation of the derivative. The method is then extended to multi-dimensional input spaces by means of a multidimensional regression algorithm. In both cases, to solve the resulting optimization problem, we derive an iterative re-weighted least squares (IRWLS) procedure that is fast for moderate-size problems. Experiments show that exploiting derivative information significantly improves the reconstruction of the function.
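As a rough illustration of the core idea (not the paper's algorithm), the sketch below fits a Gaussian-kernel expansion to sparse samples of sin(x), once using function values alone and once also penalizing mismatch of the derivative. For brevity it uses a squared loss with a closed-form solution instead of the paper's ε-insensitive loss and IRWLS procedure; all function names and parameter values are illustrative assumptions.

```python
import numpy as np

def rbf(x, c, s=1.0):
    """Gaussian kernel matrix K[i,j] = k(x_i, c_j) and its derivative w.r.t. x."""
    d = x[:, None] - c[None, :]
    K = np.exp(-d**2 / (2 * s**2))
    return K, -d / s**2 * K  # dK[i,j] = dk(x_i, c_j)/dx_i

# Toy target: y = sin(x), with known derivative y' = cos(x).
x_tr = np.linspace(0.0, 2 * np.pi, 6)      # sparse training inputs
y_tr, dy_tr = np.sin(x_tr), np.cos(x_tr)
x_te = np.linspace(0.0, 2 * np.pi, 200)    # dense evaluation grid

K, dK = rbf(x_tr, x_tr)
K_te, _ = rbf(x_te, x_tr)
mu = 1e-6  # small ridge term for numerical stability

def fit(lam):
    # Regularized least squares on function values plus, weighted by lam,
    # an extra penalty on derivative residuals (lam = 0 disables it).
    A = K.T @ K + lam * dK.T @ dK + mu * np.eye(len(x_tr))
    b = K.T @ y_tr + lam * dK.T @ dy_tr
    return np.linalg.solve(A, b)

err_without = np.max(np.abs(K_te @ fit(0.0) - np.sin(x_te)))
err_with = np.max(np.abs(K_te @ fit(1.0) - np.sin(x_te)))
print(f"max error, values only:          {err_without:.4f}")
print(f"max error, values + derivatives: {err_with:.4f}")
```

With only six samples, constraining the slope as well as the value at each sample tends to reduce the reconstruction error between samples, which is the effect the abstract reports for the SVM formulation.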
Authors
Marcelino Lázaro, Ignacio Santamaría, Fernando Pérez-Cruz, Antonio Artés-Rodríguez
