Article ID | Journal | Published Year | Pages | File Type
---|---|---|---|---
408594 | Neurocomputing | 2007 | 15 Pages |
Abstract
We present an extension to unsupervised kernel regression (UKR), a recent method for learning nonlinear manifolds, which can utilize leave-one-out cross-validation as an automatic complexity control without additional computational cost. Our extension allows us to incorporate general cost functions, by which the UKR algorithm can be made more robust or tuned to specific noise models. We focus on Huber's loss and on the ε-insensitive loss, which we present together with a practical optimization approach. We demonstrate our method on both toy and real data.
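The abstract names two robust alternatives to the squared error: Huber's loss (quadratic near zero, linear in the tails) and the ε-insensitive loss (zero inside an ε-tube, linear outside). A minimal sketch of the two pointwise losses is below; the parameter names `delta` and `eps` and the standalone functions are illustrative assumptions, not the authors' UKR code.

```python
def huber_loss(r, delta=1.0):
    """Huber's loss: quadratic for |r| <= delta, linear beyond.

    Hypothetical standalone sketch; parameter name `delta` is assumed.
    """
    a = abs(r)
    if a <= delta:
        return 0.5 * r * r
    # Linear tail, matched in value and slope at |r| = delta.
    return delta * (a - 0.5 * delta)


def eps_insensitive_loss(r, eps=0.5):
    """ε-insensitive loss: zero inside the ε-tube, linear outside."""
    return max(0.0, abs(r) - eps)
```

Both losses grow only linearly for large residuals, which is what makes the resulting UKR variant less sensitive to outliers than the squared-error version.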
Related Topics: Physical Sciences and Engineering › Computer Science › Artificial Intelligence
Authors
Stefan Klanke, Helge Ritter