Article ID: 9653610
Journal: Neurocomputing
Published Year: 2005
Pages: 31
File Type: PDF
Abstract
Kernel-based methods suffer from excessive time and memory requirements when applied to large datasets, since the involved optimization problems typically scale polynomially in the number of data samples. As a remedy, some least squares methods reduce only the number of parameters (for fast training), while others work only on a reduced set (for fast evaluation). Starting from the Nyström-based feature approximation, via the fixed-size LS-SVM model, we propose a general regression framework based on restricting the search space to a subspace and on a particular choice of basis vectors in feature space. In the general model, both reduction aspects are unified and become explicit model choices. This allows us to equip kernel Partial Least Squares and kernel Canonical Correlation Analysis for regression with a sparse representation, which makes them applicable to large datasets with little loss in accuracy.
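The abstract's starting point, the Nyström-based feature approximation, can be illustrated with a minimal sketch. The code below is not the paper's fixed-size LS-SVM implementation; it is an illustrative reconstruction of the general idea under standard assumptions: an RBF kernel, m randomly chosen landmark (working set) points, and a ridge regression fit in the approximate finite-dimensional feature space, so that training and evaluation cost scale with m rather than with the full sample size n. All function names and parameter values are hypothetical.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Pairwise RBF kernel matrix: k(x, y) = exp(-gamma * ||x - y||^2).
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def nystrom_features(X, landmarks, gamma=1.0, eps=1e-10):
    # Nystrom feature map: phi(x) = Lambda^{-1/2} U^T k(landmarks, x),
    # where K_mm = U Lambda U^T is the eigendecomposition of the
    # m x m landmark-landmark kernel matrix.
    K_mm = rbf_kernel(landmarks, landmarks, gamma)
    lam, U = np.linalg.eigh(K_mm)
    keep = lam > eps                       # drop numerically null directions
    K_nm = rbf_kernel(X, landmarks, gamma)
    return K_nm @ U[:, keep] / np.sqrt(lam[keep])

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=500)

# Small working set of m << n landmark points (here chosen uniformly).
m = 50
landmarks = X[rng.choice(len(X), m, replace=False)]
Phi = nystrom_features(X, landmarks)

# Ridge regression in the approximate feature space: at most m
# parameters, so both training and evaluation are cheap in n.
mu = 1e-3
w = np.linalg.solve(Phi.T @ Phi + mu * np.eye(Phi.shape[1]), Phi.T @ y)
y_hat = Phi @ w
```

The same subspace restriction is what the proposed framework makes explicit: the choice of landmarks fixes the basis vectors in feature space, and the regression method (LS-SVM, kernel PLS, kernel CCA) then operates within that subspace.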
Related Topics
Physical Sciences and Engineering › Computer Science › Artificial Intelligence