Article ID: 4605084
Journal: Applied and Computational Harmonic Analysis
Published Year: 2014
Pages: 16 Pages
File Type: PDF
Abstract

Random projection allows one to substantially reduce the dimensionality of data while still retaining a significant degree of problem structure. In the past few years it has received considerable interest in compressed sensing and learning theory. By using a random projection of the data onto a low-dimensional space instead of the data themselves, a learning algorithm can be implemented with low computational complexity. This paper investigates the accuracy of regularized empirical risk minimization in Hilbert spaces when applied to such randomly projected data. By letting the dimensionality of the projected data increase suitably as the number of samples increases, we obtain error estimates for least squares regression and support vector machines.
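As an illustration of the approach described in the abstract, the following minimal Python sketch applies a Gaussian random projection and then runs regularized least squares (ridge regression) in the projected space. It is not the paper's algorithm or analysis; the function names, the projection dimension m, and the regularization parameter lam are assumptions chosen purely for the example.

import numpy as np

def random_projection(X, m, rng):
    # Project n x d data X to n x m with a Gaussian random matrix (illustrative choice).
    d = X.shape[1]
    R = rng.normal(scale=1.0 / np.sqrt(m), size=(d, m))
    return X @ R

def ridge_fit(Z, y, lam):
    # Regularized empirical risk minimization with squared loss (ridge regression).
    m = Z.shape[1]
    return np.linalg.solve(Z.T @ Z + lam * np.eye(m), Z.T @ y)

rng = np.random.default_rng(0)
n, d, m = 500, 1000, 50            # in the paper's setting, m grows suitably with n
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.1 * rng.normal(size=n)

Z = random_projection(X, m, rng)    # learn from the projected data instead of X
w = ridge_fit(Z, y, lam=1e-2)
print("training MSE in projected space:", np.mean((Z @ w - y) ** 2))

Working with the n x m matrix Z rather than the n x d matrix X is what reduces the computational cost when m is much smaller than d.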

Related Topics
Physical Sciences and Engineering; Mathematics; Analysis