| Article ID | Journal | Published Year | Pages | File Type |
|---|---|---|---|---|
| 9501308 | Journal of Complexity | 2005 | 18 Pages | |
Abstract
Learning from data with generalization capability is studied in the framework of minimization of regularized empirical error functionals over nested families of hypothesis sets with increasing model complexity. For Tikhonov's regularization with kernel stabilizers, minimization is investigated over restricted hypothesis sets containing, for a fixed integer n, only linear combinations of n-tuples of kernel functions. Upper bounds are derived on the rate of convergence of suboptimal solutions from such sets to the optimal solution achievable without restrictions on model complexity. The bounds are of the form 1/n multiplied by a term that depends on the size of the sample of empirical data, the vector of output data, the Gram matrix of the kernel with respect to the input data, and the regularization parameter.
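The setting described in the abstract can be illustrated with a small numerical sketch. The code below is not the authors' construction; it is a hedged illustration of the general framework: Tikhonov regularization with a kernel stabilizer (kernel ridge regression), whose global minimizer over the whole reproducing-kernel space is, by the representer theorem, a linear combination of all m kernel functions centered at the data, while a suboptimal solution is constrained to a linear combination of only n < m kernel functions. The Gaussian kernel, the data, and the choice of the first n centers are all assumptions made for the sake of the example.

```python
import numpy as np

def gaussian_kernel(X, Z, width=1.0):
    # Gram matrix K[i, j] = exp(-||X[i] - Z[j]||^2 / (2 * width^2))
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * width ** 2))

rng = np.random.default_rng(0)
m = 40                                    # sample size
X = rng.uniform(-1, 1, size=(m, 1))       # input data
y = np.sin(3 * X[:, 0]) + 0.1 * rng.normal(size=m)  # output data
gamma = 1e-3                              # regularization parameter

K = gaussian_kernel(X, X)                 # Gram matrix of the kernel

# Optimal solution without restriction on model complexity:
# minimize (1/m) * sum_i (f(x_i) - y_i)^2 + gamma * ||f||_K^2.
# By the representer theorem, f* = sum_i c_i K(., x_i) with
# c = (K + gamma * m * I)^{-1} y.
c_full = np.linalg.solve(K + gamma * m * np.eye(m), y)

# Suboptimal solution from the restricted hypothesis set: a linear
# combination of only n < m kernel functions (here the first n
# centers, an arbitrary choice for illustration).
n = 10
Kn = K[:, :n]                             # evaluations of the n kernels
Knn = K[:n, :n]                           # their Gram matrix
# Minimize (1/m) * ||Kn a - y||^2 + gamma * a^T Knn a over a in R^n;
# the normal equations are (Kn^T Kn / m + gamma * Knn) a = Kn^T y / m.
a = np.linalg.solve(Kn.T @ Kn / m + gamma * Knn, Kn.T @ y / m)

def regularized_error(K_eval, coef, K_norm):
    # Empirical error plus the kernel-stabilizer penalty.
    resid = K_eval @ coef - y
    return resid @ resid / m + gamma * coef @ K_norm @ coef

err_full = regularized_error(K, c_full, K)    # optimal value
err_n = regularized_error(Kn, a, Knn)         # suboptimal value
# The restricted minimum can only be larger; the paper bounds the
# gap err_n - err_full by a term of order 1/n.
print(err_full, err_n)
```

The printed gap `err_n - err_full` is the quantity whose decay in n the paper's bounds control.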
Related Topics
Physical Sciences and Engineering
Mathematics
Analysis
Authors
Věra Kůrková, Marcello Sanguineti