Article ID: 472733
Journal: Computers & Mathematics with Applications
Published Year: 2011
Pages: 10
File Type: PDF
Abstract

In this paper, we consider the coefficient-based regularized least-squares regression problem with the ℓq-regularizer (1 ≤ q ≤ 2) in data-dependent hypothesis spaces. Algorithms in data-dependent hypothesis spaces perform well owing to their flexibility. We conduct a unified error analysis by a stepping-stone technique, and we employ an empirical covering number technique to improve the sample error estimate. Compared with existing results, we make several improvements. First, we obtain a significantly sharper learning rate, which can be arbitrarily close to O(m^(-1)) under reasonable conditions; this is regarded as the best learning rate in learning theory. Second, our results cover the case q = 1, which is novel. Finally, our results hold under very general conditions.
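The coefficient-based scheme the abstract analyzes can be sketched as follows. This is a minimal illustration, not the paper's method: the Gaussian kernel, the regularization parameter lam, the step count, and the plain (sub)gradient-descent solver are all assumptions made here for concreteness. The objective is (1/m)·||Ka − y||² + λ·Σ_j |a_j|^q over the expansion coefficients a, with the hypothesis space spanned by kernel sections at the sample points.

```python
import numpy as np

def gaussian_kernel(X, Z, sigma=0.5):
    # Pairwise Gaussian kernel matrix; sigma is an illustrative choice,
    # not a value taken from the paper.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def fit_coefficients(K, y, lam=1e-3, q=2.0, steps=20000):
    """(Sub)gradient descent on the coefficient-based objective
       (1/m) * ||K a - y||^2 + lam * sum_j |a_j|^q,  with 1 <= q <= 2.
    A generic solver standing in for whatever optimizer one would use."""
    m = len(y)
    a = np.zeros(m)
    # Step size from the Lipschitz constant of the smooth data-fit term,
    # whose Hessian is (2/m) * K @ K.
    eta = m / (2.0 * np.linalg.norm(K, 2) ** 2 + 1e-12)
    for _ in range(steps):
        resid = K @ a - y
        # For q = 1 the penalty is non-smooth at 0; np.sign(0) = 0 makes
        # this a valid subgradient there.
        grad = (2.0 / m) * (K @ resid) \
             + lam * q * np.sign(a) * np.abs(a) ** (q - 1)
        a = a - eta * grad
    return a

# Toy usage: fit noisy-free samples of sin(3x) on [-1, 1].
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(30, 1))
y = np.sin(3.0 * X[:, 0])
K = gaussian_kernel(X, X, sigma=0.5)
a_hat = fit_coefficients(K, y, lam=1e-4, q=2.0)
```

The exponent q interpolates between the ridge-like case q = 2 and the sparsity-promoting case q = 1 that the abstract highlights as novel; the same solver covers the whole range 1 ≤ q ≤ 2.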
