Article ID: 1145225
Journal: Journal of Multivariate Analysis
Published Year: 2016
Pages: 13 Pages
File Type: PDF
Abstract
We examine the rate of convergence of the Lasso estimator of lower-dimensional components of the high-dimensional parameter. Under bounds on the ℓ1-norm of the worst possible sub-direction, these rates are of order |J| log p / n, where p is the total number of parameters, n is the number of observations, and J ⊂ {1, …, p} represents a subset of the parameters. We also derive rates in sup-norm in terms of the rate of convergence in ℓ1-norm. The irrepresentable condition on a set J requires that the ℓ1-norm of the worst possible sub-direction is sufficiently smaller than one; in that case sharp oracle results can be obtained. Moreover, if the coefficients in J are small enough, the Lasso will set these coefficients to zero. By de-sparsifying, one obtains fast rates in sup-norm without conditions on the worst possible sub-direction. The results are extended to M-estimation with ℓ1-penalty for generalized linear models and exponential families. For the graphical Lasso this leads to an extension of known results to the case where the precision matrix is only approximately sparse. The bounds we provide are non-asymptotic, but we also present asymptotic formulations for ease of interpretation.
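The estimator studied in the abstract is the standard ℓ1-penalized least-squares (Lasso) estimator. As an illustration only (not the paper's theoretical machinery), the following is a minimal coordinate-descent sketch of the Lasso on a toy sparse design; the function names, the toy data, and the tuning constant 0.5 in front of √(log p / n) are all assumptions for this example, not taken from the paper.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator: sign(z) * max(|z| - t, 0)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate-descent Lasso:
    argmin_b (1/2n) ||y - X b||_2^2 + lam * ||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n  # per-coordinate curvature
    for _ in range(n_iter):
        for j in range(p):
            # partial residual with coordinate j removed
            r = y - X @ b + X[:, j] * b[j]
            rho = X[:, j] @ r / n
            b[j] = soft_threshold(rho, lam) / col_sq[j]
    return b

# Toy data: sparse truth supported on |J| = 2 coordinates (hypothetical example).
rng = np.random.default_rng(0)
n, p = 200, 20
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[0], beta[1] = 2.0, -1.5
y = X @ beta + 0.1 * rng.standard_normal(n)

# Tuning parameter of the usual order sqrt(log p / n); the constant is arbitrary.
lam = 0.5 * np.sqrt(np.log(p) / n)
b_hat = lasso_cd(X, y, lam)
```

On this toy problem the Lasso recovers the two active coefficients (up to the soft-thresholding bias of order λ) and sets most inactive coefficients exactly to zero, which is the shrinkage behavior the abstract refers to when coefficients in J are small.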
Related Topics: Physical Sciences and Engineering › Mathematics › Numerical Analysis