Article ID: 1144546
Journal: Journal of the Korean Statistical Society
Published Year: 2015
Pages: 16
File Type: PDF
Abstract

A shrinkage-type variable selection procedure for varying coefficient models is routinely established in the least-squares (LS) framework. Although the LS method has favorable properties for a large class of error distributions, it breaks down when the error variance is infinite and is adversely affected by outliers and heavy-tailed distributions. To overcome these issues, we propose a robust shrinkage method, termed regularized Walsh-average (RWA), that performs robust nonparametric variable selection and robust coefficient estimation simultaneously. Theoretical analysis shows that RWA achieves consistency in variable selection and the oracle property in estimation even when the error variance is infinite. More importantly, when the error variance is finite, the asymptotic relative efficiency of the new estimator with respect to the LS-based estimator is at least 0.8896, a relatively high level. Furthermore, a robust BIC-type criterion, which identifies the true model consistently, is suggested for selecting the shrinkage parameter. Numerical studies confirm the theoretical results.
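The building block of such an approach is the Walsh-average loss, i.e. the sum of absolute pairwise averages of residuals, combined with a shrinkage penalty. The sketch below is only a minimal illustration of that idea in a plain linear-model setting with a lasso-type penalty and a generic optimizer; it does not reproduce the paper's RWA procedure for varying coefficient models (basis expansions, its specific penalty, or the robust BIC criterion), and all function names are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

def walsh_average_loss(residuals):
    """Sum of absolute Walsh averages |(e_i + e_j)/2| over all pairs i <= j."""
    e = np.asarray(residuals, dtype=float)
    i, j = np.triu_indices(e.size)        # pairs with i <= j (includes i == j)
    return np.abs((e[i] + e[j]) / 2.0).sum()

def rwa_objective(beta, X, y, lam):
    """Walsh-average loss of the residuals plus a lasso-type shrinkage penalty
    (an assumed stand-in for the penalty used in the paper)."""
    return walsh_average_loss(y - X @ beta) + lam * np.abs(beta).sum()

# Toy usage: heavy-tailed (Cauchy) errors, a setting where least squares is unreliable.
rng = np.random.default_rng(0)
n, p = 100, 4
X = rng.normal(size=(n, p))
beta_true = np.array([2.0, 0.0, -1.5, 0.0])       # sparse true coefficients
y = X @ beta_true + rng.standard_cauchy(size=n)   # infinite-variance errors

fit = minimize(rwa_objective, x0=np.zeros(p), args=(X, y, 5.0),
               method="Nelder-Mead")
print(np.round(fit.x, 2))
```

Because the objective is non-smooth, a derivative-free optimizer is used here purely for illustration; in practice the shrinkage parameter would be chosen by a criterion such as the robust BIC-type rule described in the abstract.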

Related Topics
Physical Sciences and Engineering; Mathematics; Statistics and Probability