Article ID Journal Published Year Pages File Type
1150618 Journal of Statistical Planning and Inference 2007 15 Pages PDF
Abstract

In this paper, we consider the prediction problem in the multiple linear regression model in which the number of predictor variables, p, is extremely large compared to the number of available observations, n. The least-squares predictor based on a generalized inverse is not efficient. We propose six empirical Bayes estimators of the regression parameters. Three of them are shown to have uniformly lower prediction error than the least-squares predictor when the vector of regressor variables is assumed to be random with mean vector zero and covariance matrix (1/n)X^t X, where X^t = (x_1, …, x_n) is the p×n matrix of observations on the regressor vector, centered at their sample means. For the other three estimators, we use simulation to demonstrate their superiority over the least-squares predictor.
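The contrast drawn in the abstract can be illustrated numerically. The sketch below sets up a p ≫ n regression, computes the least-squares predictor via the Moore-Penrose generalized inverse, and compares it with a generic ridge-type shrinkage estimator. The shrinkage rule is a hypothetical stand-in, not one of the paper's six empirical Bayes estimators, and the penalty `lam` is fixed ad hoc where an empirical Bayes method would estimate it from the data.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 20, 200  # far more predictors than observations (p >> n)

# Simulate a regression with a few strong coefficients: y = X beta + noise
X = rng.standard_normal((n, p))
X -= X.mean(axis=0)            # center columns at their sample means, as in the abstract
beta = np.zeros(p)
beta[:5] = 2.0
y = X @ beta + rng.standard_normal(n)

# Least-squares predictor based on the generalized (Moore-Penrose) inverse
beta_ls = np.linalg.pinv(X) @ y

# Generic ridge-type shrinkage estimator (illustrative only; lam is ad hoc,
# not an empirical Bayes choice as in the paper)
lam = 1.0
beta_shrink = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

# Compare mean squared prediction error on fresh data from the same design
X_new = rng.standard_normal((100, p))
y_new = X_new @ beta
err_ls = np.mean((X_new @ beta_ls - y_new) ** 2)
err_shrink = np.mean((X_new @ beta_shrink - y_new) ** 2)
print(err_ls, err_shrink)
```

With p ≫ n the generalized-inverse predictor interpolates the training responses, so its out-of-sample error is typically large; shrinkage trades a little bias for a substantial variance reduction, which is the motivation for the estimators proposed in the paper.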
