Article ID: 1149205 | Journal: Journal of Statistical Planning and Inference | Published Year: 2010 | Pages: 16 | File Type: PDF
Abstract

This paper extends the concept of risk unbiasedness to apply to statistical prediction and nonstandard inference problems, by formalizing the idea that a risk unbiased predictor should, on average, be at least as close to the “true” predictant as to any “wrong” predictant. A novel aspect of our approach is measuring closeness between a predicted value and the predictant by a regret function, derived suitably from the given loss function. The general concept is more relevant than mean unbiasedness, especially for asymmetric loss functions. For squared error loss, we present a method for deriving best (minimum risk) risk unbiased predictors when the regression function is linear in a function of the parameters. We derive a Rao–Blackwell type result for a class of loss functions that includes squared error and LINEX losses as special cases. For location-scale families, we prove that if a unique best risk unbiased predictor exists, then it is equivariant. The concepts and results are illustrated with several examples. One interesting finding is that in some problems a best unbiased predictor does not exist, yet a best risk unbiased predictor can be obtained. Thus, risk unbiasedness can be a useful tool for selecting a predictor.
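As a rough numerical illustration of the abstract's point that mean unbiasedness is less compelling under asymmetric loss, the following sketch compares a mean-unbiased constant predictor with the LINEX-optimal one for a normal predictant. The parameter values (a, mu, sigma) and the closed form mu - a*sigma^2/2 for the LINEX-optimal predictor of a normal variable are standard illustrative assumptions, not results taken from this paper.

```python
import numpy as np

rng = np.random.default_rng(0)
a = 1.0              # LINEX asymmetry parameter (illustrative choice)
mu, sigma = 0.0, 1.0 # distribution of the predictant Y (illustrative)
y = rng.normal(mu, sigma, 200_000)

def linex(delta, a=a):
    # LINEX loss: exp(a*delta) - a*delta - 1; penalizes errors asymmetrically
    return np.exp(a * delta) - a * delta - 1.0

def risk(d):
    # Monte Carlo estimate of the risk of predicting the constant d for Y
    return linex(d - y).mean()

d_mean = mu                       # mean-unbiased predictor E[Y]
d_linex = mu - a * sigma**2 / 2   # LINEX-optimal predictor for normal Y

print(f"risk of mean predictor : {risk(d_mean):.4f}")
print(f"risk of LINEX predictor: {risk(d_linex):.4f}")
```

The LINEX-optimal predictor deliberately shifts away from the mean (here by -a*sigma^2/2) and attains strictly smaller risk, which is why a notion of unbiasedness tied to the loss function, rather than to the mean, is the more natural requirement for asymmetric losses.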
