Article Code | Journal Code | Year | English Paper | Full Text |
---|---|---|---|---|
1149205 | 957867 | 2010 | 16-page PDF | Free Download |

This paper extends the concept of risk unbiasedness so that it applies to statistical prediction and nonstandard inference problems, by formalizing the idea that a risk unbiased predictor should, on average, be at least as close to the “true” predictant as to any “wrong” predictant. A novel aspect of our approach is that closeness between a predicted value and the predictant is measured by a regret function, derived suitably from the given loss function. The general concept is more relevant than mean unbiasedness, especially for asymmetric loss functions. For squared error loss, we present a method for deriving best (minimum risk) risk unbiased predictors when the regression function is linear in a function of the parameters. We derive a Rao–Blackwell type result for a class of loss functions that includes squared error and LINEX losses as special cases. For location-scale families, we prove that if a unique best risk unbiased predictor exists, then it is equivariant. The concepts and results are illustrated with several examples. One interesting finding is that in some problems a best unbiased predictor does not exist, but a best risk unbiased predictor can be obtained. Thus, risk unbiasedness can be a useful tool for selecting a predictor.
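As a hedged numerical sketch of why mean unbiasedness can be inadequate under asymmetric loss (this example is illustrative and not taken from the paper): under the LINEX loss L(d, y) = exp(a(d − y)) − a(d − y) − 1, the risk-minimizing point predictor of Y ~ N(μ, σ²) is known to be μ − aσ²/2, which differs from the mean-unbiased choice μ. The parameter values and the Monte Carlo setup below are assumptions for illustration.

```python
import numpy as np

def linex_risk(d, mu=0.0, sigma=1.0, a=1.0, n=200_000, seed=0):
    """Monte Carlo estimate of the LINEX risk E[L(d, Y)] for Y ~ N(mu, sigma^2).

    LINEX loss: L(d, y) = exp(a*(d - y)) - a*(d - y) - 1, asymmetric for a != 0.
    Reusing the same seed for every d keeps the estimated risk curve smooth in d.
    """
    rng = np.random.default_rng(seed)
    y = rng.normal(mu, sigma, size=n)
    e = a * (d - y)
    return np.mean(np.exp(e) - e - 1.0)

mu, sigma, a = 0.0, 1.0, 1.0
grid = np.linspace(-1.5, 1.5, 301)          # candidate predictors, step 0.01
risks = [linex_risk(d, mu, sigma, a) for d in grid]
d_star = grid[int(np.argmin(risks))]

# Closed form: the LINEX-optimal predictor is mu - a*sigma^2/2 = -0.5 here,
# while the mean-unbiased predictor is mu = 0.
print(round(d_star, 2))
```

Running the grid search recovers a minimizer close to −0.5, confirming that the risk-optimal predictor under this asymmetric loss is shifted away from the mean.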
Journal: Journal of Statistical Planning and Inference - Volume 140, Issue 7, July 2010, Pages 1923–1938