Article code: 1147953
Journal code: 957810
Publication year: 2012
Full-text version: 12-page English PDF, free download
English title of the ISI article
Decomposition of Kullback–Leibler risk and unbiasedness for parameter-free estimators
Related subjects
Engineering and Basic Sciences · Mathematics · Applied Mathematics
English abstract

The bias and variance of traditional parameter estimators are parameter-dependent quantities. The maximum likelihood estimate (MLE) can be defined directly on a family of distributions P and so is parameter-free. The parameter-invariance property of the MLE can be described by the fact that the MLE for the original parameter and the MLE for any reparametrization name the same distribution. We define parameter-free estimators to be P-valued random variables rather than parameter-valued random variables. The Kullback–Leibler (KL) risk is decomposed into two parameter-free quantities that describe the variance and squared bias of the estimator. We show that for exponential families the P-valued MLE is unbiased. We define the KL mean K of a P-valued random variable and show how K describes the long-run properties of this random distribution. For most families P, the KL mean of any P-valued random variable will not lie in P, so we define another mean M, called the distribution mean, that is related to K and is an element of P. By allowing the distribution estimator to take values outside of P, the KL mean can be made to lie in P. We compare the MLE to non-P-valued estimators that have been suggested for the Hardy–Weinberg model. Results for the dual KL risk are also given.
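The abstract gives no formulas, but a standard way to realize such a decomposition is to take the KL mean K of a random distribution Q̂ as the normalized geometric mean of its realizations (log k ∝ E[log q̂]); the KL risk then splits exactly into a "squared bias" term D(P‖K) and a nonnegative "variance" term −log Z, where Z is the normalizing constant. The sketch below checks this identity by Monte Carlo on an assumed toy discrete model (the three-point P, the sample size, and the add-one-smoothed estimator are all illustrative choices, not taken from the paper; smoothing keeps the divergence finite but makes Q̂ a non-MLE, non-P-valued estimator of the kind the abstract also discusses):

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy setup: true distribution P on a 3-point support.
p = np.array([0.5, 0.3, 0.2])
n = 20          # sample size per replication
reps = 5000     # Monte Carlo replications

def kl(a, b):
    """Kullback-Leibler divergence D(a || b) for discrete distributions."""
    return float(np.sum(a * np.log(a / b)))

# Draw many estimates Q-hat.  Add-one smoothing keeps every cell positive,
# so D(P || Q-hat) stays finite (a hypothetical estimator, not the MLE).
counts = rng.multinomial(n, p, size=reps)
q = (counts + 1) / (n + len(p))          # shape (reps, 3)

# KL risk: E[D(P || Q-hat)], estimated over the replications.
risk = np.mean([kl(p, qi) for qi in q])

# KL mean K: normalized geometric mean of the Q-hat draws,
# log k_tilde = E[log q-hat];  K = k_tilde / Z.
log_k_tilde = np.mean(np.log(q), axis=0)
Z = np.sum(np.exp(log_k_tilde))
K = np.exp(log_k_tilde) / Z

bias2 = kl(p, K)        # "squared bias": divergence from P to the KL mean
variance = -np.log(Z)   # "variance": spread of Q-hat (>= 0 by Jensen)

# With the same draws on both sides, the split is an algebraic identity,
# so it holds to floating-point rounding error.
assert abs(risk - (bias2 + variance)) < 1e-9
assert bias2 >= 0 and variance >= 0
```

An unbiased estimator in this sense is one whose KL mean K equals P, so that the squared-bias term vanishes and the risk reduces to the variance term alone.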

Publisher
Database: Elsevier - ScienceDirect
Journal: Journal of Statistical Planning and Inference - Volume 142, Issue 6, June 2012, Pages 1525–1536
Authors