Article ID | Journal | Published Year | Pages | File Type |
---|---|---|---|---|
1149879 | Journal of Statistical Planning and Inference | 2008 | 10 | |
Abstract
We discuss the general form of a first-order correction to the maximum likelihood estimator, expressed in terms of the gradient of a function that could, for example, be the logarithm of a prior density. In terms of Kullback-Leibler divergence, the correction gives an asymptotic improvement over maximum likelihood under rather general conditions. The theory is illustrated for Bayes estimators with conjugate priors, and the optimal choice of hyper-parameter for improving the maximum likelihood estimator is discussed. The results based on Kullback-Leibler risk are extended to a wide class of risk functions.
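As a rough sketch of what such a gradient-based correction typically looks like (an illustrative assumption, not necessarily the paper's exact expression): write $\hat{\theta}$ for the maximum likelihood estimator based on $n$ observations, $i(\theta)$ for the Fisher information, and $\psi(\theta)$ for a smooth function such as $\log\pi(\theta)$ for a prior density $\pi$.

```latex
% Illustrative sketch only: a generic first-order gradient correction to the MLE.
% Symbols assumed here (not taken verbatim from the paper):
%   \hat\theta : maximum likelihood estimator based on n observations
%   i(\theta)  : Fisher information matrix, assumed nonsingular
%   \psi       : smooth function, e.g. \psi(\theta) = \log\pi(\theta) for a prior \pi
\tilde{\theta} \;=\; \hat{\theta} \;+\; \frac{1}{n}\, i(\hat{\theta})^{-1}\,\nabla\psi(\hat{\theta})
```

Bayes estimators under smooth priors typically admit expansions of this general form, which is consistent with the abstract's illustration via conjugate priors; the question studied is when a correction of this type reduces the Kullback-Leibler risk relative to $\hat{\theta}$ itself.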
Related Topics
Physical Sciences and Engineering > Mathematics > Applied Mathematics
Authors
Shinto Eguchi, Takemi Yanagimoto