Article ID | Journal | Published Year | Pages | File Type
---|---|---|---|---
1145579 | Journal of Multivariate Analysis | 2014 | 13 |
Abstract
The inverse of the normal covariance matrix is called the precision matrix and often plays an important role in statistical estimation problems. This paper deals with the problem of estimating the precision matrix under a quadratic loss, where the precision matrix is restricted to a bounded parameter space. Gauss' divergence theorem with a matrix argument shows that the unbiased, unrestricted estimator is dominated by the posterior mean associated with a flat prior on the bounded parameter space. A further improvement is obtained by considering an expansion estimator, and a hierarchical prior is shown to improve on the posterior mean. An application is given to Bayesian prediction in a random-effects model.
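As a rough illustration of the estimation setting only (not the paper's estimator or loss), the sketch below compares, by Monte Carlo simulation, the standard unbiased precision-matrix estimator (n - p - 1) S^{-1} with a crude restricted version that truncates eigenvalues at an assumed bound, under the quadratic loss tr[(estimate - truth)^2]. The true precision matrix `Omega`, the bound, and the truncation rule are all assumptions made for the example.

```python
# Hypothetical illustration, not the paper's method: compare the unrestricted
# unbiased estimator of the precision matrix with a simple eigenvalue-truncated
# version under a quadratic loss, using Monte Carlo simulation.
import numpy as np

rng = np.random.default_rng(0)
p, n, reps = 5, 20, 2000                      # dimension, degrees of freedom, replications
Omega = np.diag(np.linspace(0.5, 2.0, p))     # assumed true precision matrix
Sigma = np.linalg.inv(Omega)                  # corresponding covariance matrix
bound = 2.5                                   # assumed eigenvalue bound (illustrative only)

def quad_loss(est, truth):
    # quadratic loss tr[(est - truth)^2]
    d = est - truth
    return np.trace(d @ d)

loss_unbiased, loss_truncated = 0.0, 0.0
for _ in range(reps):
    X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)
    S = X.T @ X                               # Wishart(n, Sigma) sample matrix
    est_ub = (n - p - 1) * np.linalg.inv(S)   # unbiased estimator of Omega
    # crude restriction: clip eigenvalues into [0, bound]
    w, V = np.linalg.eigh(est_ub)
    est_tr = V @ np.diag(np.clip(w, 0.0, bound)) @ V.T
    loss_unbiased += quad_loss(est_ub, Omega)
    loss_truncated += quad_loss(est_tr, Omega)

print("average quadratic loss, unbiased  :", loss_unbiased / reps)
print("average quadratic loss, truncated :", loss_truncated / reps)
```

The eigenvalue truncation here is only a stand-in for restricting the estimator to a bounded parameter space; the paper's improvements rely on posterior means under flat and hierarchical priors, which this sketch does not implement.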
Related Topics: Physical Sciences and Engineering > Mathematics > Numerical Analysis
Authors
Hisayuki Tsukuma